Data Architect – Data Lakehouse Transformation

Procore Technologies

Job Type

Full-time

Work Type

On-Site

Location

Jeddah, Saudi Arabia

Experience

12-24 years

Role Overview:

We are looking for a highly skilled Data Architect with deep expertise in modern data architectures to support a large-scale Data Warehouse to Data Lakehouse transformation initiative for a leading banking client. The ideal candidate will have a strong background in data platform architecture, solution design, and implementation; hands-on expertise in Cloudera, Teradata, and Informatica; and a solid understanding of banking data domains.

This role plays a pivotal part in designing scalable, secure, and high-performance data solutions that align with the bank’s enterprise data strategy.

Key Responsibilities:

  • Design and define the end-to-end architecture for the Data Lakehouse solution covering Bronze, Silver, and Gold layers, metadata management, and data governance.
  • Lead data platform modernization initiatives involving migration from a legacy DWH to a modern Cloudera-based architecture.
  • Translate business and functional requirements into scalable data architecture solutions.
  • Collaborate with engineering, platform, analytics, and business teams to define data flows, ingestion strategies, transformation logic, and consumption patterns.
  • Ensure architectural alignment with enterprise data standards, security guidelines, and regulatory requirements.
  • Define data modeling standards and oversee data modeling efforts across layers (relational and big data).
  • Work with the implementation oversight partner to review and validate logical and physical data models.
  • Drive architecture reviews, performance tuning, and capacity planning for the data ecosystem.
  • Guide and mentor data engineering teams on architectural best practices.

Required Skills and Experience:

  • 12+ years of experience in data architecture, data platform design, or enterprise architecture roles.
  • Strong hands-on experience in Cloudera (Hadoop ecosystem, Hive, HDFS, Spark), Teradata, Informatica PowerCenter/IDQ, and SQL-based platforms.
  • Deep understanding of data ingestion, curation, transformation, and consumption in both batch and near-real-time modes.
  • Banking industry experience with familiarity across domains such as retail, corporate banking, credit risk, finance, and regulatory reporting.
  • Proficiency in designing for scalability, performance optimization, and data security/compliance.
  • Solid experience with data lakehouse concepts, open table formats (Iceberg/Delta), and layered architectures.
  • Experience integrating BI/reporting platforms (e.g., Power BI, Cognos) and downstream data products.

Preferred Attributes:

  • Experience with Kafka/NiFi for streaming ingestion, and with orchestration tools such as Control-M or Airflow.
  • Knowledge of metadata management, data lineage, and data catalog tools.
  • Familiarity with hybrid deployment models (on-prem and cloud) and DevOps/DataOps pipelines.
  • TOGAF, CDMP, or DAMA certification is a plus.