We are looking for experienced Data Engineers, Integration Specialists, and GCP/DevOps Architects for a full-time engagement, with both onsite (Dubai/Doha/Oman) and remote (India/Pakistan) roles available. The candidates will be responsible for building a new data warehouse from the ground up, handling CDC (Change Data Capture) processes, and working on multiple data-related tasks, including migration, orchestration, and analytics. Preference will be given to immediate joiners.

Responsibilities:

Data Warehouse & CDC Development:
- Build a data warehouse from scratch, including landing zone setup.
- Work on CDC processes with a focus on efficiency and timeliness (see the CDC merge sketch at the end of this posting).
- Build data marts from the landing zone, ready for consumption.

Azure SQL Data Warehouse / Azure Synapse Review:
- Review existing schemas, ETL processes, data flows, and usage patterns.
- Optimize existing data structures for performance and scalability.

Data Migration & Cloud Setup (GCP):
- Provision a data warehouse using BigQuery or Google Cloud SQL.
- Mirror schemas and data models from Azure to BigQuery.
- Migrate historical data using the BigQuery Data Transfer Service, Google Cloud Storage, or custom ETL pipelines (see the load-job sketch at the end of this posting).

Apache Airflow / Cloud Composer Setup:
- Set up Composer environments for Apache Airflow jobs.
- Configure and update the DAGs repository in Cloud Source Repositories.
- Modify DAG code for GCP environment compatibility.
- Set up services for Cloud Composer orchestration (see the DAG sketch at the end of this posting).

GCP & Dataproc Integration:
- Configure a Dataproc cluster for Spark jobs (for 72 sources).
- Modify Spark jobs based on GCP environment specifications.
- Write and configure Spark jobs for 71 sources provided by the Azure team.
- Configure these sources in Apache Airflow DAGs for scheduling and monitoring.

ETL Pipeline Replication:
- Replicate custom logic, business rules, and transformations from Azure to GCP.

Reporting & Analytics Setup:
- Import existing data from Azure SQL Server pools.
- Build data warehousing layers in BigQuery (landing, staging, testing, production).
- Build data marts for consumption in Cloud SQL for SQL Server.
- Create reporting dashboards with monitoring capabilities in Looker, allowing seamless data querying from GCP.

Requirements:

Experience: 5+ years in Data Engineering, GCP, and Integration.

Skills:
- Experience with BigQuery, Google Cloud SQL, Apache Airflow, and Dataproc.
- Proficient in Spark, ETL pipelines, and Azure Synapse.
- Strong understanding of data migration, data warehousing, and reporting dashboards.
- Experience building data infrastructure and handling ETL processes.
- Looker experience for building reporting dashboards is a plus.

Certifications: GCP certifications preferred.

Locations:
- Onsite: Dubai, Doha, Oman
- Remote: India, Pakistan
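
To give candidates a concrete sense of the CDC work described above, here is a minimal sketch of applying change records from a landing table into a staging table with a BigQuery MERGE. The project, table, and column names (including the `op` change-type flag) are hypothetical placeholders, not details from this role; the real keys and operation encoding will differ.

```python
# Minimal sketch: apply CDC rows from a landing table into a staging table
# via a BigQuery MERGE. All table and column names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

MERGE_SQL = """
MERGE `example-project.staging.orders` AS target
USING `example-project.landing.orders_cdc` AS source
ON target.order_id = source.order_id
WHEN MATCHED AND source.op = 'D' THEN
  DELETE
WHEN MATCHED THEN
  UPDATE SET status = source.status,
             updated_at = source.updated_at
WHEN NOT MATCHED AND source.op != 'D' THEN
  INSERT (order_id, status, updated_at)
  VALUES (source.order_id, source.status, source.updated_at)
"""

client.query(MERGE_SQL).result()  # run the merge and wait for completion
```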
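
Similarly, a minimal sketch of the historical-data migration path, assuming extracts already staged in Cloud Storage are loaded into a BigQuery landing-zone table with the standard `google-cloud-bigquery` client. The bucket, dataset, and Parquet format are assumptions for illustration.

```python
# Minimal sketch: load historical extracts from Cloud Storage into a
# BigQuery landing-zone table. URI, dataset, and table are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/historical/orders/*.parquet",
    "example-project.landing.orders",
    job_config=job_config,
)
load_job.result()  # block until the load job completes
```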
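
Finally, a minimal sketch of the Composer/Airflow orchestration pattern: a DAG that submits a Spark job to Dataproc via the provider's `DataprocSubmitJobOperator`. The project, region, cluster, jar URI, entry class, and source name are hypothetical; one DAG or task per source would be configured along these lines for scheduling and monitoring.

```python
# Minimal sketch: a Cloud Composer (Airflow 2.x) DAG that submits a Spark
# job to Dataproc. Project, cluster, bucket, and job names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import (
    DataprocSubmitJobOperator,
)

SPARK_JOB = {
    "reference": {"project_id": "example-project"},
    "placement": {"cluster_name": "example-dataproc-cluster"},
    "spark_job": {
        "main_class": "com.example.IngestSource",  # hypothetical entry point
        "jar_file_uris": ["gs://example-bucket/jobs/ingest.jar"],
        "args": ["--source", "source_01"],
    },
}

with DAG(
    dag_id="ingest_source_01",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    DataprocSubmitJobOperator(
        task_id="submit_spark_ingest",
        job=SPARK_JOB,
        region="us-central1",
        project_id="example-project",
    )
```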