About the Role
We are seeking a skilled GCP Data Engineer to design, develop, and maintain scalable data pipelines and cloud-based solutions on Google Cloud Platform (GCP). The ideal candidate will have strong experience in data engineering, cloud technologies, and building high-performance data ecosystems that support analytics, reporting, and machine learning initiatives.
Key Responsibilities
- Design, build, and optimize ETL/ELT pipelines using GCP-native tools such as Dataflow, Dataproc, Cloud Composer, and BigQuery.
- Develop scalable data architectures and ensure efficient data ingestion from multiple sources.
- Build and manage data warehouses, data lakes, and data marts on GCP.
- Implement CI/CD pipelines for data workflows using tools like Cloud Build, GitLab, or Jenkins.
- Work closely with data analysts, data scientists, and business stakeholders to meet data needs.
- Monitor, troubleshoot, and enhance the performance of data pipelines.
- Ensure compliance with data quality, security, and governance standards.
- Automate repetitive processes using Python, SQL, and GCP automation tools.
Required Skills & Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 8 years of experience in Data Engineering (depending on seniority).
- Strong expertise in GCP services:
- BigQuery
- Cloud Storage
- Dataflow / Apache Beam
- Dataproc / Spark
- Pub/Sub
- Cloud Composer (Airflow)
- Strong proficiency in Python and SQL.
- Experience with ETL/ELT pipeline development and distributed data processing.
- Knowledge of data modeling, schema design, and database optimization.
- Experience with CI/CD, Git, Docker, or Terraform for infrastructure as code (IaC).