GCP Data Engineer
Job Summary
4-7 years of experience in data engineering or ETL development with large-scale data systems.
Strong proficiency in Google Cloud Platform (BigQuery, Dataflow, Pub/Sub, Dataproc, Cloud Storage, Composer).
Solid understanding of ETL/ELT design principles, data modeling, and data warehouse architecture.
Proficiency in Python and SQL, and experience with distributed data processing frameworks (Apache Beam, Spark).
Experience building data lakes and managing schema evolution in structured and semi-structured data (JSON, Parquet, Avro).
Skilled in workflow orchestration (Airflow, Dagster, or Prefect) and infrastructure automation (Terraform, Cloud Deployment Manager).
Familiarity with API integrations, event streaming, and real-time data processing.
Google Professional Data Engineer Certification or equivalent.
Experience with modern data stack tools such as dbt, Looker, or Dataform.
Familiarity with machine learning data pipelines or feature engineering workflows.