GCP Data Engineer
Austin, TX - USA
Job Summary
Job Title: GCP Data Engineer
Location: Austin, TX (Hybrid)
Domain: Retail / E-Commerce
Duration: 12-Month Contract
Only W2; no C2C
Job Description:
We are looking for an experienced GCP Data Engineer to design and build scalable data pipelines and modern data platforms on Google Cloud Platform (GCP). The candidate will support enterprise analytics and data-driven initiatives.
Key Responsibilities:
Design and develop data pipelines and ETL workflows using GCP services.
Build scalable data processing systems using Dataflow and Dataproc.
Develop and manage BigQuery data warehouses.
Implement batch and streaming data pipelines.
Work with data analysts and data scientists to enable advanced analytics and reporting.
Ensure data quality, security, and governance standards.
Optimize data pipeline performance and cost efficiency.
Required Skills:
8 years of experience in Data Engineering / Big Data
Strong experience with Google Cloud Platform (GCP)
Hands-on experience with BigQuery, Dataflow, Dataproc, and Cloud Storage
Experience with Apache Spark and Kafka
Strong programming skills in Python, Java, or Scala
Experience building ETL / ELT pipelines
Strong knowledge of SQL and data modeling
Preferred Skills:
Experience with Cloud Composer (Airflow)
Experience with Docker / Kubernetes
Experience with data lakes and modern data architecture
Best Regards,
Tarun K
Phone: 1-
Email: