Senior Data Engineer - GCP Native Platform
Fully Remote | Long-Term Contract | C2C / W2 / 1099 | Min. 12 Years | Any Visa
Role Summary
Seeking a Senior Data Engineer with strong hands-on experience across the GCP Native Platform to build scalable data pipelines and deliver enterprise-grade data solutions.
Key Responsibilities
Build batch/streaming pipelines using Dataflow (Apache Beam).
Develop and manage Airflow DAGs in Cloud Composer.
Implement ELT pipelines using Dataform.
Design optimized BigQuery datasets (schemas, partitioning, materialized views).
Implement event-driven ingestion using Pub/Sub and manage data lakes on GCS.
Ensure pipeline performance, data quality, monitoring, and production support.
Required Experience
5 years of Data Engineering experience, including 2 years on GCP.
Strong experience with BigQuery, Dataflow, Cloud Composer, Pub/Sub, and GCS.
Proficiency in SQL and Python, with experience building production-grade pipelines.
Git and CI/CD experience.
Additional/Preferred Skills
Dataform or dbt
Terraform basics
Data modeling and data quality frameworks
Cloud Functions, Dataproc, real-time streaming
GCP Professional Data Engineer certification (preferred)
Required Skills:
Git, Python, CI/CD, BigQuery