GCP Data Engineer (BigQuery, Dataflow, Composer)


Job Location: Kitchener, Canada

Monthly Salary: Not Disclosed
Experience Required: 7 years
Posted on: 6 hours ago
Vacancies: 1

Job Summary

GCP Data Engineer (BigQuery, Dataflow, Composer)

Role Description:

  • Design and build scalable, secure, and high-performance data pipelines on GCP.
  • Develop and optimize ETL/ELT workflows using Cloud Composer, Dataflow, Dataproc, and BigQuery.
  • Implement data ingestion frameworks for batch and streaming data (Pub/Sub, Kafka, Dataflow).
  • Model, partition, and optimize datasets in BigQuery for analytics use cases.
  • Collaborate with data scientists, architects, and business teams to deliver end-to-end data solutions.
  • Ensure data quality, reliability, and robustness through monitoring, validation, and automation.
  • Implement CI/CD pipelines for data workflows using Cloud Build, Git, and Terraform.
  • Optimize cost, performance, and scalability across GCP data services.
  • Ensure security best practices, IAM policies, and compliance with organizational standards.
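For candidates, a concrete illustration of the BigQuery modeling duty above: partitioning a table by a date column and clustering it by a common filter column keeps scan costs down for analytics queries. This is a minimal sketch that only builds the DDL string; the dataset, table, and column names (`analytics.events`, `event_date`, `customer_id`) are hypothetical, not from the posting.

```python
def partitioned_table_ddl(dataset: str, table: str,
                          partition_col: str, cluster_cols: list[str]) -> str:
    """Build a BigQuery CREATE TABLE statement that partitions by a DATE
    column and clusters by frequently filtered columns (illustrative schema)."""
    clustering = ", ".join(cluster_cols)
    return (
        f"CREATE TABLE `{dataset}.{table}` (\n"
        f"  {partition_col} DATE,\n"
        f"  customer_id STRING,\n"
        f"  payload JSON\n"
        f")\n"
        f"PARTITION BY {partition_col}\n"
        f"CLUSTER BY {clustering}"
    )

ddl = partitioned_table_ddl("analytics", "events", "event_date", ["customer_id"])
print(ddl)
```

Partition pruning plus clustering is the standard lever for the "optimize cost, performance, and scalability" bullet: queries filtered on `event_date` scan only the matching partitions.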

Skills:

  • Digital: Big Data and Hadoop Ecosystems
  • Digital: Google Data Engineering



Company Industry

IT Services and IT Consulting

Key Skills

  • Apache Hive
  • S3
  • Hadoop
  • Redshift
  • Spark
  • AWS
  • Apache Pig
  • NoSQL
  • Big Data
  • Data Warehouse
  • Kafka
  • Scala