As a Data Engineer (GCP), you will play a crucial role in designing, implementing, and optimizing our data infrastructure, working closely with cross-functional teams to drive impactful projects. This is an exciting opportunity to leverage your expertise in GCP while contributing to a company that values technological advancement and societal improvement.
Tasks
- Design, develop, and maintain scalable data pipelines on Google Cloud Platform (GCP) to support business analytics and reporting needs.
- Collaborate with cross-functional teams to gather and analyze requirements for data integration and transformation processes.
- Implement data quality checks and validation routines to ensure the accuracy and reliability of data across all systems.
- Optimize and monitor data workflows for performance and efficiency, ensuring minimal downtime and maximum throughput.
- Stay updated with the latest GCP tools and technologies to continuously improve data infrastructure and processes.
Requirements
- Expertise in GCP, particularly data-related services such as BigQuery, Dataflow, and Dataproc
- Proficiency in SQL
- In-depth knowledge of one or more programming languages, preferably Python
- Expertise in database design and data modeling
- Expertise in DevOps, Terraform, and CI/CD
- Experience with an Agile way of working
- Big bonus if familiar with Kafka, Azure Service Bus, Airflow, and dbt
- Cloud certification: GCP