We are seeking a highly skilled and motivated Lead GCP Data Engineer to join our team. The role is critical to the development of cutting-edge enterprise data products and solutions.
The GCP Data Engineer will design, implement, and maintain scalable, reliable, and efficient data solutions on Google Cloud Platform (GCP). The role focuses on enabling data-driven decision-making by developing ETL/ELT pipelines, managing large-scale datasets, and optimizing data workflows. The ideal candidate is a proactive problem-solver with strong technical expertise in GCP, a passion for data engineering, and a commitment to delivering high-quality solutions aligned with business needs.
Job Description:
Key Responsibilities:
Data Engineering & Development:
- Design, build, and maintain scalable ETL/ELT pipelines for ingesting, processing, and transforming structured and unstructured data.
- Implement enterprise-level data solutions using GCP services such as BigQuery, Cloud Storage, Dataflow, Cloud Functions, Cloud Pub/Sub, and Cloud Composer.
- Develop and optimize data architectures that support real-time and batch data processing.
Cloud Infrastructure Management:
- Manage and deploy GCP infrastructure components to enable seamless data workflows.
- Ensure data solutions are robust, scalable, and cost-effective, leveraging GCP best practices.
Collaboration and Stakeholder Engagement:
- Work closely with cross-functional teams, including data analysts, data scientists, DevOps, and business stakeholders, to deliver data projects aligned with business goals.
- Translate business requirements into scalable technical solutions while collaborating with team members to ensure successful implementation.
Quality Assurance & Optimization:
- Implement best practices for data governance, security, and privacy, ensuring compliance with organizational policies and regulations.
- Conduct thorough quality assurance, including testing and validation, to ensure the accuracy and reliability of data pipelines.
- Monitor and optimize pipeline performance to meet SLAs and minimize operational costs.
Qualifications and Certifications:
- Education:
- Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.
- Experience:
- Minimum of 5 years of experience in data engineering, with at least 3 years working on GCP.
- Proven experience designing and implementing data workflows using GCP services such as BigQuery, Cloud Dataflow, Dataform, Cloud Pub/Sub, and Cloud Composer.
- Certifications:
- Google Cloud Professional Data Engineer certification preferred.
Key Skills:
- Mandatory Skills:
- Advanced proficiency in Python for data pipelines and automation.
- Strong SQL skills for querying, transforming, and analyzing large datasets.
- Expertise in GCP services such as BigQuery, Cloud Functions, Cloud Storage, Dataflow, and Google Kubernetes Engine (GKE), along with dbt.
- Hands-on experience with CI/CD and version-control tools such as Jenkins, Git, or Bitbucket.
- Familiarity with workflow orchestration tools like Apache Airflow or Cloud Composer.
- Nice-to-Have Skills:
- Experience with other cloud platforms like AWS or Azure.
- Knowledge of data visualization tools (e.g., Looker, Tableau).
- Understanding of machine learning workflows and their integration with data pipelines.
Soft Skills:
- Strong problem-solving and critical-thinking abilities.
- Excellent communication skills to collaborate with technical and non-technical stakeholders.
- Proactive attitude towards innovation and learning.
- Ability to work independently and as part of a collaborative team.
Location:
Mumbai
Brand:
Merkle
Time Type:
Full time
Contract Type:
Permanent