Senior Data Engineer
Job Location

Bengaluru - India

Monthly Salary

Not Disclosed

Vacancy

1 Vacancy

Job Description

Job Title: Lead GCP Data Engineer (Senior Level)

Reports to: SVP, Head of Data Technology & Analytics
Location: Remote, global (must be available through 2 p.m. U.S. Eastern Time)
Employment Type: Full-time, long-term contract (annual renewal)

Key Responsibilities

Data Engineering & Development

  • Design, build, and optimize scalable ELT/ETL pipelines that process structured and unstructured data across batch and streaming systems.
  • Architect and deploy cloud-native data workflows using GCP services, including BigQuery, Cloud Storage, Cloud Functions, Cloud Pub/Sub, Dataflow, and Cloud Composer.
  • Build high-throughput Apache Spark workloads in Python and SQL, tuned for performance at scale and for cost.
  • Develop parameterized DAGs in Apache Airflow with retry logic, alerting, SLA/SLO enforcement, and robust monitoring.
  • Build reusable frameworks for high-volume API ingestion, transforming Postman collections into production-ready Python modules.
  • Translate business and product requirements into scalable, efficient data systems that are reliable and secure.
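
The retry-with-backoff pattern mentioned above for API ingestion can be sketched in a few lines. This is a minimal illustration only; the function name `fetch_with_retries` and its parameters are hypothetical, not part of any existing codebase for this role:

```python
import time

def fetch_with_retries(fetch, max_attempts=3, base_delay=1.0):
    """Call `fetch()` (a hypothetical callable that pulls one API page),
    retrying transient connection failures with exponential backoff."""
    for attempt in range(1, max_attempts + 1):
        try:
            return fetch()
        except ConnectionError:
            if attempt == max_attempts:
                raise  # exhausted retries: surface the error to the caller
            # Back off base_delay, 2*base_delay, 4*base_delay, ...
            time.sleep(base_delay * 2 ** (attempt - 1))
```

In production this wrapper would typically also distinguish retryable HTTP status codes (429, 5xx) from permanent failures and emit metrics for alerting.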

Cloud Infrastructure & Security

  • Implement IAM- and VPC-based security to manage and deploy GCP infrastructure for secure data operations.
  • Ensure the robustness, scalability, and cost-efficiency of all infrastructure, following FinOps best practices.
  • Automate deployments through CI/CD pipelines using tools such as Git, Jenkins, or Bitbucket.

Data Quality Governance & Optimization

  • Design and implement data quality frameworks covering monitoring, validation, and anomaly detection.
  • Build observability dashboards to track pipeline health and proactively address issues.
  • Ensure compliance with data governance policies, privacy regulations, and security standards.
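
A data-quality validation step of the kind described above can be as simple as row-level rule checks. The sketch below is illustrative; `validate_rows` and its rule format are assumptions, not a named framework:

```python
def validate_rows(rows, required, checks):
    """Return a list of (row_index, message) for rows that violate
    quality rules.

    required: field names that must be present and non-None.
    checks:   mapping of field name -> predicate that must hold
              whenever the field has a value.
    """
    errors = []
    for i, row in enumerate(rows):
        for field in required:
            if row.get(field) is None:
                errors.append((i, f"missing {field}"))
        for field, ok in checks.items():
            value = row.get(field)
            if value is not None and not ok(value):
                errors.append((i, f"bad {field}: {value!r}"))
    return errors
```

In a real pipeline these violation records would feed the observability dashboards and anomaly alerts the posting mentions, rather than being returned in memory.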

Collaboration & Project Delivery

  • Work closely with cross-functional stakeholders, including data scientists, analysts, DevOps engineers, product managers, and business teams.
  • Communicate technical solutions effectively to non-technical stakeholders.
  • Manage multiple concurrent projects, shifting priorities quickly and delivering under tight timelines.
  • Collaborate within a globally distributed team, with real-time engagement through 2 p.m. U.S. Eastern Time.

Qualifications & Certifications

Education

  • Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.

Experience

  • Minimum 7 years in data engineering, including 5 years of hands-on GCP experience.
  • Proven track record with tools and services such as BigQuery, Cloud Composer (Apache Airflow), Cloud Functions, Pub/Sub, Cloud Storage, Dataflow, and IAM/VPC.
  • Demonstrated expertise in Apache Spark (batch and streaming), PySpark, and building scalable API integrations.
  • Advanced Airflow skills, including custom operators, dynamic DAGs, and workflow performance tuning.

Certifications

  • Google Cloud Professional Data Engineer certification preferred.

Key Skills

Mandatory Technical Skills

  • Advanced Python (PySpark, pandas, pytest) for automation and data pipelines.
  • Strong SQL, with experience in window functions, CTEs, partitioning, and optimization.
  • Proficiency in GCP services, including BigQuery, Dataflow, Cloud Composer, Cloud Functions, and Cloud Storage.
  • Hands-on Apache Airflow experience, including dynamic DAGs, retries, and SLA enforcement.
  • Expertise in API data ingestion, Postman collections, and REST/GraphQL integration workflows.
  • Familiarity with CI/CD workflows using Git, Jenkins, or Bitbucket.
  • Experience with infrastructure security and governance using IAM and VPC.
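
The SQL skills listed above (CTEs and window functions) can be illustrated with a small, self-contained query. SQLite is used here only so the example runs anywhere; the table and column names are invented for the demo, and the same SQL shape applies in BigQuery:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (user_id INTEGER, amount REAL);
    INSERT INTO events VALUES (1, 10), (1, 30), (2, 5);
""")

# The CTE aggregates per-user spend; the window function then
# ranks users by total without a second GROUP BY pass.
rows = conn.execute("""
    WITH totals AS (
        SELECT user_id, SUM(amount) AS total
        FROM events
        GROUP BY user_id
    )
    SELECT user_id, total,
           RANK() OVER (ORDER BY total DESC) AS spend_rank
    FROM totals
    ORDER BY spend_rank
""").fetchall()
# rows -> [(1, 40.0, 1), (2, 5.0, 2)]
```

In BigQuery the same pattern scales by adding `PARTITION BY` to the window and partitioning/clustering the underlying table, which is where the optimization experience the posting asks for comes in.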

Nice-to-Have Skills

  • Experience with Terraform or Kubernetes (GKE).
  • Familiarity with data visualization tools such as Looker or Tableau.
  • Exposure to MarTech/AdTech data sources and campaign analytics.
  • Knowledge of machine learning workflows and their integration with data pipelines.
  • Experience with other cloud platforms like AWS or Azure.

Soft Skills

  • Strong problem-solving and critical-thinking abilities.
  • Excellent verbal and written communication skills to engage technical and non-technical stakeholders.
  • Proactive and adaptable with a continuous learning mindset.
  • Ability to work independently as well as within a collaborative, distributed team.

Working Hours

  • Must be available for real-time collaboration with U.S. stakeholders every business day through 2 p.m. U.S. Eastern Time (minimum 4-hour overlap).

Location:

DGS India - Bengaluru - Manyata H2 block

Brand:

Merkle

Time Type:

Full time

Contract Type:

Permanent

Required Experience:

Senior IC

Employment Type

Full-Time
