Lead Data Engineer - R

Job Location

Bangalore - India

Monthly Salary

Not Disclosed

Vacancy

1 Vacancy

Job Description

Lead Data Engineer

Primary Skills

    • GCP, BigQuery, Python, Airflow, SQL, dbt

Job requirements

    • About the Role
    • We are seeking a Senior Data Engineer with deep expertise in Google Cloud Platform (GCP) and BigQuery to lead cloud modernization initiatives, develop scalable data pipelines, and enable real-time data processing for enterprise-level systems. This is a high-impact role focused on driving the transformation of legacy infrastructure into a robust cloud-native data ecosystem.

    • Key Responsibilities
    • 1. Data Migration & Cloud Modernization
    • Analyze legacy on-premises and hybrid cloud data warehouse environments (e.g., SQL Server).
    • Lead the migration of large-scale datasets to Google BigQuery.
    • Design and implement data migration strategies, ensuring data quality, integrity, and performance.
    • 2. Data Integration & Streaming
    • Integrate data from various structured and unstructured sources, including APIs, relational databases, and IoT devices.
    • Build real-time streaming pipelines for large-scale ingestion and processing of IoT and telemetry data (a streaming-ingestion sketch follows this list).
    • 3. ETL / Data Pipeline Development
    • Modernize and refactor legacy SSIS packages into cloud-native ETL pipelines.
    • Develop scalable, reliable workflows using Apache Airflow, Python, Spark, and GCP-native tools (a minimal Airflow example also follows this list).
    • Ensure high-performance data transformation and loading into BigQuery for analytical use cases.
    • 4. Programming & Query Optimization
    • Write and optimize complex SQL queries, stored procedures, and scheduled jobs within BigQuery.
    • Develop modular, reusable transformation scripts using Python, Java, Spark, and SQL.
    • Continuously monitor and optimize query performance and cost efficiency in the cloud data environment.
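
To give a concrete picture of the streaming-ingestion responsibility above, here is a minimal sketch (Python) that pulls telemetry messages from a Pub/Sub subscription and streams them into BigQuery. The project, subscription, and table names are hypothetical placeholders, not details of this employer's environment.

    # Sketch: stream telemetry from Pub/Sub into BigQuery via streaming inserts.
    # All resource names below are placeholders.
    import json
    from concurrent.futures import TimeoutError

    from google.cloud import bigquery, pubsub_v1

    bq_client = bigquery.Client()
    subscriber = pubsub_v1.SubscriberClient()

    TABLE_ID = "example_project.analytics.telemetry_raw"  # placeholder table
    SUBSCRIPTION = subscriber.subscription_path("example_project", "telemetry-sub")  # placeholder subscription

    def handle_message(message: pubsub_v1.subscriber.message.Message) -> None:
        row = json.loads(message.data.decode("utf-8"))
        errors = bq_client.insert_rows_json(TABLE_ID, [row])  # streaming insert of one row
        if errors:
            print(f"Insert failed: {errors}")
            message.nack()  # redeliver on failure
        else:
            message.ack()

    streaming_pull = subscriber.subscribe(SUBSCRIPTION, callback=handle_message)
    try:
        streaming_pull.result(timeout=60)  # listen for one minute in this sketch
    except TimeoutError:
        streaming_pull.cancel()

At production scale this path is more commonly built on Dataflow or the BigQuery Storage Write API, but the sketch shows the basic shape of the ingestion flow.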
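
Likewise, a minimal sketch of the batch ETL work described in point 3: an Airflow DAG that loads daily files from Cloud Storage into BigQuery. The DAG name, bucket, dataset, and table are hypothetical; real pipelines would add schema management, data-quality checks, and downstream transformations.

    # Sketch: daily GCS-to-BigQuery load orchestrated by Airflow (2.4+).
    # Bucket, dataset, and table names are placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
        GCSToBigQueryOperator,
    )

    with DAG(
        dag_id="gcs_to_bigquery_daily_load",  # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        load_events = GCSToBigQueryOperator(
            task_id="load_events",
            bucket="example-landing-bucket",            # placeholder bucket
            source_objects=["events/{{ ds }}/*.json"],  # one folder per execution date
            destination_project_dataset_table="example_project.analytics.events",
            source_format="NEWLINE_DELIMITED_JSON",
            autodetect=True,                            # or supply explicit schema_fields
            write_disposition="WRITE_APPEND",
        )

In practice the target would usually be a partitioned, clustered table, with dbt or SQL transformation tasks chained after the load.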

    • Required Skills & Experience
    • 5 years in Data Engineering with a strong focus on cloud and big data technologies.
    • Minimum 2 years of hands-on experience with GCP, specifically BigQuery.
    • Proven experience migrating on-premises data systems to the cloud.
    • Strong development experience with Apache Airflow, Python, and Apache Spark.
    • Expertise in streaming data ingestion particularly in IoT or sensor data environments.
    • Strong SQL development skills; experience with BigQuery performance tuning (see the cost-estimation sketch after this list).
    • Solid understanding of cloud architecture, data modeling, and data warehouse design.
    • Familiarity with Git and CI/CD practices for managing data pipelines.
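
As an illustration of the BigQuery performance and cost tuning mentioned above, the sketch below uses the google-cloud-bigquery Python client to dry-run a query and report how many bytes it would scan; the table and partition column are placeholders.

    # Sketch: estimate query cost with a BigQuery dry run before executing it.
    # Project, dataset, table, and column names are placeholders.
    import datetime

    from google.cloud import bigquery

    client = bigquery.Client()  # uses Application Default Credentials

    sql = """
        SELECT device_id, AVG(temperature) AS avg_temp
        FROM `example_project.analytics.telemetry`
        WHERE reading_date = @reading_date  -- filtering on the partition column limits scanned bytes
        GROUP BY device_id
    """

    job_config = bigquery.QueryJobConfig(
        dry_run=True,  # validate and estimate without running the query
        use_query_cache=False,
        query_parameters=[
            bigquery.ScalarQueryParameter("reading_date", "DATE", datetime.date(2024, 1, 1)),
        ],
    )

    dry_run_job = client.query(sql, job_config=job_config)
    gib_scanned = dry_run_job.total_bytes_processed / 1024 ** 3
    print(f"Query would process {gib_scanned:.2f} GiB")

Partitioning and clustering the underlying tables, and filtering on the partition column as in the WHERE clause here, are the usual levers for keeping scanned bytes, and therefore on-demand cost, down.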

    • Preferred Qualifications
    • GCP Professional Data Engineer certification.
    • Experience with modern data stack tools such as dbt, Kafka, or Terraform.
    • Exposure to ML pipelines, analytics engineering, or DataOps/DevOps methodologies.

    • Why Join Us
    • Work with cutting-edge technologies in a fast-paced, collaborative environment.
    • Lead cloud transformation initiatives at scale.
    • Competitive compensation and benefits.
    • Remote flexibility and growth opportunities.

Employment Type

Full-Time
