Senior Data Engineer - GCP Native Platform
Location: Fully Remote (Pan-India)
Role Summary
We are seeking an experienced Senior Data Engineer to design and build scalable data pipelines on Google Cloud Platform. You will implement data solutions using BigQuery, Dataflow, Cloud Composer, and modern engineering practices while mentoring junior team members and driving technical excellence.
Key Responsibilities
Pipeline Development & Engineering
Build and optimize batch and streaming data pipelines using Dataflow (Apache Beam); a minimal Beam example follows this list.
Develop and maintain Airflow DAGs in Cloud Composer for workflow orchestration.
Implement ELT processes using Dataform (SQLX) for transformations, testing, and documentation.
Design and maintain BigQuery datasets following a medallion architecture (Bronze/Silver/Gold).
Create reusable pipeline templates and frameworks to accelerate development.
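Much of the Dataflow work above is authored in Apache Beam. As a rough illustration of that kind of pipeline, here is a minimal batch job in Python that reads a CSV from GCS and appends it to a Bronze-layer BigQuery table; the bucket, project, dataset, and field names are hypothetical, not part of this role's actual stack.

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_line(line: str) -> dict:
    # Hypothetical CSV layout: event_id,user_id,amount
    event_id, user_id, amount = line.split(",")
    return {"event_id": event_id, "user_id": user_id, "amount": float(amount)}

# Runs on the DirectRunner by default; pass --runner=DataflowRunner
# (plus project, region, temp_location) to execute on Dataflow.
with beam.Pipeline(options=PipelineOptions()) as pipeline:
    (
        pipeline
        | "Read" >> beam.io.ReadFromText("gs://example-bucket/events.csv", skip_header_lines=1)
        | "Parse" >> beam.Map(parse_line)
        | "Write" >> beam.io.WriteToBigQuery(
            "example-project:bronze.events",  # hypothetical table
            schema="event_id:STRING,user_id:STRING,amount:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```

Swapping the text source for a Pub/Sub read and adding windowing is the usual step from this batch shape to the streaming pipelines the role also covers.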
Data Platform Implementation
BigQuery: Design optimized schemas, partitioning strategies, clustering, and materialized views, and drive performance tuning (a partitioning sketch follows this list).
Dataflow: Build Apache Beam pipelines in Python and/or Java for complex transformations.
Cloud Composer: Manage Airflow workflows with dependency management and scheduling.
Dataform: Author SQLX transformations, tests, and documentation.
Pub/Sub: Implement event-driven ingestion and CDC (change data capture) patterns.
Cloud Storage (GCS): Architect data lake structures, access patterns, and lifecycle policies.
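For the BigQuery item, partitioning and clustering are the primary levers for scan cost and query performance. Below is a minimal sketch using the google-cloud-bigquery Python client to create a day-partitioned, clustered Silver-layer table; the project, dataset, and schema are illustrative assumptions only.

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

schema = [
    bigquery.SchemaField("event_id", "STRING"),
    bigquery.SchemaField("user_id", "STRING"),
    bigquery.SchemaField("event_ts", "TIMESTAMP"),
    bigquery.SchemaField("amount", "FLOAT"),
]

table = bigquery.Table("example-project.silver.events", schema=schema)
# Day-partition on the event timestamp so queries prune to the dates they touch.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="event_ts"
)
# Cluster on the most common filter column for further pruning within partitions.
table.clustering_fields = ["user_id"]

client.create_table(table, exists_ok=True)
```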
Technical Delivery
Partner with architects on technical design and platform standards.
Conduct code reviews and enforce engineering best practices.
Troubleshoot and optimize pipeline performance and cost.
Implement data quality checks, monitoring, and observability (a simple check is sketched after this list).
Support production deployments and incident resolution.
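On the data-quality point, one lightweight pattern is to assert invariants with a query and fail the run when they break. A sketch under assumed table and column names:

```python
from google.cloud import bigquery

def check_null_rate(
    client: bigquery.Client, table: str, column: str, max_rate: float = 0.01
) -> None:
    """Raise if the null rate of `column` in `table` exceeds `max_rate`."""
    # SAFE_DIVIDE returns NULL instead of erroring when the table is empty.
    query = (
        f"SELECT SAFE_DIVIDE(COUNTIF({column} IS NULL), COUNT(*)) AS null_rate "
        f"FROM `{table}`"
    )
    row = next(iter(client.query(query).result()))  # single-row result
    if row.null_rate is not None and row.null_rate > max_rate:
        raise ValueError(
            f"{table}.{column}: null rate {row.null_rate:.2%} exceeds {max_rate:.2%}"
        )

# Example with a hypothetical table:
# check_null_rate(bigquery.Client(), "example-project.silver.events", "user_id")
```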
Collaboration & Mentorship
Mentor junior engineers on GCP best practices and data engineering patterns.
Collaborate with analysts, data scientists, and cross-functional teams to enable data consumption.
Document technical solutions and maintain knowledge-base artifacts.
Participate in agile ceremonies and coordinate with onshore/offshore teams.
Required Qualifications
Experience
5 years of Data Engineering experience, including a minimum of 2 years working with GCP.
Hands-on experience with BigQuery at enterprise scale, including production pipelines.
Strong background in building production-grade batch and streaming pipelines.
Development experience in Python and/or Java.
Proven track record of delivering complex data solutions.
Core Technical Skills
BigQuery (SQL optimization, scripting, stored procedures, partitioning).
Apache Beam / Dataflow (batch & streaming).
Airflow / Cloud Composer (DAG development; a minimal DAG sketch follows this list).
SQL and Python programming proficiency.
Git, CI/CD pipelines, and basic DevOps familiarity.
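To illustrate the Airflow / Cloud Composer skill, here is a minimal Airflow 2.x DAG with two dependent tasks; the DAG id and task bodies are placeholders rather than a prescribed workflow.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract() -> None:
    print("pull source data")  # placeholder task body

def load() -> None:
    print("load into BigQuery")  # placeholder task body

with DAG(
    dag_id="example_daily_elt",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
    default_args={"retries": 1, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # load runs only after extract succeeds
```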
Additional Skills
Dataform or dbt experience.
Infrastructure-as-Code (Terraform basics).
Data modeling and schema design.
API integration (REST) and system interfacing.
Cloud Monitoring, logging, and data quality frameworks.
Preferred / Nice-to-Have
GCP Professional Data Engineer certification.
Experience with Vertex AI for ML pipeline integration.
Cloud Functions for serverless processing.
Dataproc / Spark workloads.
Real-time streaming architectures and low-latency processing.
Familiarity with data governance tools (Atlan, Collibra, Dataplex).
Experience with legacy system migrations.
BI tools experience (Power BI, Looker).
Containerization (Docker) and orchestration (Kubernetes).
What We're Looking For
Strong problem-solving and analytical skills.
Self-motivated and able to work independently in a remote environment.
Excellent communication, documentation, and stakeholder management skills.
Passion for data engineering, continuous learning, and mentoring others.
Collaborative team player with a delivery-focused mindset.
Required Skills:
DOCUMENTATION, KUBERNETES, GIT, APACHE BEAM, DEVOPS, POWER BI, PYTHON PROGRAMMING, SQL, CI/CD, TESTING, BIGQUERY