GCP Data Engineer with Cloud-Native Architecture and Lakehouse Expertise
Job Summary
Synechron is seeking a skilled and experienced Google Cloud Platform (GCP) Data Engineer to lead data architecture, pipeline development, and cloud data solutions supporting enterprise analytics and digital initiatives. This role involves designing, building, and optimizing scalable, secure data pipelines leveraging GCP-native services such as BigQuery, Dataflow, Dataproc, and Pub/Sub. The candidate will work within cross-functional teams to implement modern lakehouse architectures, facilitate data streaming, and support enterprise data strategies aligned with organizational goals and regulatory standards.
Software Requirements
Required:
Hands-on experience with GCP data services including BigQuery, Dataflow, Dataproc, and Datastream
Proficiency in Python for data pipeline development, orchestration, and automation
Strong SQL skills for data modeling, validation, and query optimization in BigQuery and related databases
Experience in building and managing scalable data pipelines and workflows on GCP
Familiarity with GCP-native architecture components: Pub/Sub, IAM, Data Catalog, and data streaming solutions
Preferred:
Experience with Delta Lake and lakehouse architectures within cloud ecosystems
Knowledge of data governance, security, and compliance in cloud environments
Exposure to AI-enabled data solutions and data science frameworks
Experience utilizing orchestration tools such as Cloud Composer (based on Apache Airflow)
Overall Responsibilities
Design, deploy, and optimize scalable, secure data pipelines using GCP data services such as BigQuery, Dataflow, and Dataproc.
Collaborate with data analysts, data scientists, and business stakeholders to translate data needs into reliable cloud data architectures.
Manage end-to-end data workflows, including ingestion, transformation, and storage, ensuring high data quality and processing efficiency.
Implement and monitor data validation routines, ensuring data accuracy, security, and compliance with data governance policies.
Automate and orchestrate data workflows using Cloud Composer, Dataflow, and other automation tools to enhance operational efficiency.
Support enterprise data lakes, data warehouses, and lakehouse architectures, ensuring high performance and cost-effectiveness.
Troubleshoot pipeline issues, perform root cause analysis, and continually optimize data processes.
Support cloud infrastructure security controls, access management, and compliance standards.
Stay current with industry advancements in cloud data management, analytics, and data engineering best practices.
Technical Skills (By Category)
Data Platform & Storage (Essential):
GCP data services: BigQuery, Dataflow, Dataproc, Datastream, Cloud Storage, Pub/Sub
Familiarity with lakehouse architectures, Delta Lake, and data lifecycle management
Data Processing & Automation (Essential):
Python scripting for data pipeline development and automation
SQL queries and data validation techniques in big data environments
Cloud & Architecture (Essential):
GCP-native architecture, including IAM, data streaming, security policies, and data governance
Orchestration & Workflow Management (Preferred):
Cloud Composer (Apache Airflow) and Dataflow pipeline orchestration
Development & CI/CD (Preferred):
Infrastructure as Code (Terraform, Deployment Manager)
CI/CD pipelines via Jenkins, GitLab CI, or similar tools
Data & Analytics (Preferred):
Familiarity with data lakes, data warehouses, and Delta Lake integrations
Experience Requirements
3 years supporting or developing data pipelines in GCP cloud environments.
Proven experience designing scalable, secure, and high-performance data architectures within cloud ecosystems.
Hands-on experience with GCP's data services: BigQuery, Dataflow, Dataproc, Pub/Sub, and Datastream.
Strong Python and SQL skills for data processing, modeling, and validation.
Knowledge of enterprise data governance, security, and compliance standards in cloud environments.
Experience supporting financial or regulated-industry data projects is a plus.
Alternative pathways include extensive hands-on experience in cloud data processing, lakehouse architectures, and enterprise big data solutions.
Day-to-Day Activities
Design, develop, and optimize scalable data pipelines across cloud platforms.
Collaborate with data teams and stakeholders to understand data requirements and deliver cloud-based solutions.
Automate data extraction, transformation, and loading workflows to improve efficiency and reliability.
Monitor data pipeline health and performance, troubleshoot issues, and implement enhancements.
Support data lake, data warehouse, and lakehouse architecture deployment, ensuring security and compliance.
Conduct root cause analysis for data processing failures and performance bottlenecks.
Implement and maintain security controls, data privacy standards, and governance policies.
Engage with industry trends and emerging cloud data technologies to recommend strategic improvements.
Qualifications
Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
3 years of experience working in cloud data environments, specifically GCP.
Deep knowledge of GCP data services: BigQuery, Dataflow, Dataproc, Datastream, and Pub/Sub.
Strong Python and SQL skills for data pipeline development and management.
Certifications in GCP (e.g., Google Cloud Professional Data Engineer or Professional Cloud Architect) are advantageous.
Experience designing and implementing lakehouse architectures or Delta Lake solutions is a plus.
Proven ability to support large-scale data infrastructure with security, compliance, and operational excellence.
Professional Competencies
Critical thinking and problem-solving skills to troubleshoot data pipeline challenges.
Ability to work independently and within cross-functional teams.
Strong communication to translate complex data needs into technical solutions.
Adaptability to evolving cloud platforms, tools, and industry standards.
Ownership of data quality, security, and operational efficiency.
Continuous learner committed to staying updated on cloud data management innovations.
SYNECHRON'S DIVERSITY & INCLUSION STATEMENT
Diversity and inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative, Same Difference, is committed to fostering an inclusive culture, promoting equality and diversity, and maintaining an environment that is respectful to all. We strongly believe that a diverse workforce helps us build stronger, more successful businesses as a global company. We encourage applicants from diverse backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, or disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.
All employment decisions at Synechron are based on business needs, job requirements, and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.
Required Experience:
Staff IC
About Company
At Synechron, we believe in the power of digital to transform businesses for the better. Our global consulting firm combines creativity and innovative technology to deliver leading digital solutions. Progressive technologies and strategies ...