Data Engineer | GCP, Big Data, Spark, ETL/ELT, Data Security, Cloud Infrastructure

Synechron


Job Location:

Gurgaon - India

Monthly Salary: Not Disclosed
Posted on: 30+ days ago
Vacancies: 1 Vacancy

Job Summary
Synechron is seeking a highly skilled Data Engineer specializing in Google Cloud Platform (GCP) and Big Data technologies to architect, develop, and optimize scalable data pipelines and data management solutions. This role is pivotal in supporting enterprise data initiatives, ensuring data quality, security, and accessibility for critical business insights. The successful candidate will collaborate across teams to drive data platform innovations and operational excellence aligned with organizational goals.

Software Requirements

  • Required:

    • Extensive experience with GCP services, including BigQuery, Dataflow, Cloud Storage, and Cloud Pub/Sub

    • Strong proficiency in Apache Spark for distributed data processing and analytics

    • Hands-on expertise in building and maintaining data pipelines using ETL/ELT processes

    • Proficiency in Python for data scripting, automation, and orchestration tasks

    • Experience with distributed data storage and management, including relational and NoSQL databases (PostgreSQL, MySQL, MongoDB)

    • Familiarity with version control tools such as Git

    • Knowledge of Linux/Unix environments for data processing and scripting

  • Preferred:

    • Experience with data governance, metadata management, and data security best practices

    • Knowledge of data orchestration tools such as Apache Airflow or Prefect

    • Understanding of containerization (Docker) and orchestration (Kubernetes) for data deployment
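To illustrate the kind of ETL/ELT pipeline work this role involves, here is a minimal sketch of an extract-transform-load step in pure Python. The record shape, field names, and cleaning rules are hypothetical examples, not requirements from this posting; in practice the sink would be BigQuery or Cloud Storage rather than an in-memory list.

```python
from datetime import datetime, timezone

def extract(raw_rows):
    """Extract: yield raw dict records from an upstream source (here, an in-memory list)."""
    yield from raw_rows

def transform(rows):
    """Transform: drop incomplete records and normalise field types."""
    for row in rows:
        if not row.get("user_id") or row.get("amount") is None:
            continue  # data-quality gate: skip incomplete records
        yield {
            "user_id": str(row["user_id"]).strip(),
            "amount": round(float(row["amount"]), 2),
            "loaded_at": datetime.now(timezone.utc).isoformat(),
        }

def load(rows, sink):
    """Load: append cleaned rows to a sink and return the row count."""
    count = 0
    for row in rows:
        sink.append(row)
        count += 1
    return count

raw = [
    {"user_id": " u1 ", "amount": "19.991"},
    {"user_id": None, "amount": "5.00"},   # rejected: missing user_id
    {"user_id": "u2", "amount": 7},
]
sink = []
loaded = load(transform(extract(raw)), sink)
print(loaded)  # 2
```

Because the three stages are generators chained together, records stream through one at a time, which mirrors how distributed engines such as Spark or Dataflow process data without materialising intermediate results.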

Overall Responsibilities

  • Design, develop, and maintain scalable, resilient data pipelines on GCP to support enterprise analytics and reporting solutions

  • Collaborate with business stakeholders, data scientists, and analytics teams to understand data requirements and implement optimized data workflows

  • Implement and enforce data quality, security, and governance standards across platforms

  • Optimize data ingestion, transformation, and processing workflows to ensure high performance and cost efficiency

  • Perform data profiling, troubleshooting, and resolution of pipeline issues to ensure operational reliability

  • Stay current with emerging data engineering best practices, tools, and industry trends, leading continuous improvement initiatives
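The data profiling and quality-enforcement responsibilities above can be sketched as a small pure-Python check. The column name and null-rate threshold are hypothetical; real pipelines would run equivalent checks against BigQuery tables or within a framework such as Great Expectations.

```python
def profile_column(rows, column):
    """Compute basic profile statistics (row count, nulls, distinct values) for one column."""
    values = [r.get(column) for r in rows]
    non_null = [v for v in values if v is not None]
    return {
        "count": len(values),
        "nulls": len(values) - len(non_null),
        "distinct": len(set(non_null)),
    }

def null_rate_ok(profile, max_null_rate=0.1):
    """Data-quality gate: pass only if the null rate stays under the threshold."""
    if profile["count"] == 0:
        return True
    return (profile["nulls"] / profile["count"]) <= max_null_rate

rows = [{"email": "a@x.com"}, {"email": None}, {"email": "b@x.com"}, {"email": "a@x.com"}]
p = profile_column(rows, "email")
print(p)                # {'count': 4, 'nulls': 1, 'distinct': 2}
print(null_rate_ok(p))  # False: null rate 0.25 exceeds the 0.1 threshold
```

A gate like this would typically run as a pipeline step that fails fast, preventing low-quality data from propagating downstream.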

Technical Skills (By Category)

  • Programming Languages:
    Required: Python for scripting and orchestration
    Preferred: SQL, Java (for integration or data processing tasks)

  • Databases & Data Management:
    BigQuery, PostgreSQL, MySQL, MongoDB; data modeling, data security, and query optimization

  • Cloud Technologies:
    GCP services including BigQuery, Dataflow, Cloud Storage, Pub/Sub, IAM, and Cloud Functions deployment

  • Frameworks & Libraries:
    Apache Spark, Dataflow, Airflow (preferred); TensorFlow or PyTorch (if ML integrations are involved)

  • Development & Orchestration Tools:
    Git, Jenkins, Docker, Kubernetes, Terraform, Apache Airflow, and other CI/CD tools

  • Security & Compliance:
    Implementing security policies, data encryption, access controls, and compliance standards such as GDPR or HIPAA
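In code, the security and compliance skills listed above often come down to pseudonymising sensitive fields before data leaves a trusted boundary. Below is a minimal sketch using Python's standard hmac and hashlib modules; the field names are hypothetical, and a real deployment would fetch the key from a managed secret store (e.g. Cloud KMS or Secret Manager) rather than hard-coding it.

```python
import hashlib
import hmac

# Hypothetical key for illustration only; load from a KMS/secret manager in production.
SECRET_KEY = b"replace-with-a-secret-from-your-kms"

def pseudonymise(value: str) -> str:
    """Deterministically pseudonymise a sensitive value with a keyed hash (HMAC-SHA256)."""
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

def scrub(record: dict, sensitive_fields=("email", "ssn")) -> dict:
    """Return a copy of the record with sensitive string fields replaced by keyed hashes."""
    return {
        k: pseudonymise(v) if k in sensitive_fields and isinstance(v, str) else v
        for k, v in record.items()
    }

row = {"user_id": "u1", "email": "a@x.com", "amount": 19.99}
safe = scrub(row)
print(safe["user_id"], safe["amount"])  # non-sensitive fields pass through unchanged
print(len(safe["email"]))               # 64-character hex digest replaces the raw email
```

Using a keyed hash (rather than plain SHA-256) keeps the pseudonyms deterministic for joins while preventing dictionary attacks by anyone without the key, which is relevant for GDPR- and HIPAA-style compliance requirements.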

Experience Requirements

  • Minimum of 6 years of practical experience in data engineering with a significant focus on GCP and Big Data ecosystems

  • Hands-on experience designing and implementing end-to-end data pipelines and workflows at scale

  • Proven expertise in distributed data processing, data security, and infrastructure automation

  • Experience working in agile teams supporting enterprise or large-scale data platforms

  • Industry experience in the finance, healthcare, retail, or technology sectors is advantageous

Day-to-Day Activities

  • Develop and optimize data pipelines, ensuring high performance, data quality, and security

  • Collaborate with analytics, data science, and cross-functional teams to translate business requirements into scalable data solutions

  • Automate data workflows, manage infrastructure as code, and support deployment pipelines

  • Perform data profiling and troubleshooting to resolve pipeline issues promptly

  • Implement security best practices for data access, encryption, and compliance

  • Document architecture, workflows, procedures, and operational guidelines

  • Participate in sprint planning, reviews, and continuous improvement initiatives

Qualifications

  • Bachelor's or Master's degree in Computer Science, Data Science, Information Technology, or a related field

  • Extensive experience with GCP and Big Data processing frameworks such as Hadoop, Spark, and Dataflow

  • Certifications in GCP (e.g., Professional Data Engineer), AWS, or Azure are a plus

  • Proven ability to develop robust, scalable, and secure data pipelines supporting enterprise analytics

Professional Competencies

  • Strong analytical and problem-solving skills, especially with distributed data systems

  • Excellent collaboration and communication skills to work effectively across teams and stakeholders

  • Leadership qualities to guide data engineering best practices and mentor junior team members

  • Strategic mindset to align data platform development with organizational goals

  • Commitment to continuous learning to stay current with evolving data technologies and industry standards

  • Effective time management skills to handle multiple data projects simultaneously

SYNECHRON'S DIVERSITY & INCLUSION STATEMENT

Diversity and inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative, Same Difference, is committed to fostering an inclusive culture, promoting equality and diversity, and maintaining an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, more successful businesses as a global company. We encourage applicants from diverse backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, and disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.


All employment decisions at Synechron are based on business needs, job requirements, and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disability or veteran status, or any other characteristic protected by law.

Candidate Application Notice


Required Experience: IC


About Company


At Synechron, we believe in the power of digital to transform businesses for the better. Our global consulting firm combines creativity and innovative technology to deliver leading digital solutions. Progressive technologies and strategies ...
