Data Engineer (Python, PySpark, Snowflake, Cloud) | Scalable Data Pipelines & Cloud Migration Support

Synechron


Job Location: Pune, India

Monthly Salary: Not Disclosed
Posted on: Yesterday
Vacancies: 1 Vacancy

Job Summary

Synechron is seeking a highly experienced Snowflake Support Engineer with expertise in Python, Databricks, and PySpark to support enterprise data solutions. This role involves managing and optimizing data pipelines, supporting cloud data platforms, and ensuring system security, reliability, and performance. The successful candidate will collaborate with cross-functional teams to implement best practices, improve operational efficiency, and ensure data governance and compliance in a cloud environment.

Software Requirements

Required Software Proficiency:

  • Python (version 3.7): extensive experience scripting and automating data workflows

  • Databricks: hands-on experience managing data pipelines and performing data transformations within the Databricks environment

  • PySpark: strong skills in processing large datasets and developing scalable data transformations

  • Snowflake: proficiency in designing, deploying, and supporting data warehousing solutions, including data loading and performance tuning

  • SQL: advanced query writing and data validation skills supporting data reconciliation and monitoring

  • Cloud platforms (AWS, Azure, or GCP): deployment and integration support for cloud data services (preferred)

Preferred Software Skills:

  • CI/CD tools: Jenkins, Azure DevOps, or GitLab for automating deployment workflows

  • Monitoring tools: Prometheus, Grafana, or CloudWatch for system health and performance monitoring

  • Data governance and security tools supporting encryption, access controls, and compliance standards (GDPR, HIPAA)

Overall Responsibilities

  • Support and optimize scalable, secure, and high-performance data pipelines built within Snowflake and Databricks environments

  • Automate data workflows and pipeline operations to improve efficiency and reduce manual intervention

  • Troubleshoot and resolve data ingestion, transformation, and performance issues promptly

  • Collaborate with data engineers, data scientists, and business teams to implement data validation, security, and reconciliation processes

  • Support data migration activities, schema updates, and cloud deployments in line with enterprise data strategies

  • Monitor system performance, manage incident resolution, and optimize resource use for cost efficiency

  • Maintain comprehensive documentation of data pipeline architecture, security protocols, and operational procedures

  • Contribute to automation and infrastructure-as-code initiatives that enable scalable data environments
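The validation and reconciliation duties described above can be sketched with a minimal Python example. Plain Python stands in here for a PySpark/Snowflake workflow, and the function name, counts, and tolerance are illustrative, not taken from this posting:

```python
# Minimal row-count reconciliation check between a source and a target table.
# In practice the counts would come from a PySpark DataFrame count or a
# Snowflake query; plain integers stand in here for illustration.

def reconcile_counts(source_count: int, target_count: int,
                     tolerance: float = 0.0) -> dict:
    """Compare row counts and report whether they match within a tolerance."""
    diff = abs(source_count - target_count)
    allowed = int(source_count * tolerance)  # rows of drift permitted
    return {
        "source": source_count,
        "target": target_count,
        "difference": diff,
        "passed": diff <= allowed,
    }

# Example: an exact-match check (tolerance 0) that fails by one row.
result = reconcile_counts(1_000_000, 999_999)
print(result["passed"])  # prints False
```

A check like this would typically run after each load and feed an alerting or monitoring system rather than a print statement.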

Technical Skills (By Category)

  • Languages & Data Tools (Essential):

    • Python (3.7), PySpark, and SQL: core skills for scripting, data processing, and validation

    • Snowflake: data warehousing, schema design, and query optimization

  • Databases & Data Management:

    • Snowflake and relational databases (Oracle, SQL Server, PostgreSQL) supporting enterprise data management

    • Data governance, metadata management, and lineage tools supporting compliance frameworks

  • Cloud & Infrastructure:

    • AWS, Azure, or GCP cloud services supporting data integration, storage, and migration (preferred)

    • Infrastructure as Code: Terraform, CloudFormation (preferred)

  • Monitoring & Automation:

    • Prometheus, Grafana, and CloudWatch for system monitoring and alerting

    • CI/CD pipelines (Jenkins, Azure DevOps) for automating deployment and testing

  • Security & Data Governance:

    • Knowledge of encryption protocols, role-based access control, and compliance standards such as GDPR and HIPAA

Experience Requirements

  • Minimum of 6 years supporting enterprise cloud data platforms, big data pipelines, and data warehousing solutions

  • Proven experience designing, deploying, and optimizing scalable data pipelines in cloud environments built on Snowflake and Databricks

  • Strong understanding of cloud migration, data security, and compliance practices in regulated industries (preferred)

  • Hands-on experience with data validation, reconciliation, and automation of data workflows

  • Demonstrated ability to troubleshoot, optimize, and support high-availability data systems in enterprise environments

Day-to-Day Activities

  • Develop, test, and support scalable data pipelines supporting analytics, migration, and compliance initiatives

  • Collaborate with data science, application, and business teams to refine data workflows and support ongoing improvements

  • Troubleshoot and resolve performance bottlenecks, data inconsistencies, and security issues

  • Support cloud data migration, schema updates, and operational automation processes

  • Monitor system health, generate operational reports, and implement security and compliance measures

  • Maintain technical documentation, architecture diagrams, and operational runbooks

  • Automate deployment pipelines, infrastructure provisioning, and system updates using Terraform, Jenkins, or similar tools
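The monitoring and reporting activities above can be sketched with a minimal Python example. The job names, thresholds, and alert format are hypothetical; in practice the results would feed a tool such as CloudWatch or Grafana rather than a print loop:

```python
# Minimal sketch of a pipeline health check producing alert lines.
# Job names, durations, and the one-hour limit are illustrative only.

from dataclasses import dataclass


@dataclass
class JobRun:
    name: str
    duration_s: float
    rows_loaded: int
    succeeded: bool


def health_report(runs: list[JobRun], max_duration_s: float = 3600) -> list[str]:
    """Return one alert line per job that failed or exceeded the time limit."""
    alerts = []
    for run in runs:
        if not run.succeeded:
            alerts.append(f"ALERT {run.name}: job failed")
        elif run.duration_s > max_duration_s:
            alerts.append(
                f"ALERT {run.name}: ran {run.duration_s:.0f}s "
                f"(limit {max_duration_s:.0f}s)"
            )
    return alerts


runs = [
    JobRun("load_orders", 1200.0, 5_000_000, True),
    JobRun("load_customers", 4000.0, 80_000, True),
    JobRun("load_payments", 300.0, 0, False),
]
for line in health_report(runs):
    print(line)
```

A scheduled version of this check could emit metrics to a monitoring backend, keeping operational reporting automated rather than manual.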

Qualifications

  • Bachelor's or Master's degree in Data Engineering, Computer Science, or a related field

  • 6 years supporting enterprise data platforms, cloud data migration, and scalable data pipeline deployment

  • Relevant certifications (SnowPro, AWS Data Analytics, GCP Professional Data Engineer, Azure Data Engineer) are preferred

  • Proven track record of managing large-scale, compliant, and secure data environments supporting critical business operations

Professional Competencies

  • Strong analytical and troubleshooting skills for complex data workflows and security issues

  • Leadership qualities to guide junior team members and support best practices

  • Effective communication skills to engage stakeholders and document processes clearly

  • Adaptability to evolving cloud technologies, security standards, and industry regulations

  • Focus on operational security, data quality, and system reliability

SYNECHRON'S DIVERSITY & INCLUSION STATEMENT

Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative, Same Difference, is committed to fostering an inclusive culture, promoting equality and diversity, and creating an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, more successful businesses as a global company. We encourage applicants from diverse backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, and disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.


All employment decisions at Synechron are based on business needs, job requirements, and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disability or veteran status, or any other characteristic protected by law.

Candidate Application Notice

Required Experience: IC


About Company


At Synechron, we believe in the power of digital to transform businesses for the better. Our global consulting firm combines creativity and innovative technology to deliver leading digital solutions. Progressive technologies and strategies ...
