Data Engineer (Python, PySpark, Snowflake, Databricks) | Cloud Data Pipelines, Automation & Compliance

Synechron


Job Location:

Pune - India

Monthly Salary: Not Disclosed
Posted on: 17 hours ago
Vacancies: 1 Vacancy

Job Summary
Synechron is seeking an experienced Snowflake Support Engineer with strong expertise in Python, Databricks, and PySpark to support enterprise data platform operations. The role involves managing Snowflake data warehousing solutions, optimizing data workflows, and providing technical support to ensure high availability, security, and performance for critical business processes. The ideal candidate will drive automation, guide technical teams, and stay current with industry trends to enhance data platform reliability and scalability.

Software Requirements

Required Software Proficiency:

  • Python (version 3.7): extensive experience in scripting, automation, and data processing

  • Databricks: hands-on experience managing and developing data pipelines within Databricks environments

  • PySpark: strong skills in large-scale data processing and transformation workflows

  • Snowflake: expertise in data warehousing, data loading, schema design, and performance tuning

  • SQL: advanced query writing, optimization, and data validation

  • Git and version control: GitHub or Bitbucket for code management and collaboration

  • Cloud platforms (AWS, Azure, GCP): familiarity with deploying and managing data solutions backed by cloud services

Preferred Software Skills:

  • Automation and orchestration tools: Jenkins, Azure DevOps, or Terraform for CI/CD and infrastructure automation

  • Monitoring tools: Prometheus, Grafana, or CloudWatch for system health and performance monitoring

  • Data modeling and metadata management tools supporting data governance standards
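The Python, SQL, and data-validation skills listed above often come together in row-level quality checks run before loading data into the warehouse. As a purely illustrative sketch (plain Python standing in for a PySpark or Snowflake workflow; all names here are hypothetical, not part of the role's actual codebase):

```python
# Hypothetical sketch: split incoming rows into valid and rejected sets
# based on required, non-null fields. In practice this logic would run in
# PySpark or inside Snowflake; plain Python keeps the example self-contained.

def validate_rows(rows, required_fields):
    """Return (valid, rejected) lists based on required non-null fields."""
    valid, rejected = [], []
    for row in rows:
        if all(row.get(field) is not None for field in required_fields):
            valid.append(row)
        else:
            rejected.append(row)
    return valid, rejected

rows = [
    {"id": 1, "amount": 10.5},
    {"id": 2, "amount": None},   # fails the null check
    {"id": 3, "amount": 7.0},
]
valid, rejected = validate_rows(rows, required_fields=["id", "amount"])
```

Rejected rows would typically be written to a quarantine table and surfaced through monitoring rather than silently dropped.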

Overall Responsibilities

  • Support and maintain scalable, secure, and high-performance data pipelines in Snowflake and Databricks environments

  • Automate operational workflows, data load processes, and pipeline health monitoring to improve efficiency and reliability

  • Troubleshoot and resolve data ingestion, transformation, and performance issues promptly

  • Collaborate with data engineers, data scientists, and business analysts to translate requirements into optimized data workflows

  • Support schema design, data validation, and data security compliance in line with industry regulations (GDPR, HIPAA, etc.)

  • Monitor, analyze, and optimize the resource consumption and performance of data workflows

  • Develop and maintain documentation, including process flows, operational procedures, and best practices

  • Lead automation and continuous delivery efforts through CI/CD pipelines and infrastructure-as-code frameworks

  • Support data migrations, system upgrades, and audits for data security and compliance
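Several of the responsibilities above (automating load processes, monitoring pipeline health, resolving failures promptly) commonly reduce to wrapping flaky steps in retry-and-alert logic. A minimal stdlib sketch under that assumption, with every function name hypothetical:

```python
import time

def run_with_retry(step, max_attempts=3, base_delay=1.0, sleep=time.sleep):
    """Run a pipeline step, retrying with exponential backoff on failure.

    Returns the step's result; re-raises the last error once attempts are
    exhausted so upstream monitoring can alert on it.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return step()
        except Exception:
            if attempt == max_attempts:
                raise
            sleep(base_delay * 2 ** (attempt - 1))  # 1s, 2s, 4s, ...

# Example: a load step that fails twice with a transient error, then succeeds.
calls = {"n": 0}

def flaky_load():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient warehouse error")
    return "loaded"

result = run_with_retry(flaky_load, sleep=lambda _: None)  # skip real sleeps
```

In a real deployment the retry wrapper would live in an orchestrator (e.g., a Databricks job or CI/CD pipeline) and the final failure would page on-call rather than just re-raise.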

Technical Skills (By Category)

  • Languages & Data Tools (Essential):

    • Python (3.7), PySpark, and SQL for scripting, data transformation, and automation

    • Snowflake data warehousing, schema management, and query optimization

  • Databases & Data Management:

    • Snowflake and relational databases; data modeling and schema design

    • Metadata and data lineage tools supporting data governance and compliance

  • Cloud & Infrastructure:

    • AWS, Azure, or GCP data services supporting cloud-native data architectures (preferred)

    • Infrastructure-as-code tools: Terraform, CloudFormation (preferred)

  • Monitoring & Automation:

    • Prometheus, Grafana, and CloudWatch for system monitoring and alerting

    • Jenkins, Azure DevOps, or similar tools for CI/CD pipeline automation

  • Security & Data Governance:

    • Knowledge of encryption standards, access controls, and compliance frameworks (GDPR/HIPAA)

Experience Requirements

  • 6 years of experience supporting enterprise data platforms, with a focus on Snowflake, Databricks, and cloud data workflows

  • Proven experience optimizing large-scale data pipelines for performance and cost efficiency

  • Demonstrated ability to troubleshoot, automate, and support high-availability data systems in cloud environments

  • Experience supporting data migration projects, schema management, and data security compliance in regulated industries (preferred)

  • Familiarity with enterprise metadata management and data governance standards

Day-to-Day Activities

  • Develop, support, and optimize data pipelines in Snowflake and Databricks environments

  • Automate ingestion, transformation, and validation workflows supporting business analytics and compliance

  • Monitor system health, troubleshoot errors, and implement performance tuning and security improvements

  • Collaborate with data engineers, data scientists, and business stakeholders to refine workflows

  • Support data migration, configuration, and operational readiness activities

  • Conduct root cause analyses, incident response, and performance reviews

  • Maintain operational documentation, including architecture diagrams, runbooks, and data policies

  • Support infrastructure automation and CI/CD pipelines for deployment and upgrades

Qualifications

  • Bachelor's or Master's degree in Data Engineering, Computer Science, or a related field

  • 6 years of experience supporting, deploying, and managing enterprise data solutions in cloud environments built on Snowflake and Databricks

  • Relevant certifications such as SnowPro, AWS Data Analytics, or GCP Professional Data Engineer are a plus

  • Experience working in regulated industries with data security, privacy, and compliance requirements

Professional Competencies

  • Strong analytical and troubleshooting skills for complex data systems

  • Leadership qualities to guide junior team members and support best practices in automation and performance optimization

  • Excellent communication for stakeholder engagement and documentation

  • Adaptability to evolving cloud technologies, data security standards, and industry regulations

  • Focus on operational security, data quality, and system reliability

  • Time management skills to handle multiple tasks and ensure timely delivery in a fast-paced environment

SYNECHRON'S DIVERSITY & INCLUSION STATEMENT

Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity and Inclusion (DEI) initiative, Same Difference, is committed to fostering an inclusive culture, promoting equality and diversity, and maintaining an environment that is respectful to all. We strongly believe that, as a global company, a diverse workforce helps build stronger, more successful businesses. We encourage applicants from diverse backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, and disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.


All employment decisions at Synechron are based on business needs, job requirements, and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disability or veteran status, or any other characteristic protected by law.

Required Experience:

IC


About Company


At Synechron, we believe in the power of digital to transform businesses for the better. Our global consulting firm combines creativity with innovative technology to deliver leading digital solutions. Progressive technologies and strate ...
