Snowflake & Python Data Engineer | Cloud Data Pipelines | Data Modeling & Automation

Synechron


Job Location:

Bengaluru - India

Monthly Salary: Not Disclosed
Posted on: 20 hours ago
Vacancies: 1 Vacancy

Job Summary

Synechron is seeking a highly experienced Snowflake & Python Data Engineer to design, develop, and support scalable, high-performance data solutions leveraging cloud data platforms. This role involves building robust data pipelines, transforming large datasets, and automating data workflows to enable enterprise analytics and reporting. The ideal candidate will have strong expertise in Snowflake, Python, SQL, and data modeling, and will collaborate with data engineering, analytics, and business teams to deliver scalable, secure, and efficient data architectures aligned with organizational goals.

Software Requirements

Required:

  • Extensive hands-on experience with Snowflake for data warehousing, including schemas, views, and performance optimization techniques.

  • Strong proficiency in Python for data processing, automation, and integrations.

  • Advanced skills in SQL/PL/SQL for data querying, transformation, and performance tuning.

  • Knowledge of ETL/ELT concepts and data pipeline orchestration tools.

  • Experience with data integration between Snowflake and upstream/downstream systems.

  • Familiarity with cloud platforms (preferably AWS, Azure, or GCP) for deploying and managing data solutions.

  • Experience with version control tools such as Git and with automation in CI/CD pipelines.

Preferred:

  • Exposure to Terraform or CloudFormation for infrastructure provisioning.

  • Experience using data orchestration tools such as Apache Airflow or Prefect.

  • Knowledge of data security protocols, role-based access, and compliance standards.

  • Experience with containerization (Docker) and orchestration (Kubernetes) for data workflows.

Overall Responsibilities

  • Design, develop, and optimize large-scale data pipelines and transformation workflows on Snowflake and cloud platforms.

  • Build and maintain scalable, efficient, and secure data models and databases supporting analytics and reporting.

  • Develop and fine-tune Python scripts for data automation, validation, and integration tasks.

  • Collaborate with data engineers, analytics teams, and business stakeholders to understand data requirements and deliver insights.

  • Monitor the performance of data pipelines and optimize queries and workloads for efficiency.

  • Implement data quality, governance, and security best practices across data platforms.

  • Automate deployment, testing, and infrastructure provisioning using CI/CD and IaC tools.

  • Stay current on emerging data technologies, tools, and best practices, recommending improvements.

Performance Outcomes:
Reliable, scalable, and secure data platforms enabling faster insights, better decision-making, and improved operational efficiency.

Technical Skills (By Category)

Programming Languages:

  • Essential: Python, SQL, PL/SQL

  • Preferred: Shell scripting, Java, or other languages for automation

Data & Database Management:

  • Essential: Snowflake data warehouse; data modeling (star/snowflake schemas)

  • Preferred: NoSQL databases (MongoDB, DynamoDB)

Cloud Technologies:

  • Preferred: AWS (S3, Lambda, Glue), Azure, GCP

Frameworks & Libraries:

  • Essential: SQLAlchemy, pandas, and NumPy (DataFrames) in Python

  • Preferred: Airflow, Apache Spark, PySpark

Development Tools & Methodologies:

  • Essential: Git, CI/CD pipelines, Terraform, Docker

  • Preferred: Kubernetes, CloudFormation, DataOps tools

Security & Compliance:

  • Strong understanding of data security practices, encryption, role-based access controls, and regulatory standards such as GDPR or HIPAA.

Experience Requirements

  • Minimum of 7 years supporting enterprise data systems, with a focus on Snowflake and Python-based data solutions.

  • Proven experience designing, deploying, and managing large-scale data pipelines in cloud environments.

  • Extensive hands-on experience optimizing data queries and models in Snowflake.

  • Demonstrable success building automated data workflows, integrations, and transformations.

  • Experience working in agile, fast-paced enterprise environments, preferably in the finance, healthcare, or retail sector.

Day-to-Day Activities

  • Develop, test, and deploy scalable data pipelines on Snowflake, integrated with cloud platforms.

  • Write and optimize Python scripts and SQL queries for data ingestion, transformation, and automation.

  • Collaborate with cross-functional teams to understand data needs and deliver reliable solutions.

  • Monitor data pipeline performance and troubleshoot issues in real time.

  • Perform data quality checks, validation, and security reviews.

  • Automate infrastructure provisioning and deployment processes as part of CI/CD workflows.

  • Document data architecture, schemas, workflows, and best practices.

  • Regularly review and optimize data workflows, incorporating new tools and industry best practices.

Qualifications

  • Bachelor's or Master's degree in Computer Science, Data Engineering, Information Technology, or a related field.

  • 7 years of hands-on experience in data engineering, with in-depth expertise in Snowflake and Python.

  • Proven track record of designing scalable data pipelines, automation, and analytics solutions.

  • Certifications in Snowflake, cloud platforms, or data engineering are advantageous.

  • Strong communication skills for collaborating with technical and business teams.

Professional Competencies

  • Critical thinking and analytical skills for designing efficient and secure data solutions.

  • Leadership and mentorship capabilities to guide junior team members.

  • Excellent stakeholder management and communication skills.

  • Adaptability to emerging technologies, cloud services, and evolving business requirements.

  • Ownership mindset with a focus on delivering scalable, maintainable, and compliant data systems.

  • Effective time management to prioritize tasks and meet project deadlines.

SYNECHRON'S DIVERSITY & INCLUSION STATEMENT

Diversity and inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity and Inclusion (DEI) initiative, Same Difference, is committed to fostering an inclusive culture, promoting equality and diversity, and maintaining an environment that is respectful to all. As a global company, we strongly believe that a diverse workforce helps build stronger, more successful businesses. We encourage applicants from diverse backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, and disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.


All employment decisions at Synechron are based on business needs, job requirements, and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disability or veteran status, or any other characteristic protected by law.

Required Experience:

IC


About Company


At Synechron, we believe in the power of digital to transform businesses for the better. Our global consulting firm combines creativity and innovative technology to deliver leading digital solutions. Progressive technologies and strategies ...
