Python & Snowflake Data Engineer | Data Pipelines, Cloud Data Platform, SQL, Automation

Synechron


Job Location: Mumbai, India

Monthly Salary: Not Disclosed
Posted on: 30+ days ago
Vacancies: 1

Job Summary

Synechron is seeking a skilled Python and Snowflake Developer to join our data engineering team. The successful candidate will design, develop, and maintain scalable data solutions leveraging Python and Snowflake to support organizational analytics, reporting, and data management initiatives. This role plays a vital part in transforming data into actionable insights, ensuring data accuracy, and optimizing data workflows to meet evolving business needs. The ideal candidate will possess strong technical skills, problem-solving abilities, and a collaborative mindset to drive strategic data projects forward.

Software Requirements

Required Skills:

  • Proficiency in Python development (experience with data processing, scripting, and automation)

  • Hands-on experience with the Snowflake data platform, including data modeling, query optimization, and data loading techniques

  • Familiarity with SQL for data manipulation and querying within Snowflake

  • Knowledge of cloud platforms, especially experience working with Snowflake in a cloud environment (Azure, AWS, or GCP)

  • Experience with data integration tools and APIs (REST/SOAP) is a plus

Preferred Skills:

  • Experience with data pipeline orchestration tools like Apache Airflow or similar

  • Working knowledge of additional programming languages such as Java or Scala is advantageous

  • Familiarity with version control systems like Git

  • Exposure to containerization and deployment tools (Docker, Kubernetes)

Overall Responsibilities

  • Design, develop, and implement scalable data pipelines and transformation workflows using Python and Snowflake

  • Collaborate with business stakeholders, data analysts, and data scientists to translate requirements into effective technical solutions

  • Optimize Snowflake data models and queries for performance, reliability, and cost-efficiency

  • Develop and maintain automated data loading and extraction processes from external systems

  • Perform testing, validation, and troubleshooting of data workflows and reports

  • Contribute to documentation of data architecture, processes, and best practices

  • Stay current on emerging data platform features, Python libraries, and cloud data engineering trends to continuously improve solutions
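As a hedged sketch of the pipeline and transformation work described above — the table name, key column, and sample data are hypothetical, not part of this role's actual codebase — a typical Python step might validate incoming records and generate an idempotent Snowflake MERGE statement for loading:

```python
import csv
import io

def clean_rows(raw_csv: str) -> list[dict]:
    """Parse CSV text and drop rows missing an id -- a minimal validation pass."""
    reader = csv.DictReader(io.StringIO(raw_csv))
    return [row for row in reader if row.get("id")]

def build_merge_sql(table: str, key: str, columns: list[str]) -> str:
    """Build an idempotent Snowflake MERGE from a staging table into a target.

    MERGE (upsert) keeps reloads safe to repeat; names here are illustrative
    and a real pipeline would validate identifiers before interpolating them.
    """
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in columns if c != key)
    cols = ", ".join(columns)
    src_cols = ", ".join(f"s.{c}" for c in columns)
    return (
        f"MERGE INTO {table} t USING {table}_STAGE s ON t.{key} = s.{key} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({cols}) VALUES ({src_cols})"
    )

raw = "id,amount\n1,10\n,5\n2,7\n"   # second data row lacks an id and is dropped
rows = clean_rows(raw)
sql = build_merge_sql("SALES", "id", ["id", "amount"])
```

Generating a MERGE rather than plain INSERTs is one common way to make a nightly reload re-runnable without creating duplicates.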

Technical Skills (By Category)

Programming Languages:

  • Required: Python (for data processing, automation, and scripting)

  • Preferred: SQL (especially for Snowflake query optimization); Java/Scala (for broader data pipeline development)

Databases/Data Management:

  • Snowflake (data warehousing, modeling, query optimization)

  • Familiarity with relational databases and data lakes

Cloud Technologies:

  • Snowflake cloud platform; knowledge of integration with cloud providers such as AWS, Azure, or GCP

Frameworks and Libraries:

  • Python libraries such as pandas, NumPy, pyarrow, and snowflake-connector-python

  • Data pipeline frameworks such as Apache Airflow or Luigi (preferred)
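The snowflake-connector-python library listed above implements the standard Python DB-API 2.0 (PEP 249) interface, so the loading pattern can be sketched against any DB-API driver. The example below uses the stdlib sqlite3 module purely as a stand-in so it runs anywhere; the table and rows are invented for illustration:

```python
import sqlite3  # stand-in: snowflake.connector exposes the same DB-API 2.0 shape

# With Snowflake this would instead be something like:
#   conn = snowflake.connector.connect(account=..., user=..., password=..., warehouse=...)
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE events (id INTEGER, payload TEXT)")

# Parameter binding with executemany -- the same call exists on a Snowflake
# cursor (note Snowflake's default placeholder is %s rather than sqlite's ?).
rows = [(1, "a"), (2, "b"), (3, "c")]
cur.executemany("INSERT INTO events (id, payload) VALUES (?, ?)", rows)

cur.execute("SELECT COUNT(*) FROM events")
count = cur.fetchone()[0]
conn.close()
```

For bulk volumes, Snowflake pipelines more often stage files (e.g. Parquet via pyarrow) and issue COPY INTO rather than row-by-row inserts; the DB-API pattern shown here fits small control-flow and metadata queries.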

Development Tools and Methodologies:

  • Python IDEs (e.g., VS Code, PyCharm)

  • Version control systems (e.g., Git)

  • Agile methodologies and SDLC applied to data projects

Security Protocols:

  • Knowledge of data security, access controls, and compliance considerations within cloud data platforms

Experience Requirements

  • 3 years of experience developing data pipelines and solutions using Python and Snowflake

  • Proven track record of designing scalable data models and ETL/ELT workflows

  • Experience integrating data from external systems and IoT/streaming sources is beneficial

  • Familiarity with cloud platform operations, billing, and cost management within Snowflake

  • Experience working with cross-functional teams in Agile environments

Alternative Experience Pathways:
Candidates with extensive hands-on experience in data engineering, even without formal certifications, will be considered if they demonstrate relevant skills and project success.

Day-to-Day Activities

  • Develop and maintain automated data workflows using Python scripts and Snowflake features

  • Collaborate on requirements gathering and design data models to support analytics and reporting needs

  • Optimize existing queries, data pipelines, and workflows for performance and cost-efficiency

  • Monitor data pipeline health, troubleshoot issues, and implement improvements

  • Coordinate with data analysts, data scientists, and business units to ensure data availability and quality

  • Document data architecture, workflows, and best practices for team reference

  • Explore new tools, libraries, and platform features to enhance data processing capabilities

Qualifications

  • Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field

  • Relevant professional experience or certifications in Python development and Snowflake are preferred

  • Continued learning via online courses, certifications, or workshops (e.g., SnowPro certification) is encouraged

Professional Competencies

  • Strong analytical and critical-thinking skills for designing effective data solutions

  • Excellent communication abilities to collaborate with technical and non-technical stakeholders

  • Ability to work effectively in a team environment and participate in Agile practices

  • Adaptability to evolving data technologies and project requirements

  • A proactive approach to problem-solving and process improvement

  • Effective time management to prioritize workload and meet deadlines

SYNECHRON'S DIVERSITY & INCLUSION STATEMENT

Diversity and inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity and Inclusion (DEI) initiative, Same Difference, is committed to fostering an inclusive culture, promoting equality and diversity, and maintaining an environment that is respectful to all. We strongly believe that, as a global company, a diverse workforce helps build stronger, more successful businesses. We encourage applicants from diverse backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, and disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.


All employment decisions at Synechron are based on business needs, job requirements, and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disability or veteran status, or any other characteristic protected by law.


Key Skills

  • Apache Hive
  • S3
  • Hadoop
  • Redshift
  • Spark
  • AWS
  • Apache Pig
  • NoSQL
  • Big Data
  • Data Warehouse
  • Kafka
  • Scala

About Company


At Synechron, we believe in the power of digital to transform businesses for the better. Our global consulting firm combines creativity and innovative technology to deliver leading digital solutions. Progressive technologies and strategie ...
