Job Summary:
We are seeking an experienced ETL Developer to design, build, and maintain data pipelines using Python and Snowflake, partnering with data architects, analysts, and business stakeholders to deliver reliable, secure, and well-governed data solutions.
Key Responsibilities:
- Design, develop, and maintain robust ETL pipelines using Python and Snowflake.
- Collaborate with data architects, analysts, and business stakeholders to understand data requirements and translate them into technical solutions.
- Optimize data workflows for performance, scalability, and reliability.
- Implement data quality checks, error handling, and logging mechanisms.
- Develop and maintain data models, schemas, and documentation.
- Monitor and troubleshoot ETL jobs and data pipelines in production environments.
- Ensure data security and compliance with governance policies.
- Mentor junior developers and contribute to best practices in data engineering.
Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 5 years of experience in ETL development and data engineering.
- Strong proficiency in Python for data processing and automation.
- Hands-on experience with Snowflake, including data warehousing concepts, performance tuning, and SQL scripting.
- Experience with cloud platforms (e.g., AWS, Azure, GCP) and data orchestration tools (e.g., Airflow, dbt).
- Solid understanding of relational databases, data modeling, and normalization.
- Familiarity with version control systems (e.g., Git) and CI/CD pipelines.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration abilities.
Preferred Qualifications:
- Experience with real-time data streaming (e.g., Kafka, Spark).
- Knowledge of data governance, security, and compliance frameworks.
- Certifications in Snowflake, Python, or cloud platforms.