Responsibilities :
- Develop and maintain data pipelines using Python, Airflow (DAGs), and AWS/Snowflake components.
- Build and automate data ingestion, transformation, and scheduling workflows.
- Develop Airflow DAGs, including custom operators, sensors, and hooks, and manage pipeline monitoring (see the sketch after this list).
- Work on Snowflake-based ELT solutions, including data loads, stored procedures, and queries.
- Write efficient SQL queries and optimize performance for data transformations.
- Collaborate with cross-functional teams to understand requirements and deliver scalable data solutions.
- Troubleshoot pipeline failures and ensure high availability of production workflows.
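For context, here is a minimal sketch of the kind of Airflow DAG this role involves: a custom sensor, retry defaults, and a downstream Snowflake load step. All names (DAG id, bucket path, helper functions) are hypothetical; the sketch assumes Airflow 2.4+ and stubs out the S3/Snowflake calls, which a real pipeline would make through the Amazon and Snowflake provider packages.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.sensors.base import BaseSensorOperator


class S3FileArrivalSensor(BaseSensorOperator):
    """Hypothetical custom sensor: waits until a source file lands."""

    def __init__(self, s3_key: str, **kwargs):
        super().__init__(**kwargs)
        self.s3_key = s3_key

    def poke(self, context) -> bool:
        # A real implementation would check S3 via the Amazon provider's
        # S3Hook; stubbed here so the sketch stays self-contained.
        self.log.info("Checking for %s", self.s3_key)
        return True


def load_to_snowflake(**context):
    # Placeholder for a COPY INTO or stored-procedure call made through
    # the Snowflake provider's SnowflakeHook.
    print("Loading staged data into Snowflake")


with DAG(
    dag_id="example_ingest_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    wait_for_file = S3FileArrivalSensor(
        task_id="wait_for_source_file",
        s3_key="s3://example-bucket/raw/orders.csv",  # hypothetical path
        poke_interval=300,
        timeout=3600,
    )
    load = PythonOperator(
        task_id="load_to_snowflake",
        python_callable=load_to_snowflake,
    )

    wait_for_file >> load
```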
Qualifications :
- 5-8 years of experience in Python development (advanced scripting and automation).
- 3 years of experience with Apache Airflow (DAG design, orchestration, scheduling).
- Experience with Snowflake or any cloud data warehouse (Redshift / BigQuery / Databricks).
- Experience with AWS services (S3, Glue, Lambda, Athena) or equivalent cloud technologies.
- Strong hands-on experience with SQL (advanced querying, optimization).
- Experience with ETL/ELT data workflows, data validation, and data quality checks (illustrated after this list).
- Familiarity with Git, CI/CD, JIRA, or similar tools.
- Good communication skills and ability to work independently.
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
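As an illustration of the data-quality checks mentioned above, here is a small, hypothetical Python sketch that flags rows violating a not-null expectation; the field names and sample records are made up, and a production pipeline would typically run such checks against warehouse tables rather than in-memory dicts.

```python
from typing import Iterable


def check_not_null(rows: Iterable[dict], required_fields: list[str]) -> list[dict]:
    """Return the rows that violate a simple not-null expectation."""
    return [
        row for row in rows
        if any(row.get(field) is None for field in required_fields)
    ]


# Tiny usage example with made-up order records.
sample = [
    {"order_id": 1, "amount": 9.99},
    {"order_id": 2, "amount": None},  # should be flagged
]
violations = check_not_null(sample, ["order_id", "amount"])
assert violations == [{"order_id": 2, "amount": None}]
```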
Additional Information :
All your information will be kept confidential according to EEO guidelines.
Remote Work :
No
Employment Type :
Full-time