There will be a 1-hour coding test on the Snowflake schema.
Onsite from day 1.
------
Role: AWS Data Engineer
Location: Denver, CO & Charlotte, NC - 100% Onsite
6-12 Month Contract - W2/C2C
Job Description:
We are seeking a highly skilled Data Engineer to support the migration and enhancement of data workflows from Tidal to Apache Airflow, along with optimizing data processes on Snowflake and AWS. The ideal candidate will have strong experience in shell scripting, Python, and SQL, and in working in modern data warehouse environments.
Responsibilities:
- Analyze and understand existing Tidal job dependencies, schedules, and scripts.
- Develop shell scripts and Python-based workflows to replicate and enhance current Tidal processes in Airflow.
- Write and optimize SQL queries for Snowflake-based data transformations and validations.
- Collaborate with cross-functional teams to ensure data pipelines are robust, scalable, and well-documented.
- Work with AWS S3 for data ingestion and storage processes.
- Ensure high reliability and performance of data pipelines post-migration.
- Provide support during testing and post-migration validation.
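As a rough illustration of the first two responsibilities above, the sketch below derives a valid execution order from a Tidal-style job dependency map, the kind of ordering that would then be expressed as Airflow task dependencies. The job names and the dependency map are hypothetical; in a real migration they would be extracted from the Tidal job definitions.

```python
from graphlib import TopologicalSorter

# Hypothetical Tidal job dependency map: job -> set of upstream jobs it waits on.
# These names are illustrative only, not taken from any real Tidal export.
tidal_deps = {
    "load_raw_s3": set(),
    "stage_snowflake": {"load_raw_s3"},
    "transform_core": {"stage_snowflake"},
    "validate_counts": {"transform_core"},
    "publish_marts": {"transform_core", "validate_counts"},
}

def execution_order(deps):
    """Return a run order for the jobs that honors every dependency."""
    return list(TopologicalSorter(deps).static_order())

order = execution_order(tidal_deps)
print(order)
```

In Airflow, the same edges would typically become `upstream >> downstream` task relationships inside a DAG, so validating the ordering up front helps confirm the migrated pipeline preserves the original schedule semantics.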
Required Skills:
- Shell scripting (strong experience required)
- Python (strong experience required)
- SQL (strong experience required, preferably with Snowflake)
- AWS S3 (hands-on experience)
- Solid Data Warehouse background and understanding of ETL/ELT best practices
Preferred Skills (Nice to Have):
- Experience with Apache Airflow
- Prior experience with scheduling tool migrations especially Tidal
- Familiarity with CI/CD practices and DevOps for data pipelines
If interested in applying, please share your resume to ;