Looking for a Senior Data Engineer with strong experience in building and managing data pipelines using Snowflake and AWS. The role involves working with structured and unstructured data, ETL processes, and enabling downstream applications through clean and scalable data solutions.
MUST HAVE (NON-NEGOTIABLE):
Strong Python (10/10 level)
Hands-on experience with Snowflake (including UI)
Experience with Apache Airflow (workflow scheduling & pipelines)
Strong experience in AWS (S3, Lambda, API Gateway, etc.)
Experience in ETL processes (data extraction, transformation, loading)
Ability to handle structured & unstructured data
Experience designing data pipelines for downstream consumption
Key Responsibilities:
Implement advanced Snowflake features such as Streams, Tasks, Snowpipe, and Data Sharing
Lead migration from legacy data warehouses to Snowflake
Design and build data applications on Snowflake (Streamlit / Native Apps)
Develop and manage data pipelines using Python & Airflow
Work on AWS cloud architecture (S3, Lambda, RDS, API Gateway)
Extract data from various data stores and load it into AWS
Build scalable data models for reporting and analytics
IMPORTANT:
Candidate must be local to Chicago
Hospitality domain experience is a plus