MUST be located in Chicago (hybrid; no relocation)
ETL Developer
Long Term Contract
Project Overview:
This is to support the new work forthcoming in Risk:
> Migration of legacy data workloads to Snowflake.
> Development to support source system replacements.
> Murex Risk docket.
Looking for an experienced Snowflake developer responsible for enhancing and supporting data ingest/egress integrations with Risk & Compliance applications.
Contractor's Role:
Experience Level:
Senior resource: 10 years of experience
Skills/Qualifications (must haves):
Expertise in design/development with Snowflake and Python.
Expertise in data analysis/analytics.
Expertise in SQL.
Experience with Airflow.
Strong in PL/SQL and UNIX shell scripting.
Experience working with XML transformation and consumption of messages from queues will be a plus.
Hands-on work to improve operational stability via automation (e.g., auto-healing where possible); raise design-change proposals to data analyst SMEs and/or architects, and improvement opportunities to application/service managers.
Work occasional off-hours as demanded by project operations and stakeholders, such as release testing and/or critical bug resolution.
Nice to have:
Experience with ETL tools such as DataStage.
Experience with Control-M.
Tasks and Responsibilities:
Migrate staging and output layers from legacy platforms (e.g., Oracle, SQL Server) to Snowflake-based schemas.
Design and implement scalable data models in Snowflake, ensuring alignment with business logic and reporting needs.
Build and maintain Airflow DAGs to automate data movement between Snowflake schemas.
Use orchestration tools such as Airflow, Azure DevOps, and Control-M to manage deployments and scheduling.
Ingest data from diverse sources, such as flat files (CSV, TXT, XML, JSON) and relational databases (Oracle, SQL Server).
Transform raw data into structured formats based on business requirements, ensuring consistency and accuracy across layers.
Clean and restructure unformatted or semi-structured files to make them compatible with Snowflake ingestion pipelines.
Implement error detection and handling logic, including creation of error tables to capture rejected records (e.g., duplicates, schema mismatches).
Monitor pipeline health and troubleshoot data quality issues proactively.
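As a rough illustration of the error-handling responsibility above, here is a minimal sketch in plain Python. The field names and the `validate_records` helper are hypothetical (not from this posting); it shows splitting incoming records into clean rows plus an error-table list that records the rejection reason (duplicate or schema mismatch):

```python
# Sketch: route rejected records into an "error table" list with a reason.
# Hypothetical schema: each record must carry exactly these fields.
REQUIRED_FIELDS = {"trade_id", "book", "notional"}

def validate_records(records):
    """Split records into (clean, errors); each error notes its rejection reason."""
    clean, errors, seen_ids = [], [], set()
    for rec in records:
        if set(rec) != REQUIRED_FIELDS:
            errors.append({"record": rec, "reason": "schema mismatch"})
        elif rec["trade_id"] in seen_ids:
            errors.append({"record": rec, "reason": "duplicate"})
        else:
            seen_ids.add(rec["trade_id"])
            clean.append(rec)
    return clean, errors

if __name__ == "__main__":
    rows = [
        {"trade_id": 1, "book": "RATES", "notional": 1e6},
        {"trade_id": 1, "book": "RATES", "notional": 1e6},  # duplicate
        {"trade_id": 2, "book": "FX"},                      # missing "notional"
    ]
    good, bad = validate_records(rows)
    print(len(good), [e["reason"] for e in bad])  # 1 ['duplicate', 'schema mismatch']
```

In a production pipeline, the `errors` list would land in a dedicated error table so rejected records can be reviewed and reprocessed rather than silently dropped.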