Position: Data Engineer (Azure Databricks)
Location: Seattle, WA ***Onsite from Day 1***
Duration: 1 Year
Role Description
- Design and implement data pipelines in Azure Databricks for ingesting and transforming data from upstream systems, including SAP, PIPS, and POS, into WFM and UKG.
- Optimize ETL/ELT workflows for performance and scalability.
- Collaborate with Java/API developers to integrate event-driven triggers into data pipelines.
- Implement data quality checks, schema validation, and error handling (see the sketch after this listing).
- Support batch and near real-time data flows for both operational and analytics use cases.
- Work with Boomi and WFM teams to ensure data contracts and canonical models are enforced.

Essential Skills
- As listed in the Role Description above.

Desirable Skills: Not specified.

Skills: Digital: Databricks

Experience Required: 8-10 years
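For candidates unfamiliar with the pattern this role describes, the following is a minimal illustrative sketch, not part of the posting: a Databricks/PySpark ingestion step that enforces an explicit schema, applies basic data quality rules, and quarantines rejected rows before loading a Delta table. All paths, table names, columns, and rules are hypothetical placeholders; the real ones would come from the upstream data contracts (SAP, PIPS, POS) and the WFM/UKG canonical models.

```python
# Illustrative sketch only: hypothetical paths, schema, columns, and table names.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.getOrCreate()  # provided automatically in Databricks notebooks

# Explicit schema so malformed upstream records surface during validation
# instead of being silently inferred with the wrong types.
pos_schema = StructType([
    StructField("store_id", StringType(), nullable=False),
    StructField("employee_id", StringType(), nullable=False),
    StructField("transaction_ts", TimestampType(), nullable=False),
    StructField("amount", DoubleType(), nullable=True),
])

# Hypothetical landing path for POS extracts.
raw = (spark.read
       .schema(pos_schema)
       .option("mode", "PERMISSIVE")
       .json("/mnt/landing/pos/"))

# Basic data quality rules: required keys present and amounts non-negative.
is_valid = (F.col("store_id").isNotNull()
            & F.col("employee_id").isNotNull()
            & F.col("transaction_ts").isNotNull()
            & (F.col("amount") >= 0))

valid_rows = raw.filter(is_valid)
rejected_rows = raw.filter(~is_valid)

# Good rows feed the curated Delta table consumed by downstream loads (e.g., WFM/UKG);
# rejects are quarantined for review rather than silently dropped.
valid_rows.write.format("delta").mode("append").saveAsTable("curated.pos_transactions")
rejected_rows.write.format("delta").mode("append").saveAsTable("quarantine.pos_transactions_rejected")
```

The same pattern extends to the near real-time flows mentioned above by swapping the batch read for Structured Streaming; the schema enforcement, quality rules, and quarantine step stay the same.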