Responsible for the design and implementation of reusable, optimized end-to-end data pipelines, partnering with internal technical and business stakeholders. You will work closely with the business teams to gather functional requirements, develop data models and design proposals, and implement and test solutions.
Tasks & duties:
- Design and develop robust data pipelines from various source systems, including SAP ERP and cloud-based SaaS platforms, into our Snowflake data warehouse
- Translate business requirements into technical solutions, understanding the full business context and how data flows across systems
- Build transformation logic in dbt, creating modular, testable, and well-documented models that support analytics and reporting (see the sketch after this list)
- Model data using best practices (star/snowflake schema), ensuring clarity, consistency, and performance across the BI landscape
- Collaborate with BI analysts and business stakeholders to ensure data models reflect real-world processes and support decision-making
- Extend and optimize our data integration framework, focusing on scalability, reusability, and automation
- Debug, optimize, and refactor existing pipelines, ensuring reliability and performance
- Implement automated testing and data quality checks to maintain trust in our data assets
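For illustration only, here is a minimal sketch of the kind of modular, tested dbt model the duties above describe. All names (stg_sap_sales_orders, fct_sales_orders, the columns) are hypothetical and not taken from this posting; they simply show the pattern of referencing a staging model and shaping it for reporting.

-- models/marts/fct_sales_orders.sql (hypothetical model and column names)
{{ config(materialized='table') }}

with orders as (

    -- ref() declares the dependency on an upstream staging model,
    -- so dbt can order the build and document the lineage
    select * from {{ ref('stg_sap_sales_orders') }}

),

final as (

    select
        order_id,
        customer_id,
        order_date,
        sum(net_amount) as total_net_amount  -- one row per order
    from orders
    group by order_id, customer_id, order_date

)

select * from final

The automated data quality checks mentioned above would typically be declared as dbt generic tests (for example, unique and not_null on order_id) in the model's schema.yml and run with dbt test.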
Qualifications:
- Solid experience with data warehouse and lake architecture using Snowflake and Azure Data Lake Storage
- Strong command of data modeling techniques (e.g. Inmon, Kimball, Data Vault)
- Deep expertise in dbt, SQL, and Snowflake (or similar cloud data platforms)
- Strong understanding of data architecture, ETL/ELT design, and data warehouse principles
- Ability to grasp complex business processes and translate them into clean, scalable data models
- Data Ingestion & Processing (Azure Data Factory Functions / Batch Ingestion Stream Analytics etc)
- Experience with DevOps (CI/CD, version control, test automation, workflow orchestration)
- Proactive mindset, strong problem-solving skills, and the ability to work independently and lead technical initiatives
Additional Information:
medmix is an equal opportunity employer committed to the strength of a diverse workforce.
93% of our employees would go above and beyond to deliver results. Do you have the drive to succeed? Join us and boost your career starting today!
Remote Work:
No
Employment Type:
Full-time