Workload: 100%
Join our dynamic team as a Data Engineer in Procurement, where you will play a crucial role in building and optimizing data pipelines. Your expertise will help us transform data into actionable insights, driving efficiency and innovation within our organization.
What you will do:
- Architect and build data ingestion pipelines from Microsoft Dynamics 365 Business Central, SAP systems, and manual sources.
- Translate business logic and data models into production-grade data pipelines and reusable transformation logic.
- Implement medallion-layered storage in Azure Data Lake Storage Gen2 utilizing Databricks and Delta Lake.
- Create robust frameworks for ETL/ELT, data quality, schema validation, and metadata management.
- Participate in data architecture discussions and conduct trade-off analyses on performance, cost, scalability, and maintainability.
- Support the development of analytical datasets and KPIs for tools like Power BI.
What you bring & who you are:
- 5 years of experience in data engineering and cloud-native architecture.
- Proven experience with the Azure ecosystem, including Data Lake, Data Factory, and Databricks.
- Strong understanding of data modeling, data integration, and ETL best practices.
- Experience handling unstructured and semi-structured data from multiple heterogeneous sources.
- Proven experience with:
  - Databricks using PySpark/SQL, including orchestration and optimization.
  - Delta Lake and medallion architecture design patterns.
- Excellent problem-solving skills, attention to detail, and the ability to work independently in a greenfield environment.
- Very good command of English; German is a strong plus.
- Familiarity with CI/CD and Databricks Asset Bundles for data pipelines is a plus.
About the team:
Our team is dedicated to leveraging data to drive strategic decisions and enhance operational efficiency. We foster a collaborative environment where innovation thrives and every team member's contribution is valued. Join us to make a significant impact in the procurement domain.