DataOps Engineer (m/f/x)
477739
Experienced
IT-Jobs: Data & Analytics
ALDI Data & Analytics Services GmbH
Mintarder Straße 36-40
45481 Mülheim an der Ruhr
Germany (DE)
Full-time or part-time
06/11/2025
permanent contract
partial remote working option - up to 60% anywhere in Germany
Platforms & Architecture
Laura Reetz
Senior Expert
Nordrhein-Westfalen (DE-NW)
At ALDI DX, we develop innovative digital products and services for our employees as well as our customers in 11 ALDI SÜD countries and over 7,300 ALDI SÜD stores worldwide. We drive digital value to offer great quality at the lowest price.
We will be guided along the way by the three core values of the ALDI SÜD Group: simplicity, reliability and responsibility. Our team and our performance are also at the heart of everything we do at ALDI DX.
Your Job
What you give your best for.
- Monitoring, restarting, analysing, fixing and improving existing data pipelines between source systems and the data lake, in both directions
- Communicating the impact of service degradations to the data lake user community and the internal service management team
- Handling incident and problem management for the team
- Observing, controlling and optimising the cluster configuration (e.g. setup, version, credentials) in collaboration with the cloud team
- Developing and maintaining squad-specific data architecture and pipelines that adhere to defined ETL and data lake principles
- Solving technical data problems that help the business area achieve its goals
- Proposing and contributing to education and improvement plans for IT operations capabilities, standards, tools and processes
Your Profile
What you should have.
- Background in computer science
- Three years of experience in an IT operations role working with solutions in distributed computing, big data and advanced analytics
- Expertise in SQL, data analysis and at least one programming language (e.g. Python)
- Understanding of database administration, ideally using Databricks/Spark and SQL Server, as well as knowledge of relational, NoSQL and cloud database technologies
- Proficiency in distributed computing and the underlying concepts, preferably Spark and MapReduce
- Familiarity with Microsoft Azure tools, e.g. Azure Data Factory, Azure Databricks and Azure Event Hubs
- Operational knowledge of ETL scheduling, reporting tools and data warehousing, as well as structured and unstructured data
- Familiarity with the Unix operating system, especially shell scripting
- Basic understanding of network-level problems and connectivity requirements
- Excellent communication skills and business fluency in English; knowledge of German is a plus
Your Benefits
How we value your work.
- Partial mobile working within Germany
- State-of-the-art technologies
- Attractive remuneration as well as holiday and Christmas bonuses
- Future-oriented training and development
- Modular onboarding and a buddy programme
- Health activities
Your Tech Stack
What you work with, among other things.
- Python
- PySpark
- ServiceNow
- M365
- Many more, depending on the job