Data Engineer
Job Summary
We are seeking a Data Engineer, based in our Poznań location, to join the Data Science & Engineering team.
Ready to help build a better future for generations to come?
In an ever-changing world, we owe it to ourselves and to future generations to live life responsibly. At ROCKWOOL, we work with dedication to enrich modern living through our innovative stone wool solutions.
Join us and make a meaningful difference!
Your future team:
You will join our Data Science & Engineering Team, a group of 14 skilled professionals including the Team Leader. The team combines strong expertise in data engineering, analytics, and machine learning, and is structured into several project-focused sub-teams working across a variety of business areas.
What we are building:
Our data platform is already in the cloud, already on Databricks. But we're not here to maintain the status quo: we're rebuilding it from the ground up to jump into the exciting world of real-time data and streaming.
We will migrate from a batch-oriented Airflow + Databricks setup to a streaming-first architecture: Kafka + Databricks with new features like Declarative Pipelines, Unity Catalog, Apache Iceberg / Polaris Catalog, and a new serving layer which you will help us select.
This is a greenfield build inside a global company: real budget, real data, real stakes. No startup chaos, but real room to make meaningful architectural decisions.
What you will be doing:
You'll be the go-to Databricks expert on the team. You'll play an important role in the migration from the legacy stack while designing and building the new platform in parallel, and "in parallel" is doing real work in that sentence.
The legacy platform runs. Not beautifully, but it runs, and it serves real business needs that can't wait for the migration to finish. You'll split your time between keeping it stable (and gradually less painful) and building its replacement. If the idea of legacy firefighting makes you want to close this tab, this probably isn't the right role. If you see it as part of the job and take quiet satisfaction in fixing things that are broken, read on.
You will work on:
- Design & build the new streaming platform (Kafka + Databricks with Declarative Pipelines)
- Migrate existing batch workflows from Airflow + Docker on-prem + Databricks to a cloud-native architecture
- Keep the current platform stable while improving its reliability, performance, and operability
- Architect the serving layer
- Govern data properly: Unity Catalog, lineage, access control, data quality, not as an afterthought
- Enable sharing across the organization with Polaris and Iceberg
- Collaborate with data scientists, ML engineers, and business teams across regions
- Use AI tools daily: we use GitHub Copilot and home-made assistants/agents we build ourselves within the team; we expect you to help the team get real value from them
You will thrive here if you:
- Know Databricks deeply (Unity Catalog, Delta Live Tables / Declarative Pipelines, Workflows, Bundles), not just having used it on a project
- Have streaming experience: Kafka, event-driven architectures, late-data handling, exactly-once semantics
- Have worked in consulting or client-facing roles: you can communicate with business stakeholders and stay focused on outcomes
- Write production code: PySpark, Python, SQL, CI/CD (e.g. GitHub Actions), IaC (e.g. Terraform)
- Are comfortable with imperfect systems: the legacy stack has rough edges; you'll sand them down while building something better
- Don't need the work to be glamorous: some weeks it's streaming architecture, some weeks it's debugging a broken Airflow DAG
- Are genuinely curious about AI tooling: Copilot, LLMs, and agents are part of your workflow
What you bring:
- 3-5 years of experience in data engineering (the specific tech stack is flexible - we value your way of thinking and problem-solving above tools)
- Strong consulting mindset or experience working closely with business stakeholders, with the ability to ask the right questions, challenge assumptions, and translate business needs into technical solutions
- End-to-end ownership approach - from designing and building solutions to monitoring, improving, and maintaining them
- Excellent communication and collaboration skills, enabling you to work effectively across teams and influence decisions
- Degree in Computer Science/Engineering or equivalent hands-on experience
- Experience with Databricks, Spark, and streaming technologies (e.g. Kafka) - a strong plus
- Proficiency in English at a minimum B2 level, spoken and written
Tech you will touch:
- New stack:
- Kafka
- Databricks Declarative Pipelines
- Unity Catalog
- Apache Iceberg
- Polaris Catalog
- incremental materialized views
- AI developer tooling
- Legacy stack:
- Apache Airflow
- Docker
- on-prem compute/servers
- existing Databricks jobs and batch ETL
What we offer:
By joining our team, you become a part of the people-centric work environment of a Danish company. We offer a competitive salary, a permanent contract after the probation period, a development package, team-building events, and an activity-based office in Poznań's city center, in the new prestigious Nowy Rynek office building. The building is recognized as a "building without barriers", which means it is fully adapted to the needs of people with disabilities.
Our compensation package on employment contracts includes:
- An office-first approach: home office is available up to 2 days per week
- Adaptable Hours: start your workday anytime between 7:00 AM and 9:00 AM
- Home office subsidy
- Private Medical Care
- Multikafeteria MyBenefit
- Wellbeing program
- Extra Day Off for voluntary activities
While in the office, you can also enjoy a modern office space with a beautiful view and high-standard furniture, bicycle parking facilities & showers, chill-out rooms with a PlayStation, football table, pool table, and board games, and a subsidized canteen with delicious food & fruit.
Interested?
If you recognize yourself in this profile and challenge, we kindly invite you to apply with a CV written in English. If you want to include a short note on how you would approach a Databricks-based streaming migration, we would love to read it too.
Who we are:
We are the world leader in stone wool solutions. Founded in 1937 in Denmark, we transform volcanic rock into safe, sustainable products that help people and communities thrive. We are a global company with more than 12,200 employees located in 40 countries, with 51 manufacturing facilities, all focused on one common purpose: to release the natural power of stone to enrich modern living.
Sustainability is central to our business strategy. ROCKWOOL was one of the first companies to commit to actively contributing to the United Nations Sustainable Development Goals (SDGs) framework and is actively committed to 11 SDGs, including SDG 14, Life Below Water. Through our partnership with the One Ocean Foundation, and in connection with our sponsorship of the ROCKWOOL Denmark SailGP team, we will help raise awareness around ocean health challenges in an effort to accelerate solutions to protect it.
Diverse and Inclusive Culture:
We want all our people to feel valued, respected, included, and heard. We employ 79 different nationalities worldwide and are committed to providing equal opportunities to all employees, promoting diversity, and working against all forms of discrimination among ROCKWOOL employees.
At ROCKWOOL, you will experience a friendly team environment. Our culture is very important to us; in fact, we refer to it as The ROCKWOOL Way. This is the foundation on which we operate and is based upon our values of ambition, responsibility, integrity, and efficiency.
Required Experience:
IC