Data Engineer
Job Location

Krakow - Poland

Monthly Salary

Not Disclosed


Vacancy

1 Vacancy

Job Description

At ABB, we help industries outrun - leaner and cleaner. Here, progress is an expectation - for you, your team, and the world. As a global market leader, we'll give you what you need to make it happen. It won't always be easy; growing takes grit. But at ABB, you'll never run alone. Run what runs the world.

This position reports to:

R&D Team Manager


ABB's Service Division partners with our customers to improve the availability, reliability, predictability, and sustainability of electrical products and installations. The Division's extensive service portfolio offers product care, modernization, and advisory services to improve performance, extend equipment lifetime, and deliver new levels of operational and sustainable efficiency. We help customers keep resources in use for as long as possible, extracting the maximum value from them and then recovering and regenerating products and materials at the end of their useful life.

We are seeking a skilled and detail-oriented Data Engineer to design and implement robust data infrastructure solutions that enable advanced analytics and AI-driven insights for industrial asset management. This role involves building scalable data pipelines using Microsoft Fabric to consolidate, transform, and model data from multiple heterogeneous sources. The primary objective is to provide reliable, efficient, and scalable access to high-quality data that supports predictive maintenance analytics, risk assessment models, and strategic decision-making. You will be responsible for creating the data foundation that empowers data scientists and analysts to deliver actionable insights for optimizing maintenance strategies and enhancing operational efficiency.

The work model for the role is: hybrid #LI-hybrid

You will be mainly accountable for:

  • Design, develop, and maintain ETL/ELT pipelines in Microsoft Fabric for ingesting and transforming data from various sources, including REST APIs, SQL Server, MuleSoft middleware, Snowflake, and file data sources (JSON, CSV, Excel, etc.).

  • Implement and manage dataflows, data pipelines, and Lakehouse models in Fabric to support advanced analytics and AI model development.

  • Develop and optimize data processing logic using PySpark within Microsoft Fabric notebooks for complex transformations and large-scale data processing tasks.

  • Build and maintain domain-driven data models that support analytics, reporting, self-service BI, and machine learning workflows.

  • Ensure data quality, integrity, and security across the entire data lifecycle, implementing robust data governance practices.

  • Collaborate with data scientists, analysts, software architects, and business stakeholders to understand requirements and deliver fit-for-purpose data solutions.

  • Monitor and troubleshoot pipeline performance, apply best practices in data architecture and performance optimization, and implement improvements as needed.

  • Document data processes, models, and technical decisions to ensure knowledge transfer and maintainability.

Qualifications for the role:

  • Advanced degree in Computer Science, Engineering, Data Science, or a related field (Master's preferred).

  • Proven experience (preferably 3 years) as a Data Engineer, with demonstrated expertise in building production-grade data pipelines and hands-on experience with Microsoft Fabric (Data Factory, Lakehouse, Dataflows).

  • Strong knowledge of ETL/ELT concepts and data pipeline design, with experience integrating data from diverse sources including APIs, databases (SQL Server), Snowflake, MuleSoft, and semi-structured formats.

  • Proficiency in SQL and Python, with experience in data processing frameworks and modern software development practices (Git, CI/CD, automated testing).

  • Familiarity with data modeling, data warehousing, and domain-driven design, and experience with cloud platforms, ideally Azure.

  • Knowledge of data governance principles, Power BI semantic modeling, Delta Lake, or Synapse Analytics (preferred).

  • Experience with industrial data sources, time-series data, and IoT data streams.

We value people from different backgrounds. Could this be your story? Apply today or visit to read more about us and learn about the impact of our solutions across the globe.

Employment Type

Full-Time
