Join a cutting-edge initiative focused on predictive maintenance for jet engines. You'll be part of a dynamic team developing scalable data solutions that power real-time analytics and decision-making in aviation technology.
Key Responsibilities:
- Reporting and Visualisation (Power BI)
- Designing and developing interactive reports and dashboards in Power BI, transforming raw data into understandable business insights.
- Using Power Query (M) for advanced data transformation in the Power BI model.
- Creating advanced calculations, measures, and key performance indicators (KPIs) using DAX.
- Data Engineering (Databricks, ETL/ELT)
- Developing and implementing ETL processes that transform data from the data warehouse into a form that Power BI reports can consume (medallion architecture).
- Creating additional tables for reporting purposes.
- Developing and optimising SQL queries and stored procedures to feed BI reports.
- Building and maintaining data pipelines (Databricks Jobs) using Databricks and PySpark to process large volumes of data in real time or in batches.
- Managing data structures in the Data Lake, ensuring quality, consistency, and storage optimisation.
- Designing and implementing efficient ETL/ELT processes.
Qualifications:
- Analytics and visualisation in Power BI, Desktop and Service (reports, dashboards, DAX, Power Query/M).
- Databricks data engineering (PySpark/Scala/SQL, notebooks, Data Lake), with practical experience in building Databricks jobs.
- Experience in writing Azure DevOps pipelines.
- Design and implementation of ETL/ELT processes (knowledge of patterns and optimisation).
- SQL (including Spark SQL and T-SQL).
- Data modelling (dimension/fact tables, star/snowflake schemas).
- Creating measures and calculated columns (DAX).
- Python programming, especially in the context of data engineering (e.g. Pandas, PySpark).
Additional Information:
Hybrid work: 2 days per week at the office in Warsaw, Katowice, Poznań, Lublin, Rzeszów, or Łódź.
Remote Work:
No
Employment Type:
Full-time