As a Data Engineer you will be responsible for:
- Building and maintaining processes for collecting data from various sources into the Data Lake.
- Designing, developing, and optimizing complex data pipelines to ensure reliable data flow.
- Creating, developing, and maintaining frameworks that facilitate the construction of data pipelines.
- Implementing comprehensive testing frameworks for data pipelines.
- Collaborating with analysts and data scientists to ensure the delivery of high-quality data.
- Ensuring robust data management, security, and compliance practices.
- Researching and implementing new technologies to improve data pipeline performance.
- Leveraging and integrating data from various types of source systems, including Kafka, MQ, SFTP, databases, APIs, and file shares.
Qualifications:
- Proficiency in cloud platforms and services, particularly GCP.
- Bachelor's or Master's degree in Data Science, Statistics, Computer Science, Economics, or a related field.
- Minimum 3 years of experience as a Data Analyst or Data Quality Analyst in a data-driven organization.
- Proven experience in data quality management and data governance practices.
- Strong database expertise including advanced SQL and PL/SQL.
- Proficiency in Python; experience with Scala is a plus.
- Experience with Linux and Bash scripting.
- Solid understanding of the Cloudera Hadoop technology stack (Apache Spark, Apache Kafka).
- Knowledge of data handling principles, including ETL processes and real-time data processing.
Nice to have:
- Experience with CI/CD pipelines and automation tools.
- Strong understanding of data governance principles, including metadata management and data quality frameworks.
- Ability to work with and understand various source system types (Kafka, MQ, SFTP, databases, APIs, file shares).
Additional Information:
- Hybrid work: 3 days a week from the office in Warsaw.
- We hereby inform you that Inetum Polska sp. z o.o. has implemented an internal reporting (whistleblowing) procedure. The content of the procedure and the possibility to submit an internal report are available at:
Remote Work: No
Employment Type: Full-time