We're seeking a skilled Python Engineer to support the migration of legacy ETL pipelines built in Pentaho Data Integration (PDI/Spoon) to modern cloud-based solutions such as Azure Data Factory (ADF). This role involves translating transformation logic, validating data integrity, and collaborating with cross-functional teams to ensure a smooth transition.
Note: The experience listed below is mandatory.
Required Skills
Strong proficiency in Python for data manipulation and automation (a brief illustrative sketch follows this list).
Hands-on experience with Pentaho Data Integration (PDI).
5–7 years of experience in data engineering or ETL development.
Solid understanding of ETL/ELT concepts, data warehousing, and data modeling.
Experience with SQL (joins, aggregations, subqueries).
Familiarity with Azure Data Factory, cloud storage (Blob, Data Lake), and DevOps tools.
Version control using Git or Azure DevOps.
Basic scripting in PowerShell or Shell is a plus.
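As a rough illustration of the Python data-manipulation work this role involves (the data, column names, and steps below are hypothetical, not part of the requirements), a simple PDI "Filter rows" step followed by a "Group by" step might translate to pandas like this:

    import pandas as pd

    # Illustrative data standing in for a source extract; in practice this
    # would come from pd.read_csv or a database query.
    orders = pd.DataFrame({
        "order_id": [1, 2, 3, 4],
        "customer_id": ["A", "A", "B", "B"],
        "status": ["COMPLETED", "CANCELLED", "COMPLETED", "COMPLETED"],
        "amount": [100.0, 50.0, 250.5, 75.0],
    })

    # Equivalent of a PDI "Filter rows" step: keep completed orders only.
    completed = orders[orders["status"] == "COMPLETED"]

    # Equivalent of a PDI "Group by" step: revenue and order count per customer.
    summary = (
        completed.groupby("customer_id", as_index=False)
        .agg(total_revenue=("amount", "sum"), order_count=("order_id", "count"))
    )
    print(summary)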
Key Responsibilities
Analyze and document existing Pentaho ETL jobs, transformations, and data flows.
Translate Pentaho logic into Python scripts and/or ADF pipeline components.
Develop and maintain scalable Python-based data processing solutions.
Validate data accuracy post-migration using automated testing and SQL queries (see the sketch after this list).
Collaborate with data engineers, architects, and QA teams to troubleshoot issues.
Create technical documentation and participate in knowledge transfer sessions.
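As a minimal sketch of post-migration validation (the table name, values, and in-memory SQLite stand-ins below are illustrative only; real checks would run against the legacy warehouse and the migrated target), row counts and column sums can be reconciled between source and target like this:

    import sqlite3

    # Stand-in databases: in a real project these would be connections to
    # the legacy warehouse (loaded by Pentaho) and the migrated target
    # (loaded by ADF or Python). SQLite keeps the sketch self-contained.
    def make_db(rows):
        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
        conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)
        return conn

    legacy = make_db([(1, 100.0), (2, 250.5), (3, 75.0)])
    migrated = make_db([(1, 100.0), (2, 250.5), (3, 75.0)])

    # Reconciliation queries: row counts and column sums should match
    # between source and target after migration.
    CHECKS = {
        "row_count": "SELECT COUNT(*) FROM orders",
        "amount_sum": "SELECT SUM(amount) FROM orders",
    }

    for name, query in CHECKS.items():
        source_value = legacy.execute(query).fetchone()[0]
        target_value = migrated.execute(query).fetchone()[0]
        status = "OK" if source_value == target_value else "MISMATCH"
        print(f"{name}: source={source_value} target={target_value} [{status}]")

Checks like these are easy to extend (per-column checksums, null counts, min/max dates) and to wire into an automated test suite.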
Qualifications
Bachelor's degree in Computer Science, Engineering, or a related field.
Prior experience in migration projects is highly desirable.
Soft Skills
Detail-oriented with a meticulous approach to data validation.
Strong communication and documentation abilities.
Collaborative mindset with a proactive attitude.
Full-time