Salary Not Disclosed
1 Vacancy
Job Description
We're seeking a skilled Python Engineer to support the migration of legacy ETL pipelines built in Pentaho Data Integration (PDI/Spoon) to modern cloud-based solutions such as Azure Data Factory (ADF). This role involves translating transformation logic, validating data integrity, and collaborating with cross-functional teams to ensure a smooth transition.

Key Responsibilities
- Analyze and document existing Pentaho ETL jobs, transformations, and data flows.
- Translate Pentaho logic into Python scripts and/or ADF pipeline components.
- Develop and maintain scalable Python-based data processing solutions.
- Validate data accuracy post-migration using automated testing and SQL queries.
- Collaborate with data engineers, architects, and QA teams to troubleshoot issues.
- Create technical documentation and participate in knowledge transfer sessions.

Required Skills
- Strong proficiency in Python for data manipulation and automation.
- Hands-on experience with Pentaho Data Integration (PDI).
- Solid understanding of ETL/ELT concepts, data warehousing, and data modeling.
- Experience with SQL (joins, aggregations, subqueries).
- Familiarity with Azure Data Factory, cloud storage (Blob, Data Lake), and DevOps tools.
- Version control using Git or Azure DevOps.
- Basic scripting in PowerShell or Shell is a plus.

Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5-7 years of experience in data engineering or ETL development.
- Prior experience in migration projects is highly desirable.

Soft Skills
- Detail-oriented with a meticulous approach to data validation.
- Strong communication and documentation abilities.
- Collaborative mindset with a proactive attitude.
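As a rough illustration of the post-migration validation work described above, the sketch below compares row counts and a simple key-column checksum between a source and a migrated table. It is a minimal example only: the table, column, and function names are hypothetical, and it uses in-memory SQLite purely for demonstration (a real project would run equivalent queries against the legacy and ADF-loaded targets).

```python
import sqlite3

def validate_migration(src_conn, dst_conn, table, key_column):
    """Compare basic reconciliation metrics between a source table and
    its migrated copy. Returns a dict of check name -> pass/fail."""
    checks = {}

    # Row-count parity: the cheapest first-pass sanity check.
    src_count = src_conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    dst_count = dst_conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    checks["row_count"] = src_count == dst_count

    # Order-independent checksum: sum of a numeric key column.
    # TOTAL() returns 0.0 for an empty table, so empty==empty still passes.
    src_sum = src_conn.execute(f"SELECT TOTAL({key_column}) FROM {table}").fetchone()[0]
    dst_sum = dst_conn.execute(f"SELECT TOTAL({key_column}) FROM {table}").fetchone()[0]
    checks["key_checksum"] = src_sum == dst_sum

    return checks

if __name__ == "__main__":
    # Simulate a legacy source and a migrated target with identical data.
    src = sqlite3.connect(":memory:")
    dst = sqlite3.connect(":memory:")
    for conn in (src, dst):
        conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
        conn.executemany("INSERT INTO orders VALUES (?, ?)",
                         [(1, 10.0), (2, 20.5), (3, 7.25)])
    print(validate_migration(src, dst, "orders", "id"))
```

In practice, checks like these would be wrapped in automated tests (e.g. pytest) and extended with per-column aggregates or hash comparisons where the data warrants it.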
Full-time