Description:
Position Overview:
We are hiring a highly skilled and proactive ETL Developer to join our team in supporting enterprise-wide data modernization and ongoing data operations. This role focuses on designing, developing, and maintaining scalable ETL pipelines using Azure Data Factory (ADF) to enable continuous data integration, transformation, and delivery across environments. You will collaborate with technical and business stakeholders to ensure data flows are secure, performant, and audit-ready, supporting both strategic initiatives and day-to-day operations.
Key Responsibilities:
- Design, build, and maintain ETL pipelines using Azure Data Factory (ADF) to support secure and efficient Oracle-to-Azure data migration.
- Lead efforts in data cleansing, transformation, and field mapping to ensure high-quality, audit-ready data.
- Deploy and validate ETL processes across multiple environments (Dev, Test, UAT, Prod).
- Troubleshoot and optimize ETL jobs, implement rollback strategies, and ensure performance at scale.
- Document processes, data models, exception handling logic, and support runbooks for ongoing operations.
- Collaborate with internal stakeholders, including data analysts, solution architects, and business users.
Key Deliverables:
- Production-ready ETL pipelines that meet enterprise architecture and security standards
- Support in the documentation of transformation logic, field mappings, and exception rules in collaboration with business and data teams
- Successfully validated data migrations with audit trail capabilities and performance metrics
- Tested and deployable ETL scripts and configuration artifacts to support production readiness
Requirements
Experience and Skill Set Requirements:
Must Have:
- Experience designing, building, and maintaining ETL pipelines using Azure Data Factory (ADF) to support secure and efficient data migration between different databases.
- Experience in data cleansing, transformation, and field mapping to ensure high-quality, audit-ready data.
- Proven hands-on experience with Azure Data Factory (ADF) including pipeline orchestration and transformation flows.
- Experience with CI/CD deployment automation for ETL artifacts
Skill Set Requirements:
Core Qualifications:
- 10+ years of experience in ETL development and data migration, preferably in complex or regulated environments.
- Strong expertise with Oracle databases and Azure SQL.
- Proven hands-on experience with Azure Data Factory (ADF) including pipeline orchestration and transformation flows.
- In-depth understanding of data quality, auditability, schema mapping, and rollback planning.
Technical Proficiency:
- Advanced SQL (PL/SQL, T-SQL), ADF Data Flows, Triggers
- Tools: PowerDesigner, Excel, Azure DevOps, Git
- Familiarity with transformation formats such as Parquet and knowledge of incremental loads and CDC strategies
- Experience with CI/CD deployment automation for ETL artifacts
Nice to Have:
- Experience working on public sector or regulated industry data projects
- Knowledge of data governance privacy and retention policies
- Exposure to other ETL or orchestration platforms (AWS Glue, Informatica, GCP Dataflow)