Local Candidates Only.
Key Responsibilities
- Design and implement ETL processes using Azure Data Factory Data Flows and Mapping Data Flows.
- Integrate data from various sources, including SQL Server, Blob Storage, APIs, and third-party systems.
- Develop and maintain data pipelines and data warehouses in Azure.
- Monitor and troubleshoot ETL jobs to ensure data quality and performance.
- Collaborate with analysts and business stakeholders to understand data requirements.
- Implement data governance, security, and compliance best practices.
- Automate data workflows and optimize performance for large-scale data processing.
| Qualification | Required / Desired |
| --- | --- |
| Bachelor's degree in Computer Science, Information Systems, or a related field. | Required |
| 3 years of experience in ETL development, with at least 1 year using Azure Data Factory. | Required |
| Proficiency in SQL, Azure Synapse, Azure Data Lake, and Azure Functions (minimum of 5 years of experience). | Required |
| Experience with CI/CD pipelines, Git, and DevOps practices (minimum of 5 years of experience). | Required |
| Strong understanding of data modeling, data warehousing, and cloud architecture (minimum of 5 years of experience). | Required |
| Microsoft Azure certifications (e.g., DP-203, AZ-900). | Required |
| Experience with Power BI, Databricks, or Python for data transformation. | Required |
| Familiarity with Agile methodologies and tools like JIRA or Azure Boards (minimum of 5 years of experience). | Required |