Key Responsibilities
- Build and maintain ETL/ELT pipelines using Azure Databricks and PySpark.
- Transform, clean, and structure data for analytics and reporting.
- Manage and optimize data storage in Azure Data Lake.
- Leverage Azure Event Hubs for real-time data ingestion.
- Collaborate with stakeholders to understand data requirements and implement solutions.
- Support Agile team workflows to ensure timely delivery of data products.
- Integrate CI/CD practices with tools such as Azure Pipelines or GitHub Actions.
- Automate deployment processes for data pipelines and solutions.
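To illustrate the transform/clean/structure work described above, here is a minimal sketch of a de-duplication and cleaning step, written in plain Python for clarity (in practice this would typically be a PySpark DataFrame transform; the field names `event_id` and `event_ts` are hypothetical, not from this posting):

```python
from datetime import datetime

def clean_events(records):
    """De-duplicate by event_id, drop rows missing a key, parse timestamps.

    A plain-Python stand-in for a PySpark transform; field names are
    illustrative assumptions, not taken from the posting.
    """
    seen = set()
    cleaned = []
    for rec in records:
        event_id = rec.get("event_id")
        if event_id is None or event_id in seen:
            continue  # drop null keys and duplicates
        seen.add(event_id)
        rec = dict(rec)  # copy so the input records stay unmodified
        rec["event_date"] = datetime.fromisoformat(rec["event_ts"]).date()
        cleaned.append(rec)
    return cleaned
```

The same logic maps naturally onto PySpark's `dropDuplicates`, `filter`, and `withColumn` operations when run at scale on Databricks.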
Required Skills and Qualifications:
- 2-3 years of experience in Azure Data Engineering.
- Proficiency in Azure Databricks and Azure Data Lake.
- Strong knowledge of Python and PySpark for data transformation.
- Familiarity with Azure Event Hubs and Azure Key Vault.
- Experience with CI/CD tools such as Azure Pipelines or GitHub Actions.
- Agile development experience is a plus.
- Strong problem-solving and analytical skills.
- Ability to work in a collaborative and dynamic team environment.
- Excellent communication skills for cross-functional coordination.