Overview
Roles and Responsibilities
- Build ETL and data pipelines using Azure Data Factory and Databricks to feed data into data products and dashboards
- Automate processes and workflows to drive efficiencies for the client
- Liaise with stakeholders on ad hoc analyses and monitor the entire DWH/Data Lake
- Work with stakeholders to gather requirements, provide efficient data solutions, and design the build
- Apply best practices to deliver efficient, high-quality results for data and visualization requirements
- Collaborate with and support the analytics team to help them understand the data flow
Desired Profile
- Design data pipelines with Azure services including Azure Data Factory and Databricks
- Strong expertise and experience in transforming data using PySpark scripts and SQL queries
- Experience creating data models/data objects in Azure Synapse/DWH/DB to support BI/client reporting
- Understanding of data architecture, data modeling, DWH, and ELT/ETL concepts
Good to have Skills
- Scripting languages such as Python
- Exposure to big data technologies
- Exposure to BI tools (e.g., Tableau, Qlik)
Basic Qualifications
BE in Computer Science/Information Technology
Qualifications:
MCA, B.E., or equivalent
Remote Work:
No
Employment Type:
Full-time