Urgent requirement: Databricks Developer - Contract - Sydney/Melbourne
Requirements
- Design and development of ELT pipelines using Azure Data Factory and Databricks
- Optimisation and maintenance of data workflows, ensuring quality and integrity
- Performance tuning and monitoring
- Data Modelling / Power BI semantic modelling
- Notebook development using Python, SQL, and/or PySpark
- Implementation of data engineering best practices, including governance and security
Good to have:
- Extensive experience building data pipelines in a Databricks Lakehouse environment
- Well-versed with Spark and other big data technologies
- Excellent coding skills using SQL and Python
- Strong background in developing solutions in an Azure Cloud environment, including strong ADF development skills
- Data Modelling skills
Responsibilities / expectations of the role:
- Provide on-call troubleshooting
- Provide support for escalated cases
- Mentor new joiners to the team and take them through the support process
- Identify problem areas and provide relevant solutions
- Prepare and contribute to building the team's knowledge base
Duration: 6 months, with possible extension
Eligibility: Australian/NZ citizens or PR holders only
Required Skills:
Databricks Developer - Contract - Sydney/Melbourne
Required Education:
BE