Urgent requirement of Sr. Data Engineer - Contract - Sydney
Requirements
- Build and optimise Fabric Lakehouse tables, Warehouses, Pipelines, and Dataflows Gen2.
- Develop reusable ETL/ELT frameworks using PySpark, SQL, and Fabric notebooks.
- Manage CI/CD workflows via Azure DevOps and Git.
- Apply best practices for performance tuning, cost optimisation, and data quality.
- Expertise in Delta Lake, Fabric Data Engineering, ADLS, and PySpark/SQL.
- Strong understanding of data modelling and integration with Power BI datasets.
- Excellent communication skills.
Duration: 6 months, with possible extension
Eligibility: Australian/NZ citizens and PR holders only