To develop and maintain complete data architecture across several application platforms, providing capability across those platforms. To design, build, operationalise, secure, and monitor data pipelines and data stores in line with applicable architecture and solution designs, standards, policies, and governance requirements, thereby making data accessible for evaluation, optimisation, and downstream use-case consumption. To execute data engineering duties according to standards, frameworks, and roadmaps.
Qualifications:
- Information Studies or Information Technology
- Basic cloud certification, e.g. DP-203, DP-700, or DP-900
Experience:
- Data modelling & warehousing experience.
- Data pipeline experience: Extract, Transform, and Load (ETL).
- Knowledge of and experience with the following tools: SSIS, SSDT, Power BI, MS SQL, Python.
- Understanding of data quality issues and reconciliation frameworks.
- Performance optimisation or code reverse engineering experience.
- Knowledge of setting up and using DevOps for deployments, or at least a solid understanding of it.
- Azure Cloud Experience.
- Azure data engineering tools (optional), e.g. ADF, Synapse Analytics Studio, or MS Fabric.
Additional Information:
Behavioral Competencies:
- Adopting Practical Approaches
- Articulating Information
- Checking Details
- Developing Expertise
- Documenting Facts
- Embracing Change
- Examining Information
- Interpreting Data
- Managing Tasks
- Producing Output
- Taking Action
- Team Working
Technical Competencies:
- Big Data Frameworks and Tools
- Data Engineering
- Data Integrity
- Data Quality
- IT Knowledge
- Stakeholder Management (IT)
Remote Work:
No
Employment Type:
Full-time