To develop and maintain a complete data architecture across several application platforms, providing capability across those platforms. To design, build, operationalise, secure, and monitor data pipelines and data stores in line with applicable architecture solution designs, standards, policies, and governance requirements, thereby making data accessible for evaluation and optimisation for downstream use-case consumption. To execute data engineering duties according to standards, frameworks, and roadmaps.
Qualifications :
Qualification:
- Degree in STEM / Informatics / Information Technology
Experience Required
- 5-7 years' experience building databases, warehouses, reporting, and data integration solutions. Experience building and optimising big-data pipelines, architectures, and data sets.
- 5-7 years' experience creating and integrating APIs. Experience performing root-cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- 8-10 years' deep understanding of data pipelining and performance optimisation, data principles, and how data fits into an organisation, including customer, product, and transactional information. Knowledge of integration patterns, styles, protocols, and systems theory.
- 8-10 years' experience in database programming languages, including SQL, PL/SQL, Spark, and/or appropriate data tooling. Experience with data pipeline and workflow management tools.
Additional Information :
Behavioural Competencies:
- Adopting Practical Approaches
- Articulating Information
- Checking Things
- Developing Expertise
- Documenting Facts
- Embracing Change
- Examining Information
- Interpreting Data
- Managing Tasks
- Producing Output
- Taking Action
- Team Working
Technical Competencies:
- Big Data Frameworks and Tools
- Data Engineering
- Data Integrity
- Data Quality
- IT Knowledge
- Stakeholder Management (IT)
Remote Work :
No
Employment Type :
Full-time