To design, code, verify, test, document, amend and secure data pipelines and data stores according to agreed architecture, solution designs, standards, policies and governance requirements. To monitor and report on own progress and proactively identify issues related to data engineering activities. To collaborate in reviews of work with others where appropriate.
Qualifications:
- Field of Study: Computer Science or similar
Experience Required:
- 5-7 years' experience in building databases, warehouses, reporting and data integration solutions.
- Experience with AWS / Azure.
- Experience building and optimising big data pipelines, architectures and data sets.
- Experience creating and integrating APIs (e.g. in C# or Python).
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- 5-7 years' experience in database programming languages, including SQL, PL/SQL and/or appropriate data tooling.
- Experience with data pipeline tooling (SSIS).
- 5-7 years' understanding of data pipelining and performance optimisation, data principles, and how data fits into an organisation, including customer, product and transactional information. Knowledge of integration patterns, styles, protocols and systems theory.
Additional Information :
Behavioural Competencies:
- Adopting Practical Approaches
- Articulating Information
- Documenting Facts
- Examining Information
- Interpreting Data
- Managing Tasks
- Producing Output
Technical Competencies:
- Big Data Frameworks and Tools
- Data Engineering
- Data Integrity
- IT Knowledge
- Stakeholder Management (IT)
Remote Work :
No
Employment Type :
Full-time