Senior AWS Data Engineer – AI Industrialization & MLOps (Contract) | Gauteng | Hybrid | ISB1501462

ISanqa Resourcing


Job Location:

Pretoria - South Africa

Monthly Salary: Not Disclosed
Posted on: 8 hours ago
Vacancies: 1 Vacancy

Job Summary

Our client is seeking a Data Engineer (Senior).

This role is pivotal in providing cutting-edge IT solutions to global operations while ensuring the seamless integration and management of the Data Science and Engineering applications within the Group IT Hub South Africa.

Senior-level role

AWS Data Services

AI Industrialization focus

Position Details:

  • Contract Start Date:

  • Contract End Date:

  • Location: Midrand/Menlyn/Rosslyn/Home Office rotation

  • Role Group: DevOps

  • Nationality: South African citizens / residents are preferred. Applicants with valid work permits will also be considered.

  • Experience: 6-8 years related experience

Product / Team Context: Spearhead the industrialization of new AI technologies and concepts by supporting the business with the implementation of AI pilot use cases.

Qualifications & Experience:

  • Bachelor's degree in Data Science, Computer Science, Software Engineering, or equivalent relevant hands-on experience

  • Minimum 4 years' hands-on experience in data science and/or data engineering roles, including production deployments

  • Demonstrated experience working with AWS data services and building scalable data platforms and production ML solutions

Essential Skills:

  • Strong knowledge of Data Science fundamentals, including statistics, machine learning, and data modeling

  • Proficiency in Python, SQL, and data processing libraries (e.g. Pandas, PySpark)

  • Hands-on experience with modern data orchestration and storage technologies (e.g. Apache Airflow, Snowflake, AWS S3) – see the sketch after this list

  • Familiarity with DevOps/MLOps principles and tools (e.g. Git, Jenkins, Docker, Kubernetes)

  • Understanding of cloud computing concepts and experience with cloud providers (AWS preferred)

  • Experience with data visualization tools (e.g. Tableau, Power BI)

  • Excellent analytical and problem-solving skills

  • Effective communication and collaboration skills
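
For illustration only (not part of the client's specification): a minimal sketch of the orchestration-plus-processing combination named in the list above, assuming Airflow 2.x and a pandas task reading from and writing to S3. The bucket names, DAG name, and cleaning step are hypothetical placeholders.

```python
# Minimal, illustrative Airflow 2.x DAG: a daily job that pulls a CSV from S3
# with pandas and writes a cleaned copy back. Bucket names, keys, and the
# cleaning logic are hypothetical, not taken from the job specification.
from datetime import datetime

import pandas as pd
from airflow import DAG
from airflow.operators.python import PythonOperator


def clean_orders() -> None:
    # pandas can read/write s3:// paths directly when s3fs is installed
    df = pd.read_csv("s3://example-raw-bucket/orders/latest.csv")   # hypothetical bucket
    df = df.dropna(subset=["order_id"]).drop_duplicates("order_id")
    df.to_csv("s3://example-curated-bucket/orders/latest.csv", index=False)


with DAG(
    dag_id="orders_daily_clean",          # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="clean_orders", python_callable=clean_orders)
```

In practice the processing step would typically be split into several dependent tasks so that scheduling, retries, and data-quality checks can be managed per step.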

Advantageous Skills:

  • Experience with streaming technologies and real-time data processing (e.g. Kafka, Kinesis)

  • Familiarity with containerisation and orchestration (Docker, Kubernetes) and cloud deployment patterns

  • Experience with BI tools and data preparation for visualization platforms (e.g. Tableau)

  • Knowledge of MLOps practices: model versioning, CI/CD for models, monitoring, and model lifecycle management (see the sketch after this list)

  • Familiarity with infrastructure-as-code and DevOps tooling (Terraform, CloudFormation, GitOps)

  • Experience with advanced data governance, security practices, and compliance in cloud environments

  • Experience with AI productivity and assistive tools, while retaining a strong ability to validate and optimise AI outputs

  • Prior exposure to Extreme Programming (XP) practices within Agile teams (pair programming, test-first)

  • Experience with scripting (Bash/Shell, PowerShell) for automation and operational tasks

  • Experience in technical data modelling and schema design (not drag-and-drop approaches)

  • Coaching and training fellow colleagues and users when required

  • Problem-solving capabilities

  • Strong presentation skills

  • Any additional responsibilities assigned in the Agile Working Model (AWM) Charter
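
For illustration only: a minimal sketch of the model-versioning idea behind the MLOps bullet above, using MLflow with scikit-learn as one common tooling choice (the posting does not prescribe a specific tracker). The experiment name, data, and parameters are hypothetical.

```python
# Minimal model-versioning sketch: each run logs its parameters, metrics, and
# the trained model as a versioned artifact. Tool choice (MLflow) and the
# experiment name are assumptions for illustration only.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

mlflow.set_experiment("ai-pilot-churn")          # hypothetical experiment name
with mlflow.start_run():
    model = LogisticRegression(max_iter=200).fit(X_train, y_train)
    mlflow.log_param("max_iter", 200)
    mlflow.log_metric("accuracy", accuracy_score(y_test, model.predict(X_test)))
    mlflow.sklearn.log_model(model, "model")     # versioned model artifact per run
```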

Key Responsibilities:

  • Design, build, and maintain scalable and reliable data pipelines and architectures to support the industrialization of AI pilots

  • Implement and optimize machine learning models and algorithms within production-ready data systems

  • Collaborate with data scientists, researchers, and software engineers to transition AI prototypes into robust, scalable solutions

  • Conduct data exploration, preprocessing, and feature engineering to enhance model performance and data quality

  • Monitor and troubleshoot data pipeline performance, ensuring high availability and data integrity

  • Leverage AWS data services (S3, Glue, Lambda, RDS, VPC, IAM, etc.) to implement robust cloud-native data solutions (see the sketch after this list)

  • Orchestrate data workflows, schedule jobs, and manage dependencies to ensure timely, reliable data delivery
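
For illustration only: a short boto3 sketch of the kind of cloud-native glue work described above, starting an existing AWS Glue job and polling its state. The job name and region are hypothetical, and error handling and retries are omitted for brevity.

```python
# Illustrative boto3 snippet: trigger an existing Glue job and wait for a
# terminal state. Job name and region are hypothetical placeholders.
import time

import boto3

glue = boto3.client("glue", region_name="eu-west-1")   # hypothetical region

run_id = glue.start_job_run(JobName="curate-orders")["JobRunId"]  # hypothetical job name
while True:
    state = glue.get_job_run(JobName="curate-orders", RunId=run_id)["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        break
    time.sleep(30)

print(f"Glue run {run_id} finished with state {state}")
```

In a production pipeline this kind of call would normally sit inside an orchestrator task (for example the Airflow sketch earlier) rather than a standalone polling loop.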

NB: Please note that only South African citizens or applicants with valid work permits can be considered. If you do not hear from us within two weeks, please consider your application unsuccessful.

#isanqa #isanqaresourcing #fuelledbypassionintegrityexcellence #DataEngineer #AWS #Python #MLOps #DataScience #AI #TheGroup #TechCareersSA

iSanqa is your trusted Level 2 BEE recruitment partner, dedicated to continuous improvement in delivering exceptional service. Specializing in seamless placements for permanent staff, temporary resources, and efficient contract management and billing facilitation, iSanqa Resourcing is powered by a team of professionals with an outstanding track record. With over 100 years of combined experience, we are committed to evolving our practices to ensure ongoing excellence.

