Data Engineer/Scientist (Advanced) - Gauteng/Hybrid - ISB1501210

Job Location

Midrand - South Africa

Monthly Salary

Not Disclosed


Vacancy

1 Vacancy

Job Description

Are you a Data Engineer passionate about cloud-based solutions, big data, and designing resilient data pipelines?

Join a globally integrated team shaping the future of mobility through connected vehicle technology.

Be part of building smart data systems with a high-performing, agile IT powerhouse.

  • Global-scale data initiatives

  • Flexible hybrid work model

  • Modern cloud-native tech stack

  • Agile & collaborative engineering culture

POSITION: Contract until December 2027

EXPERIENCE: 4-6 years' related working experience

COMMENCEMENT: 01 August 2025

Qualifications / Experience Required
  • Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field

  • Minimum of 3 years' experience as a Data Engineer

  • At least 2 years of experience working with AWS services

  • Proven experience in building and maintaining data pipelines for large-scale datasets

  • Agile working experience

Essential Skills Requirements
  • Proficiency in Python 3.x for data processing and automation

  • Experience with AWS Glue for ETL processes

  • Strong knowledge of AWS Athena for querying large datasets (a short Python/Athena query sketch follows this list)

  • Hands-on experience with AWS Lambda for serverless computing

  • Familiarity with AWS EC2 for scalable computing resources

  • Expertise in AWS CloudWatch for monitoring and logging

  • Proficiency in working with PostgreSQL RDS for database management

  • Experience with AWS QuickSight for data visualization and reporting

  • Strong understanding of data ingestion pipelines, particularly for Call Detail Records (CDRs)

  • Proficiency in Git and GitHub for version control and collaboration, including experience with GitHub CI/CD pipelines

  • Any additional responsibilities assigned in the Agile Working Model (AWM) Charter
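
For illustration only, not part of the role specification: a minimal Python sketch of the Athena querying skills listed above, using boto3 against a hypothetical CDR table ("cdr_daily"), database ("cdr_analytics"), and results bucket.

```python
# Minimal sketch: run an Athena query over a hypothetical CDR table with boto3
# and poll for completion. Database, table, and bucket names are placeholders.
import time

import boto3

athena = boto3.client("athena")


def run_query(sql: str, output_s3: str) -> str:
    """Start an Athena query and return its execution id."""
    resp = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": "cdr_analytics"},  # hypothetical database
        ResultConfiguration={"OutputLocation": output_s3},
    )
    return resp["QueryExecutionId"]


def wait_for(execution_id: str) -> None:
    """Poll until the query finishes, raising if it did not succeed."""
    while True:
        state = athena.get_query_execution(QueryExecutionId=execution_id)[
            "QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            if state != "SUCCEEDED":
                raise RuntimeError(f"Athena query ended in state {state}")
            return
        time.sleep(2)


if __name__ == "__main__":
    qid = run_query(
        "SELECT msisdn, COUNT(*) AS events FROM cdr_daily "
        "WHERE event_date = DATE '2025-08-01' GROUP BY msisdn LIMIT 10",
        "s3://example-athena-results/",  # placeholder results bucket
    )
    wait_for(qid)
    print(athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"])
```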

Advantageous Skills Requirements
  • Extensive experience with Terraform for infrastructure as code

  • Experience with other AWS services such as S3, Kinesis, and DynamoDB

  • Knowledge of data formats such as Parquet, Avro, JSON, and CSV (a short CSV-to-Parquet sketch follows this list)

  • Experience with Docker for containerization

  • Understanding of Big Data technologies and frameworks

  • Familiarity with Agile working models and tools like JIRA and Confluence

  • Experience with data quality tools such as Great Expectations

  • Knowledge of REST API development and integration

  • Strong analytical skills for troubleshooting and optimizing data pipelines

  • Experience in developing technical documentation and artefacts
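
For illustration only: a minimal sketch of the CSV-to-Parquet conversion referenced in the data formats bullet above, assuming pandas with the pyarrow engine is available; the column names are placeholders.

```python
# Minimal sketch: normalise a CSV batch of CDRs and rewrite it as Parquet.
# Column names are placeholders; assumes pandas and pyarrow are installed.
import pandas as pd


def csv_to_parquet(csv_path: str, parquet_path: str) -> None:
    df = pd.read_csv(
        csv_path,
        dtype={"msisdn": "string", "cell_id": "string"},  # hypothetical columns
        parse_dates=["event_time"],
    )
    # Drop obviously broken records before they reach downstream analytics.
    df = df.dropna(subset=["msisdn", "event_time"])
    df.to_parquet(parquet_path, engine="pyarrow", index=False)


if __name__ == "__main__":
    csv_to_parquet("cdr_batch.csv", "cdr_batch.parquet")
```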

Role & Responsibilities
  • Design, develop, and maintain data ingestion pipelines for the Group's connected fleet, focusing on Call Detail Records (CDRs)

  • Utilize AWS Glue for ETL processes to transform and load data into the analytics platform

  • Implement efficient querying solutions using AWS Athena

  • Develop serverless applications and workflows using AWS Lambda (a minimal handler sketch follows this list)

  • Monitor and optimize data pipelines using AWS CloudWatch

  • Manage and maintain PostgreSQL RDS databases for data storage and retrieval

  • Create interactive dashboards and reports using AWS QuickSight

  • Leverage Terraform extensively to define, deploy, and manage AWS infrastructure as code

  • Use Git and GitHub for version control and collaboration

  • Implement and manage GitHub CI/CD pipelines to automate testing, deployment, and delivery processes

  • Ensure data security and compliance with the Group's information classification requirements

  • Stay updated with the latest data engineering tools, technologies, and industry trends

  • Identify opportunities for process improvements and automation

  • Collaborate with cross-functional teams to understand data requirements and deliver solutions

  • Develop and maintain technical documentation for data engineering processes and solutions
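
For illustration only: a minimal AWS Lambda handler sketch tying together the Lambda, Glue, and CDR ingestion responsibilities above. The Glue job name is a placeholder, and the event shape assumed is the standard S3 put-notification payload.

```python
# Minimal sketch: a Lambda handler that starts a Glue ETL job whenever a new
# CDR file lands in S3. The Glue job name ("cdr-ingest-etl") is a placeholder.
import boto3

glue = boto3.client("glue")


def handler(event, context):
    started = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        resp = glue.start_job_run(
            JobName="cdr-ingest-etl",
            Arguments={"--source_path": f"s3://{bucket}/{key}"},
        )
        started.append(resp["JobRunId"])
    return {"job_run_ids": started}
```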

PLEASE NOTE:

By applying for this role, you consent to be added to the iSanqa database and to receive updates until you unsubscribe.
Also note that if you have not received a response from us within 2 weeks, your application was unsuccessful.
Candidates MUST be based in Gauteng or WILLING TO RELOCATE!

#isanqa #isanqaresourcing #DataEngineer #AWSJobs #Python #DataPipelines #ETL #BigData #TechCareersSA #NowHiring #GroupCareers #CloudEngineering #FuelledByPassionIntegrityExcellence

iSanqa is your trusted Level 2 BEE recruitment partner, dedicated to continuous improvement in delivering exceptional service. Specializing in seamless placements for permanent staff, temporary resources, and efficient contract management and billing facilitation, iSanqa Resourcing is powered by a team of professionals with an outstanding track record. With over 100 years of combined experience, we are committed to evolving our practices to ensure ongoing excellence.

Employment Type

Full Time
