Data Engineer: Inhouse Logistics Innovation (Contract) | Gauteng (Hybrid) | ISB8700228

ISanqa Resourcing

Job Location:

Midrand - South Africa

Monthly Salary: Not Disclosed
Posted on: 12 hours ago
Vacancies: 1

Job Summary

Our client is seeking an expert data engineer who transforms intralogistics operations through robust data pipelines and analytics solutions that power autonomous transport systems and robotics innovation!

Design and optimize enterprise data pipelines, warehouses, and analytics dashboards using Azure Data Factory, Databricks, and Power BI to drive efficiency in automated inhouse logistics.

The ideal candidate is a results-driven data engineering professional with 5 years of Azure data experience who combines strong SQL and dimensional modeling expertise with business acumen, excels at translating requirements into actionable insights through interactive visualizations, and brings a passion for data governance and ETL optimization in production-critical environments.

Senior data engineering role with Azure data services and ETL pipelines

Hybrid and remote working flexibility with 1960 flexible annual hours

Logistics innovation role with data warehousing and analytics

POSITION: Contract: 02 January 2026 to 31 December 2028

EXPERIENCE: 6-8 years related experience

COMMENCEMENT: 02 January 2026

LOCATION: Hybrid: Midrand/Menlyn/Rosslyn/Home Office rotation

TEAM: Industrialization Innovations Inhouse Logistics (ILOG)

Qualifications / Experience

  • Bachelor's degree in Computer Science, Data Science, Statistics, Business Analytics, or a related field (or equivalent practical experience)
  • Minimum of 5 years' hands-on experience in data analysis, data engineering, or analytics roles, with significant Azure exposure
  • Demonstrable experience delivering production-ready data solutions, dashboards, and documentation in an enterprise environment

Essential Skills Requirements

  • Strong experience with Azure data services (Azure Data Factory, Azure SQL Database, Azure Data Lake Storage, Azure Data Explorer/Kusto)
  • Proficiency in SQL for data extraction, transformation, and analysis
  • Experience building ETL/ELT pipelines and data integration workflows in Azure Data Factory pipelines or Databricks Workflows
  • Skilled in dimensional modelling and designing data models (e.g. star/snowflake schemas)
  • Experience with data visualisation tools (Tableau/Power BI/Celonis) and building interactive dashboards and reports
  • Knowledge of data warehousing concepts and best practices, including partitioning and indexing strategies
  • Solid analytical skills with experience in translating business requirements into technical data solutions
  • Familiarity with data governance, data quality practices, and metadata management
  • Proficiency in scripting/programming for data tasks (Python, PySpark); an illustrative sketch follows this list
  • Strong communication skills to present insights to technical and non-technical stakeholders
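
For flavour only, and not part of the client's specification: a minimal PySpark sketch of the kind of dimensional-modelling transform these bullets describe, assuming hypothetical storage paths, table names, and columns.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # Minimal sketch: read raw intralogistics events and build a star-schema
    # fact table. All paths, tables, and columns are hypothetical placeholders.
    spark = SparkSession.builder.appName("ilog-etl-sketch").getOrCreate()

    raw = spark.read.parquet("abfss://raw@example.dfs.core.windows.net/movements/")

    # Derive a surrogate date key and aggregate to the grain of the fact table.
    fact_movements = (
        raw
        .withColumn("date_key", F.date_format("event_ts", "yyyyMMdd").cast("int"))
        .groupBy("date_key", "vehicle_id", "route_id")
        .agg(
            F.count("*").alias("movement_count"),
            F.sum("duration_s").alias("total_duration_s"),
        )
    )

    # Persist as a Delta table that downstream Power BI reports can query.
    fact_movements.write.format("delta").mode("overwrite").saveAsTable("ilog.fact_movements")

On Databricks a transform like this would typically run as a scheduled Workflows task, with the output table registered for dashboard consumption.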

Advantageous Skills Requirements

  • Familiarity with Databricks on Azure and Spark-based processing
  • Knowledge of CI/CD for data pipelines (Azure DevOps, Git integration)
  • Experience with data cataloguing tools (e.g. Unity Catalog) and lineage tracking
  • Understanding of security and access control in Azure (role-based access, managed identities)
  • Experience with real-time/streaming data solutions (Event Hubs, Stream Analytics, Kafka)
  • Exposure to data validation and testing frameworks (e.g. Great Expectations); a validation sketch follows this list
  • Prior experience working in Agile teams and using tools like Jira and Confluence
  • Azure certification(s) such as Azure Data Engineer Associate or Azure Data Scientist Associate
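
Also for illustration only: a small sketch of the data-validation idea above using Great Expectations' classic pandas-wrapper API (the dataset and expectations are invented; newer releases of the library expose a different API).

    import great_expectations as ge
    import pandas as pd

    # Hypothetical extract of a movements dataset; values are invented.
    df = pd.DataFrame({
        "order_id": [101, 102, 103],
        "qty": [5, 0, 12],
    })

    # Wrap the frame so expectation methods become available (classic GE API).
    gdf = ge.from_pandas(df)
    gdf.expect_column_values_to_not_be_null("order_id")
    gdf.expect_column_values_to_be_between("qty", min_value=0, max_value=10_000)

    # Run every registered expectation and check the overall outcome.
    result = gdf.validate()
    print(result["success"])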

Role Requirements

  • Design, develop, and maintain robust data pipelines and ETL/ELT processes using Azure Data Factory, Databricks pipelines, and related services
  • Create and optimise data models and data warehousing solutions to support reporting and analytics needs
  • Build high-quality interactive reports and dashboards; translate business requirements into insightful visualisations
  • Work closely with business stakeholders to gather requirements, define KPIs, and deliver actionable analytics
  • Implement and enforce data governance, data quality checks, and best practices across datasets and pipelines
  • Develop SQL scripts, stored procedures, and Python/PySpark code for data transformation and analysis
  • Collaborate with data engineers, data scientists, and platform teams to integrate analytical solutions into the wider data platform
  • Monitor and tune the performance of queries, data loads, and data storage to ensure cost-efficient operations
  • Document data models, pipeline designs, data dictionaries, and runbooks for handover and operational support
  • Support data ingestion from diverse sources, including APIs, databases, and streaming platforms; a streaming sketch follows this list
  • Contribute to automation and CI/CD practices for data solution deployments using Git
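
A hedged sketch of the streaming-ingestion duty above: Spark Structured Streaming reading a Kafka-compatible endpoint (Azure Event Hubs exposes one). Broker, topic, and paths are placeholders, and authentication options are omitted for brevity.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("ilog-stream-sketch").getOrCreate()

    # Subscribe to a hypothetical telemetry topic; SASL auth config omitted.
    stream = (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "example-ns.servicebus.windows.net:9093")
        .option("subscribe", "agv-telemetry")
        .option("startingOffsets", "latest")
        .load()
    )

    # Kafka delivers raw bytes; cast the payload to string for downstream parsing.
    decoded = stream.select(F.col("value").cast("string").alias("payload"))

    # Land payloads in Delta with a checkpoint for fault-tolerant recovery.
    query = (
        decoded.writeStream
        .format("delta")
        .option("checkpointLocation", "/tmp/checkpoints/agv-telemetry")
        .start("/tmp/delta/agv_telemetry_raw")
    )
    # query.awaitTermination()  # block here when running as a standalone script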

NB:

South African citizens/residents preferred. Valid work permit holders will be considered. By applying, you consent to being added to our database and to receiving updates until you unsubscribe. If you do not receive a response within two weeks, please consider your application unsuccessful.

#isanqa #DataScientist #Senior #Azure #DataEngineering #ETL #PowerBI #Databricks #DataWarehouse #Analytics #Logistics #ITHub #NowHiring #fuelledbypassionintegrityexcellence

iSanqa is your trusted Level 2 BEE recruitment partner, dedicated to continuous improvement in delivering exceptional service. Specializing in seamless placements for permanent staff, temporary resources, and efficient contract management and billing facilitation, iSanqa Resourcing is powered by a team of professionals with an outstanding track record. With over 100 years of combined experience, we are committed to evolving our practices to ensure ongoing excellence.

