We're looking for a Data Engineer who understands that behind every row of data is a human decision. You'll be responsible for building and maintaining the data infrastructure that fuels our lending models and powers our operations, all on AWS.
You'll also support the machine learning lifecycle, helping our data scientists deploy, monitor, and retrain models with the right data at the right time.
This isn't just a job. It's a chance to help build the financial backbone of one of Africa's most ambitious digital banks.
Key Responsibilities
Build & Maintain Pipelines
Develop and operate ETL/ELT pipelines using AWS Glue, Lambda, Athena, and Step Functions. Your work supports both reporting and real-time decision-making.
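To give a flavour of the work: a minimal sketch of the kind of record-level transform that might run inside a Glue job or Lambda handler. The function and field names here are illustrative assumptions, not our actual schema.

```python
from datetime import datetime, timezone

def transform_loan_record(raw: dict) -> dict:
    """Normalise one raw loan event into the shape a reporting table expects.

    Field names are hypothetical examples, not the real schema.
    """
    return {
        # Cast IDs to strings so downstream joins don't mix int/str keys.
        "loan_id": str(raw["loan_id"]),
        # Store money as integer minor units to avoid float rounding drift.
        "amount_minor_units": int(round(float(raw["amount"]) * 100)),
        # Default currency is an assumption for the sketch.
        "currency": raw.get("currency", "KES").upper(),
        # Record ingestion time in UTC for traceability.
        "ingested_at": datetime.now(timezone.utc).isoformat(),
    }
```

In a real pipeline this would be applied per-record in a Lambda, or vectorised in PySpark for Glue batch jobs.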
Curate a Robust Data Lakehouse
Structure and maintain our data lake with proper partitioning, schema evolution, metadata tagging, and access control, all across multiple jurisdictions.
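As one small example of what "proper partitioning" means in practice, here is a sketch of a helper that builds Hive-style S3 key prefixes so Athena and Glue can prune partitions. The dataset name and the country-first layout are assumptions for illustration.

```python
from datetime import date

def partition_prefix(dataset: str, day: date, country: str) -> str:
    """Build a Hive-style S3 key prefix (key=value segments).

    Partitioning by country first keeps each jurisdiction's data in a
    separable prefix, which simplifies per-jurisdiction access control.
    """
    return (
        f"{dataset}/country={country.lower()}/"
        f"year={day.year}/month={day.month:02d}/day={day.day:02d}/"
    )
```

Zero-padding month and day keeps prefixes lexicographically sortable, which matters for S3 listing and partition projection.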
Support MLOps Lifecycle
Work closely with data scientists to deploy models using SageMaker Pipelines, update the Feature Store, and set up triggers for model retraining.
Ensure Precision & Integrity
Monitor pipeline outputs and data quality dashboards. Every number we serve must be accurate, traceable, and reproducible, always.
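Concretely, a data-quality gate can be as simple as a function that scans a batch before it is published and reports violations. This is a minimal sketch; the field names and rules are illustrative assumptions, not our real checks.

```python
def check_batch(rows: list[dict]) -> list[str]:
    """Return human-readable data-quality violations for one batch.

    Field names and thresholds are hypothetical examples.
    """
    problems: list[str] = []
    if not rows:
        problems.append("batch is empty")
    for i, row in enumerate(rows):
        # Every record must carry a non-empty primary key.
        if row.get("loan_id") in (None, ""):
            problems.append(f"row {i}: missing loan_id")
        # Amounts must be non-negative integers (minor currency units).
        amount = row.get("amount_minor_units")
        if not isinstance(amount, int) or amount < 0:
            problems.append(f"row {i}: invalid amount {amount!r}")
    return problems
```

A pipeline step would fail (or quarantine the batch) whenever the returned list is non-empty, so bad data never reaches a dashboard silently.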
Automate, Audit & Secure
Use infrastructure-as-code (Pulumi or Terraform) to build reproducible infrastructure. Implement logging, versioning, and KMS-encrypted security practices.
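For a sense of what this looks like in Pulumi's Python SDK, here is a sketch of a versioned, KMS-encrypted data-lake bucket. Resource names are placeholders, and this is an illustrative fragment under the assumption of the classic `pulumi_aws` provider, not our actual infrastructure.

```python
import pulumi_aws as aws

# Customer-managed KMS key with automatic rotation for data-lake objects.
lake_key = aws.kms.Key("lake-key", enable_key_rotation=True)

# Versioned bucket with default server-side encryption tied to that key.
lake_bucket = aws.s3.Bucket(
    "data-lake",
    versioning=aws.s3.BucketVersioningArgs(enabled=True),
    server_side_encryption_configuration=aws.s3.BucketServerSideEncryptionConfigurationArgs(
        rule=aws.s3.BucketServerSideEncryptionConfigurationRuleArgs(
            apply_server_side_encryption_by_default=(
                aws.s3.BucketServerSideEncryptionConfigurationRuleApplyServerSideEncryptionByDefaultArgs(
                    sse_algorithm="aws:kms",
                    kms_master_key_id=lake_key.arn,
                )
            ),
        ),
    ),
)
```

Because the key and bucket are declared in code, every environment gets the same encryption and versioning posture, and changes show up in review as a diff.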
Collaborate with Impact
Work across analytics engineering and credit teams to understand operational needs and translate them into technical pipelines and products.
Required Skills & Experience
Willing to take ownership (not just pass the buck)
Willing to learn
4 years in data engineering or cloud data architecture roles
Solid experience with the AWS data stack: S3, Glue, Athena, Lambda, Step Functions
Proficient in SQL, Python, and PySpark
Comfort with data lake architecture and handling semi-structured data (Parquet, JSON)
Experience with MLOps, model deployment pipelines, and monitoring
Exposure to infrastructure-as-code (Pulumi preferred, Terraform acceptable)
Familiarity with secure data handling and anonymization best practices
Nice to Have
Experience with event-driven data flows (e.g. Kafka, EventBridge)
Familiarity with SageMaker Feature Store and SageMaker Pipelines
Background in financial services, credit scoring, or mobile money ecosystems
Passion for building ethical, inclusive financial systems
Full Time