Job title: Lead Analytics Engineer
Reporting to: Head of Data Engineering
Location: Cape Town (Hybrid)
ALL STAFF APPOINTMENTS WILL BE MADE WITH DUE CONSIDERATION OF THE COMPANY'S EE TARGETS
WHAT WE DO
Lula is an innovative and human-focused FinTech company on a mission to help small businesses optimise their cash flow. Our purpose is to help SMEs manage their businesses better, faster, and more simply, so they can spend more time doing what they love.
If you're looking for a new place to call home that believes in the potential of the broader SME landscape in South Africa, and a place where you'll work with awesome people, then Lula's the place for you!
We're making business banking fast, human, Lula!
OUR VALUES
Collaborative: we're a clan and work together as a team, always towards a common goal
Committed: we're accountable and follow through, no matter the challenge
Curious: we look for better ways to do things and make a positive difference
Connected: we stay close to, learn from, and look to understand each other and our customers
Compassionate: we go out of our way to care about our colleagues, our customers, and our community
OVERALL PURPOSE
We are seeking a Lead Analytics Engineer specialising in FinTech, with a strong preference for experience in the Banking and/or Credit industry. This is a hands-on leadership role that will be instrumental in building, maintaining, and optimising our analytics infrastructure. You will lead a team of engineers while actively working to design and develop data models and business data workflow optimisations, and ensuring data quality across all analytics and data science processes. This role combines leadership, technical expertise, and strategic vision to drive data-driven decision-making across the organisation.
Key Responsibilities:
- Lead and Mentor: Manage and guide a team of analytics engineers, ensuring best practices in coding, data modelling, and engineering.
- Hands-on Engineering: Actively participate in designing, building, and optimising scalable data pipelines (ETL/ELT) using tools like DBT, SQL, and Snowflake.
- Data Warehousing: Architect, implement, and maintain our data warehouse (Snowflake) to ensure data is accessible, reliable, and optimised for performance.
- DBT Implementation: Design and develop DBT models and workflows to transform raw data into actionable insights, empowering data scientists and analysts with clean, structured data for analysis and operational implementations.
- Data Quality and Governance: Implement robust data validation and governance processes to ensure accuracy and consistency across data pipelines and models.
- Collaboration: Work cross-functionally with data science, product, and business teams to understand and anticipate data requirements.
- Optimisation: Continuously improve the performance, scalability, and efficiency of data models and pipelines, driving best practices in data architecture.
- Strategic Initiatives: Contribute to the overall data strategy and help drive the adoption of new tools and technologies as required by business needs.
THE COMPETENCIES WE'RE AFTER
- Clear and concise communication and documentation skills
- Proven ability to lead and mentor a team of engineers, driving high performance
- Strong cross-functional collaboration skills, with the ability to translate business needs into technical solutions
- Process-oriented, with experience in Agile
- Critical thinking skills
- Problem-solving abilities, with a focus on proactive issue resolution
- Skilled in balancing multiple projects and delivering on time
- Focused on high-quality output
- Self-starter
THE SKILLS AND EXPERIENCE WE'RE LOOKING FOR
- Bachelor's degree in Computer Science, Data Science, or a related field. A Master's degree is a plus.
- 5 years of experience as a Data/Analytics Engineer, with 2 years in a leadership or senior role.
- Strong familiarity with the FinTech, Banking, or Credit industries.
- Expertise in data warehousing technologies, particularly Snowflake.
- Extensive experience using DBT to develop complex data models.
- Deep knowledge of data integration, ETL/ELT pipelines, and orchestration frameworks.
- Advanced SQL skills and experience optimising complex queries for performance.
- Experience with cloud data platforms (AWS, GCP, or Azure) and associated data storage/processing services.
- Proficiency in Python or other programming languages for data processing.
Nice to Have:
- Experience with BI tools like Looker, Tableau, or Power BI.
- Familiarity with data cataloguing tools and data governance frameworks.
- Knowledge of data protection and financial regulations (e.g. GDPR, PCI DSS) and how they impact data management.
Please note that all appointments are subject to our background checking process, which may include credit, criminal, and any other job-inherent checks.