Description
The AI Engineer is a key role in the Data Platform Portfolio team, building the data platform that drives value from data across the business.
You will combine strong technical skills and business acumen to help turn millions of potential data points into actionable insights that can drive product improvements, make our customer acquisition more efficient, improve our customer retention rates and drive operating efficiencies across the business.
The primary goals of the team are:
- To build and run a data platform that creates and delivers analytics to colleagues and supports regulatory reporting
- To ingest and transform data from multiple systems, modelling data and engineering data marts to create reusable data assets
- To create a self-service BI platform enabling colleagues across Rentokil Initial to get value from data
- To build AI models and a data science platform that enable Rentokil to derive huge value from AI, from machine learning to generative AI and beyond
Key tasks:
- AI Data Engineering: Design, build, operate and deploy real-time data pipelines at scale using AI techniques and best practices. Support Rentokil's AI R&D efforts by applying advanced data warehousing, data science and data engineering technologies. Aim for automation to enable a faster time-to-market and better reusability of new AI initiatives.
- Collaboration: Work in tandem with the AI team to collect, create, curate and maintain high-quality AI datasets. Ensure alignment of data architecture and data models across different products and platforms.
- Hands-on Involvement: Engage in data engineering tasks as required to support the team and its projects. Conduct and own external data collection efforts, including advanced prompt engineering techniques, to support the construction of state-of-the-art AI models.
- Develop, fine-tune and optimise large language models (LLMs) for corporate use cases, such as querying structured data or automating analytics workflows.
- Design and refine prompts to improve LLM performance in structured data querying and other business-specific applications.
- Integrate LLMs with structured data systems (e.g. SQL databases, BigQuery, GCS) to enable natural language querying and advanced analytics.
- Implement MLOps/LLMOps pipelines for deploying LLMs in production, monitoring their performance and ensuring scalability.
- Evaluate LLM performance, optimise hyperparameters and ensure alignment with business objectives.
- Develop and maintain data integration processes to ensure data quality and accuracy in the data platform.
Requirements
Data Engineering and Preprocessing
- Extensive experience in data collection, preprocessing and integration from various sources, ensuring accuracy and consistency and handling missing values or outliers.
- Proficient in designing and implementing ELT pipelines using tools like dbt, with strong knowledge of data warehousing, data lake concepts and data pipeline optimisation.
- Skilled in SQL for data manipulation, analysis, query optimisation and database design.
Artificial Intelligence and Machine Learning
- Understanding of machine learning algorithms (classification, regression, clustering) and their practical applications.
- Hands-on experience with natural language processing (NLP) techniques and with developing custom solutions using large language models (LLMs) for business use cases.
- Proficient in Python-based AI/ML development using frameworks such as TensorFlow, PyTorch and scikit-learn.
LLM Orchestration and Development
- Expertise in building LLM-powered applications using frameworks such as LangChain and LangGraph, including prompt engineering, fine-tuning and workflow orchestration.
- Skilled in integrating LLMs with structured data systems (e.g. SQL databases, BigQuery) to enable natural language querying and advanced analytics.
MLOps/LLMOps
- Proficient in designing and implementing MLOps/LLMOps pipelines for model deployment, monitoring, version control and CI/CD workflows.
- Strong understanding of model performance evaluation, hyperparameter tuning and model maintenance using tools such as Vertex AI Pipelines.
Cloud Computing (Google Cloud Platform (GCP) preferred)
- Hands-on experience with GCP services such as Vertex AI, BigQuery, Cloud SQL and Google Cloud Storage (GCS) for AI/ML applications.
- Skilled in containerisation (Docker) and orchestration (Kubernetes, GKE), with a solid understanding of cloud security best practices.
Benefits
- Competitive salary and bonus scheme
- Hybrid working
- Rentokil Initial Reward Scheme
- 23 days holiday plus 8 bank holidays
- Employee Assistance Programme
- Death in service benefit
- Healthcare
- Free parking
At Rentokil Initial, our customers and colleagues represent diverse backgrounds and experiences. We take pride in being an equal opportunity employer, actively encouraging applications from individuals from all walks of life. We believe that everyone, irrespective of age, gender, gender identity, gender expression, ethnicity, sexual orientation, disability, religion or beliefs, has the potential to thrive and contribute.
We embrace the differences that make each of our colleagues unique, fostering an inclusive environment where everyone can be their authentic selves and feel a sense of belonging. To ensure that your journey with us is accessible, we invite you to communicate any specific needs or preferences you may have at any stage of the recruitment process. Our team is available to support you; feel free to reach out if you need anything.
Be Yourself in Your Application! At Rentokil Initial we value innovation, but we want to see the real you! While AI can help with structure and grammar, make sure your application shows your true passion and understanding of the role. A personal touch will help you stand out.