Job Description:
The company's Artificial Intelligence (AI) ambition relies strongly on the design and evolution of IT capabilities providing robust, industrial, scalable, off-the-shelf services that enable the delivery of business AI use cases.
Under the leadership of the DAIA program, a set of squads within the Data & AI Tribe is working on various projects in agile mode to develop the required cloud-native solutions. These squads must now source external skills and resources to cope with the acceleration of the roadmap. These resources are expected to have a full understanding of this context and its implications, to be at the forefront of the related technologies, and to have solid experience in the industrialization and integration of such solutions in complex, international, distributed, and highly secured cloud environments.
As part of this strategy, the DAIA squad is building a Conversational AI Platform (CAP), which will enable countries to benefit from an off-the-shelf internal RAG tool bundled with a bot widget to manage their customer/partner relationships and provide a more secure digital service.
Besides this platform, the squad manages other bot providers that are used in some of our 30 countries.
To manage this platform, the DAIA squad works closely with several internal teams such as Analytics, Data Science, ITG, Procurement, and the countries, making it a very diverse project with a complete view of AI subjects.
Main Tasks and Responsibilities:
- As a Conversational Agent Use Cases Developer / Administrator, you will contribute to the run and change activities related to the usage of the CAP platform, as well as to the day-to-day life of the DAIA squad to which you are directly assigned.
- Work closely with the squad's Tech Leads, as well as with the Tribe Tech Lead, the Product Owner, and other team members (notably Business Analysts, who will help you gain a clear understanding of business needs). You will also take part in all the squad's Agile rituals.
- Bring expertise on topics such as APIs, RAG, data pipeline development (Airflow), and data preparation (Elasticsearch), where a strong command of Python will be required. To support the team's work on existing bot use cases and those currently being deployed, proven experience with Cloud, DevSecOps, and CI/CD topics is required.
- Onboard new use cases, including country / new use-case follow-up, technical support, and environment creation.
- Scale these new use cases and draft a playbook to help accelerate the onboarding and deployment of new use cases.
- Follow up on incidents and monitor existing bots.
- Technical skills:
- Stay up to date with Data, AI, DevSecOps, AIOps, and Private Cloud industry trends, emerging technologies, and innovative solutions to continuously improve our capabilities.
- Proficiency in Python and JavaScript programming.
- Data pipeline development: Airflow
- Experience with API management (e.g. Apigee)
- Experience with data preparation (Elasticsearch)
- Collaborate with subject matter experts to understand requirements and adapt tools accordingly.
- A good cybersecurity culture is expected.
- Collaborate with architects and wider technical leadership to define and implement technology strategies and roadmaps.
- Communication and Collaboration:
- Collaborate with other teams such as DevOps, MLOps, AIOps, Data, QA, and Security to ensure seamless integration and delivery of technical components.
- Participate in agile ceremonies such as sprint planning, backlog review, daily stand-ups, sprint reviews, and retrospectives to ensure alignment and effective collaboration.
- Assist in the integration of new AI tools and technologies.
- Understanding of the data lifecycle and the ability to use user feedback for continuous improvement.
- Strong collaboration and teamwork skills.
- Attention to detail and rigor.
- Adaptability and ability to manage change.
- Excellent oral and written communication skills.
- Problem-Solving and Troubleshooting:
- Troubleshoot complex technical issues and provide solutions to ensure application stability and performance.
- Collaborate with the team to identify and prioritize technical debt and develop plans to address it.
- Actively monitor the application and address vulnerabilities at the security, application, and dependency levels.
Qualifications and Technical Skills:
- 5 years of experience in development
- Strongly skilled in one or more programming languages, such as Python
- Strong experience with web services (contract-first specification and API management)
- Experience with Data & AI pipelines, such as document ingestion for RAG, data processing, AI model orchestration, and event-driven architecture
- Experience with process orchestration tools such as Airflow
- Proven experience with cloud-based technologies (containerization, e.g. Docker; orchestration, e.g. Kubernetes, Helm)
- Experience with DevOps tools such as Jenkins, GitLab CI/CD, and Argo CD
- Experience with vector databases such as Elasticsearch; basic skills in generative AI and RAG
- Experience with software development methodologies such as Agile
- Knowledge of security best practices
- Experience with scripting (e.g. shell scripting) and automation
- Experience designing and developing applications based on a microservices architecture
Language Skills:
Remote Work :
No
Employment Type :
Full-time