The team addresses many use cases for Data Science involving a variety of Machine Learning and AI technologies. For example:
- Scoring the risk of business failure with Machine Learning models
- Extracting information from images and documents with Transformers and LLMs
- Identifying companies with a custom search engine using embeddings
- Building and mining knowledge graphs
- Financial modeling and simulations
The Data Labs work relies on solid Software Engineering practices to achieve operational integrations into internal tools and client applications. The technology stack is mainly based on Python but also includes ReactJS and Java. It covers data extraction from the company's databases and management of CI/CD pipelines to deploy applications and APIs on Docker/Kubernetes infrastructures, on Apache Airflow, and in the Cloud.
The team continuously delivers several projects with short development lifecycles and operational deployments of impactful solutions. For example:
- APIs that leverage our data for major international accounts
- Web scraping and open data sourcing in all geographies
- Data processing pipelines to feed Machine Learning models and AI solutions
- User Interfaces to expose AI-powered solutions developed by the team
Your mission:
- You work in pairs or small teams on innovative projects to design and develop solutions deployed on our container infrastructures or in the cloud, and you implement production monitoring
- You participate in code reviews with other team members to share knowledge and ensure the quality of developments
- You propose technical solutions and keep an active watch on emerging technologies.
Qualifications:
The profile we are looking for:
- Engineering degree in Software Engineering or equivalent.
- Minimum 2 years of demonstrated experience as a Software Engineer
- Python application development with high quality standards, applying best practices through code reviews and continuous integration.
- Knowledge of CI/CD pipelines and Docker/Kubernetes deployments.
- Use of SQL and NoSQL databases (MongoDB, Neo4j, Elasticsearch, Redis, etc.)
- Implementation of monitoring dashboards (Kibana, Grafana)
- Autonomy, curiosity, teamwork, and knowledge sharing
- English and French are mandatory: you will interact with teams all around the world
Other appreciated skills:
- Designing high-availability, low-latency application architectures
- Skills in data science libraries (e.g. Pandas)
- Knowledge of Linux administration
- Cloud application deployment and infrastructure-as-code
- Front-end programming (React, Angular, etc.)
- DevOps
Additional Information:
Recruitment process:
- Technical interview (video), including a review of the candidate's professional background; estimated duration: 1 hour
- Technical assessment in Python and DevOps, without AI assistance, preferably in our offices; estimated duration: 3 hours
- Interview with the Human Resources department; estimated duration: 45 minutes
- Job offer
Hybrid position
Remote Work Benefits
All our employees benefit from 3 days of remote work per week and a maximum monthly subsidy of 30. Employees receive vouchers worth 12 for each day of remote work.
Language Training
Depending on the type of employment contract, Coface provides employees with access to an e-learning platform dedicated to learning 6 foreign languages.
Remote Work Equipment
An equipment allowance of 350 for remote work is available to employees depending on the type of employment contract.
Electric Bike Purchase
A contribution towards the purchase of an electric bike is available to employees depending on the type of employment contract.
Remote Work:
No
Employment Type:
Full-time