Overview:
Our client is looking for a skilled AI Engineer to join their team. This position operates within a large-scale insurance data environment spanning customer, agent, transaction, and policy datasets, and focuses on applying advanced AI to generate business value.
You will be responsible for developing and deploying scalable solutions using Generative AI and predictive modeling, turning data from an Azure-based data lake into meaningful insights and intelligent, automated capabilities. The role involves close collaboration with data engineers, platform teams, and business stakeholders to bring AI solutions into production at an enterprise level.
Key Responsibilities:
AI & Model Development:
- Build, train, and deploy predictive ML models and LLM-based applications, including Retrieval-Augmented Generation (RAG) solutions
- Utilize Azure Machine Learning and Azure AI Foundry to create scalable production-grade AI systems
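The RAG pattern above can be sketched without any cloud dependencies. In production, embedding and generation would call Azure OpenAI Service deployments; here a toy bag-of-words "embedding" and invented insurance snippets stand in purely for illustration:

```python
from collections import Counter
from math import sqrt

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; in production this would call an
    # Azure OpenAI embeddings deployment instead.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
    # Rank documents by similarity to the query and keep the top k.
    q = embed(query)
    ranked = sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query: str, documents: list[str]) -> str:
    # Ground the retrieved passages into the prompt sent to the LLM.
    context = "\n".join(retrieve(query, documents))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

# Invented sample documents, for illustration only.
docs = [
    "Policy P-100 covers flood damage for residential customers.",
    "Agent commissions are paid monthly via bank transfer.",
    "Claims for flood damage must be filed within 30 days.",
]
print(build_prompt("What does flood coverage include?", docs))
```

The same retrieve-then-prompt shape carries over when the toy pieces are swapped for real embeddings and a vector index.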
Data Integration & Collaboration:
- Work with data engineering teams to ingest and process large datasets (e.g. Parquet, CSV, text) from Azure Data Lake Storage and Azure Synapse
- Ensure smooth alignment between data pipelines and AI processes
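As a sketch of the ingestion step, the example below parses delimited text with the standard library and derives a simple aggregate. In the actual environment the records would typically be read from Parquet in Azure Data Lake Storage via pandas or Spark; the column names and sample values here are invented:

```python
import csv
import io

# Invented sample of policy records as they might land from the data lake.
raw = """policy_id,customer_id,premium
P-100,C-1,1200.50
P-101,C-2,980.00
P-102,C-1,300.25
"""

def load_records(text: str) -> list[dict]:
    # csv.DictReader handles headers and quoting; premiums are cast to
    # float so downstream feature code receives numeric types.
    rows = list(csv.DictReader(io.StringIO(text)))
    for row in rows:
        row["premium"] = float(row["premium"])
    return rows

def premium_by_customer(rows: list[dict]) -> dict[str, float]:
    # Aggregate total premium per customer across their policies.
    totals: dict[str, float] = {}
    for row in rows:
        totals[row["customer_id"]] = totals.get(row["customer_id"], 0.0) + row["premium"]
    return totals

records = load_records(raw)
print(premium_by_customer(records))
```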
Application Architecture:
- Develop serverless orchestration using Azure Functions to connect AI models with applications and APIs
- Support both real-time and batch inference scenarios
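One way to support both inference modes behind a shared scoring function: in practice the entry points would be Azure Functions HTTP and timer/queue triggers calling a deployed model endpoint, whereas the model and its inputs below are placeholders:

```python
from typing import Iterable

def score(features: dict) -> float:
    # Placeholder model; in production this would call a deployed
    # Azure Machine Learning endpoint instead of computing locally.
    return min(1.0, 0.1 * features.get("claims", 0) + 0.01 * features.get("age", 0))

def predict_realtime(features: dict) -> dict:
    # Real-time path: one record in, one scored record out (HTTP trigger).
    return {"score": score(features), "mode": "realtime"}

def predict_batch(batch: Iterable[dict]) -> list[dict]:
    # Batch path: the same scoring code reused over many records
    # (timer or queue trigger).
    return [{"score": score(f), "mode": "batch"} for f in batch]

print(predict_realtime({"claims": 2, "age": 40}))
print(predict_batch([{"claims": 0, "age": 30}, {"claims": 5, "age": 60}]))
```

Keeping one scoring function behind both paths avoids real-time and batch results drifting apart.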
Search & Storage Optimization:
- Implement Azure AI Search for fast retrieval of model outputs, embeddings, and metadata
- Help design efficient storage strategies for data and metadata
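A minimal in-memory stand-in for that retrieval layer, showing keyword search combined with metadata filtering; Azure AI Search provides this (plus vector search) at scale, and the document schema and filter fields here are invented:

```python
from dataclasses import dataclass, field

@dataclass
class Doc:
    doc_id: str
    text: str
    metadata: dict = field(default_factory=dict)

class TinyIndex:
    # Keyword index with metadata filters, standing in for an
    # Azure AI Search index over model outputs and metadata.
    def __init__(self) -> None:
        self.docs: list[Doc] = []

    def add(self, doc: Doc) -> None:
        self.docs.append(doc)

    def search(self, term: str, **filters: str) -> list[str]:
        # Return ids of documents containing the term whose metadata
        # matches every supplied filter.
        term = term.lower()
        return [
            d.doc_id
            for d in self.docs
            if term in d.text.lower()
            and all(d.metadata.get(k) == v for k, v in filters.items())
        ]

idx = TinyIndex()
idx.add(Doc("a1", "Flood claim summary for policy P-100", {"line": "home"}))
idx.add(Doc("a2", "Auto claim summary for policy P-200", {"line": "auto"}))
print(idx.search("claim", line="home"))
```

Storing filterable metadata alongside each document is what makes storage strategy and search design interlock.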
MLOps & Operational Excellence:
- Apply MLOps practices such as model versioning, monitoring, and lifecycle management
- Integrate AI workflows into CI/CD pipelines for automated testing and deployment
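The versioning and lifecycle ideas above can be illustrated with a toy registry. Real deployments would use the Azure Machine Learning model registry; the stage names below follow a common staging/production/archived convention but are assumptions, not the posting's own:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ModelVersion:
    name: str
    version: int
    stage: str = "staging"  # assumed lifecycle: staging -> production -> archived

class Registry:
    def __init__(self) -> None:
        self.versions: dict[str, list[ModelVersion]] = {}

    def register(self, name: str) -> ModelVersion:
        # Each registration gets the next monotonically increasing version.
        vs = self.versions.setdefault(name, [])
        mv = ModelVersion(name, version=len(vs) + 1)
        vs.append(mv)
        return mv

    def promote(self, name: str, version: int) -> None:
        # Promoting one version archives whatever was in production:
        # the core invariant of lifecycle management.
        for mv in self.versions[name]:
            if mv.stage == "production":
                mv.stage = "archived"
            if mv.version == version:
                mv.stage = "production"

    def production(self, name: str) -> Optional[ModelVersion]:
        return next((m for m in self.versions[name] if m.stage == "production"), None)

reg = Registry()
reg.register("churn")    # v1
reg.register("churn")    # v2
reg.promote("churn", 1)
reg.promote("churn", 2)  # v1 archived, v2 now serving
print(reg.production("churn"))
```

In a CI/CD pipeline, the promote step would run only after automated evaluation gates pass.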
Requirements:
Technical Requirements:
Azure AI & Cloud:
- Practical experience with Azure Machine Learning, Azure AI Foundry, and Azure OpenAI Service
Data & Analytics:
- Strong knowledge of Azure Data Lake Storage and Azure Synapse Analytics
- Experience handling large-scale data environments and working with formats like Parquet
Programming & Frameworks:
- Proficiency in Python for AI/ML development
- Familiarity with Scala and Apache Spark in enterprise ETL contexts
Backend & Storage:
- Experience building serverless applications using Azure Functions
- Knowledge of Azure Cosmos DB or similar NoSQL databases
DevOps & Automation:
- Experience with CI/CD tools such as Azure DevOps or GitHub Actions for automating ML workflows
Preferred Qualifications:
- Background in insurance or financial services
- Experience deriving insights from complex customer and transaction data
- Hands-on experience building RAG-based AI systems
- Familiarity with distributed data processing using Azure Databricks
Required Education:
JLPT N1