Design and configure enterprise-scale Azure Databricks workspaces optimized for high-severity insurance use cases.
Architect and implement Unity Catalog as the central governance layer managing access controls, lineage, and data compliance.
Develop ingestion pipelines using Lakeflow Connect and Auto Loader to integrate data from Policy Administration Systems, Systems of Record (SOR), and legacy file servers.
Deploy and manage Mosaic AI Vector Search and Model Serving endpoints to support Retrieval Augmented Generation (RAG) workflows.
Establish and govern Databricks Agent Services frameworks to enable supervisor-worker AI architectures.
Standardize platform observability, including MLflow lifecycle management, model performance tracking, auditability, and system health monitoring.
Ensure secure Azure infrastructure integration using VNet injection, Private Link, and Entra ID (Azure AD).
Lead CI/CD integration and orchestration using Databricks Jobs for multi-task scheduling and automated deployments.
Ensure compliance with regulatory and data privacy standards for sensitive insurance litigation and claimant data.
Qualifications:
Strong expertise in Azure cloud architecture, including networking and security best practices.
Advanced knowledge of Azure Databricks, including Unity Catalog (Volumes, Catalogs, Schemas), Serverless Compute, and Delta Live Tables (DLT).
Experience implementing enterprise-grade data governance frameworks.
Hands-on experience with GenAI tooling, including Vector Search indexing and agent evaluation frameworks.
Proven experience integrating structured and unstructured data from legacy systems.
Strong understanding of ML lifecycle management and observability best practices.
Solid background in CI/CD processes and engineering orchestration.
Excellent communication and stakeholder management skills.
Experience in Property & Casualty (P&C) or Commercial Liability insurance domains is highly preferred.
Databricks Solutions Architect Professional or Azure Solutions Architect Expert certification is a plus.
What about languages?
An advanced level of English (written and spoken) is required, as the role involves collaboration with cross-functional and client-facing teams.
How much experience must I have?
Minimum of 7 years of experience in cloud architecture, with a proven track record of delivering enterprise-scale Databricks implementations.
Additional Information:
Our Perks and Benefits:
Learning Opportunities:
- Certifications in AWS (we are AWS Partners), Databricks, and Snowflake.
- Access to AI learning paths to stay up to date with the latest technologies.
- Study plans, courses, and additional certifications tailored to your role.
- Access to Udemy Business, offering thousands of courses to boost your technical and soft skills.
- English lessons to support your professional communication.
Travel opportunities to attend industry conferences and meet clients.
Mentoring and Development:
- Career development plans and mentorship programs to help shape your path.
Celebrations & Support:
- Special day rewards to celebrate birthdays, work anniversaries, and other personal milestones.
- Company-provided equipment.
Flexible working options to help you strike the right balance.
Other benefits may vary according to your location in LATAM. For detailed information regarding the benefits applicable to your specific location, please consult with one of our recruiters.
So, what are the next steps?
Our team is eager to learn about you! Send us your resume or LinkedIn profile below, and we'll explore working together!
Remote Work:
Yes
Employment Type:
Full-time