Data Developer with Snowflake, P&C Insurance Data Engineering, Python, and Talend
Role Overview
We are seeking an experienced Data Developer / Data Engineer with strong insurance domain expertise to design, develop, and maintain scalable data solutions supporting analytics, reporting, and operational use cases. This role requires hands-on experience with modern cloud data platforms, including Snowflake, and strong programming skills in Python, along with a deep understanding of insurance data.
The ideal candidate will bridge technical data engineering capabilities with insurance business knowledge to deliver high-quality, reliable data assets.
Key Responsibilities
- Design, build, and maintain end-to-end data engineering pipelines (ETL/ELT) for insurance data
- Develop and optimize data solutions using Snowflake as a cloud data warehouse
- Use Python to support data ingestion, transformation, automation, and orchestration (a minimal sketch follows this list)
- Integrate data from core insurance systems (Policy, Claims, Billing, Underwriting, Reinsurance)
- Model insurance data for analytical, actuarial, financial, and regulatory reporting use cases
- Write complex, high-performance SQL queries and transformations
- Ensure data quality, validation, lineage, and governance standards
- Collaborate with business stakeholders, analysts, and architects to translate insurance requirements into technical solutions
- Troubleshoot and resolve data pipeline performance and data integrity issues
- Document data models, pipelines, and best practices
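For illustration only, a minimal sketch of the kind of ingestion work described above, using the snowflake-connector-python package: it stages a local claims extract, bulk-loads it with COPY INTO, and applies a simple data-quality gate. Every identifier here (the warehouse, database, stage, file path, CLAIMS_RAW table, and POLICY_ID column) is a hypothetical placeholder, not a prescribed implementation.

```python
# Minimal sketch: load a daily claims extract into Snowflake.
# All identifiers (warehouse, database, table, file path) are hypothetical.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    warehouse="ETL_WH",
    database="INSURANCE_DW",
    schema="RAW",
)
try:
    cur = conn.cursor()
    # Stage the local extract, then bulk-load it with COPY INTO.
    cur.execute("CREATE STAGE IF NOT EXISTS claims_stage")
    cur.execute("PUT file:///data/claims_extract.csv @claims_stage OVERWRITE = TRUE")
    cur.execute(
        """
        COPY INTO CLAIMS_RAW
        FROM @claims_stage
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        ON_ERROR = 'ABORT_STATEMENT'
        """
    )
    # Simple data-quality gate: fail the run if any claim lacks a policy key.
    cur.execute("SELECT COUNT(*) FROM CLAIMS_RAW WHERE POLICY_ID IS NULL")
    null_keys = cur.fetchone()[0]
    if null_keys:
        raise ValueError(f"{null_keys} claims rows are missing POLICY_ID")
finally:
    conn.close()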
Required Qualifications
- 5 years of experience in Data Development or Data Engineering roles
- Strong insurance domain experience (P&C, Life, Health, or Specialty Insurance)
- Hands-on experience with Snowflake (data modeling, performance tuning, security, cost optimization)
- Proficiency in Python for data processing and automation
- Experience building and maintaining scalable data pipelines
- Strong understanding of insurance data concepts (policies, premiums, claims, losses, exposures); a basic loss-ratio query is sketched after this list
- Experience working with large, complex datasets
- Strong analytical, troubleshooting, and communication skills
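To ground the insurance-data concepts above: one standard reporting measure is the loss ratio, incurred losses divided by earned premium. The sketch below shows how that might be computed in Snowflake SQL over an existing connection (such as the one opened in the earlier sketch); the table and column names (policy_premiums, claim_losses, earned_premium, incurred_loss) are hypothetical.

```python
# Minimal sketch: loss ratio (incurred losses / earned premium) by line of
# business, run against hypothetical policy_premiums and claim_losses tables.
LOSS_RATIO_SQL = """
SELECT p.line_of_business,
       SUM(COALESCE(l.incurred_loss, 0))
           / NULLIF(SUM(p.earned_premium), 0) AS loss_ratio
FROM policy_premiums p
LEFT JOIN claim_losses l
  ON l.policy_id = p.policy_id
GROUP BY p.line_of_business
ORDER BY loss_ratio DESC
"""

def loss_ratio_by_lob(conn):
    """Return (line_of_business, loss_ratio) rows from an open connection."""
    cur = conn.cursor()
    try:
        cur.execute(LOSS_RATIO_SQL)
        return cur.fetchall()
    finally:
        cur.close()
```

The LEFT JOIN with COALESCE keeps lines of business that earned premium but had no claims, and NULLIF avoids division by zero when a line has no earned premium.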
Preferred / Nice-to-Have Skills
- Cloud platforms: Azure, AWS, or GCP
- Azure Data Factory, Azure Data Lake, Databricks, Synapse
- Experience with orchestration tools (Airflow, Azure Data Factory, dbt, or similar); see the DAG sketch after this list
- Familiarity with BI and reporting tools (Power BI, Tableau, Looker)
- Experience with insurance platforms such as Guidewire, Duck Creek, or Majesco
- Knowledge of data governance, metadata management, and regulatory reporting
- Experience working in Agile/Scrum environments
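As a small illustration of the orchestration skills listed above, here is a minimal Airflow sketch (assuming Airflow 2.4+ for the `schedule` argument) that schedules a daily claims load. The DAG id, schedule, and task callable are hypothetical placeholders; in practice the callable would invoke a load like the Snowflake sketch shown earlier.

```python
# Minimal sketch: a daily Airflow DAG wrapping a claims-load step.
# DAG id, schedule, and task callable are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def load_claims():
    # In a real pipeline this would call the Snowflake load shown earlier.
    print("loading claims extract into Snowflake")

with DAG(
    dag_id="claims_daily_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="load_claims", python_callable=load_claims)
```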
Education
Bachelor's degree in Computer Science, Data Engineering, Information Systems, or a related field (or equivalent experience)