We're looking for an experienced Senior Data Engineer to join our growing Data Insights & Analytics team. You'll play a critical role in designing, building, and scaling the data infrastructure that powers our core products and client-facing insights.
In this role you'll architect data solutions using modern Azure technologies, including Microsoft Fabric, Synapse, Azure SQL, and Data Factory. You'll develop robust pipelines to process, transform, and model complex insurance data into structured, reliable datasets that fuel analytics, dashboards, and data science products.
In addition to your technical responsibilities, your daily routine will include participating in stand-up meetings, managing work items based on your capacity, collaborating with the growing technology team to define new projects and initiatives, and engaging in development activities. Beyond traditional data engineering tasks, you will interact directly with the teams developing the tools we use, enabling you to provide product feedback and see your input drive changes in those products over time.
Key Responsibilities
Design and build robust, scalable ETL/ELT pipelines using Azure Data Factory, Synapse, and Microsoft Fabric
Model and transform raw insurance data into structured datasets for reporting and analytics use cases
Collaborate with analysts, engineers, and business stakeholders to align data solutions with company goals
Optimize performance of data workflows and pipelines to support real-time and batch processing scenarios
Drive best practices in data governance, documentation, code quality, and DevOps automation
Monitor production workloads, troubleshoot pipeline failures, and support live environments
Evaluate new Azure data services and tools for potential adoption
Key Skills & Expertise
Data Engineering: Advanced ETL/ELT experience with large, complex data sets
Azure Stack: Strong knowledge of Azure Data Factory, Synapse Analytics, Azure SQL, and Microsoft Fabric
Spark Ecosystem: Experience with Spark development using PySpark and Spark SQL (especially in Fabric notebooks)
Data Modeling: Expertise in dimensional modeling, normalization, and schema design
Coding Proficiency: Fluent in Python, SQL, Spark SQL, and scripting for orchestration and automation
Performance Tuning: Familiarity with optimizing query performance, parallel processing, and cloud-based workloads
Qualifications
Bachelor's degree in Computer Science, Data Engineering, or a related field (Master's a plus)
5 years of experience in data engineering or analytics engineering roles
3 years working with Azure data services and cloud-native platforms
Experience with Microsoft Fabric is highly desirable
Proven ability to transform business requirements into scalable, maintainable data workflows
Experience working with Lloyd's of London bordereaux data is a strong plus, particularly in contexts involving ingestion, validation, and transformation of complex insurance data sets.