Group Manager - RNA

Job Location

Bangalore - India

Monthly Salary

Not Disclosed


Vacancy

1 Vacancy

Job Description

Position Overview: We are seeking a highly skilled Senior Data Engineer (Python, PySpark & Azure Databricks) to join our dynamic data engineering team. This role focuses on building scalable, high-performance data pipelines using Python and PySpark within the Azure Databricks environment. While familiarity with broader Azure services is valuable, the emphasis is on distributed data processing and automation using modern big data frameworks. Prior experience in the Property & Casualty (P&C) insurance industry is a strong plus.

Responsibilities:

Data Pipeline Development & Optimization:
- Design, develop, and maintain scalable ETL/ELT data pipelines using Python and Azure Databricks to process large volumes of structured and semi-structured data.
- Implement data quality checks, error handling, and performance tuning across all stages of data processing.

Data Architecture & Modeling:
- Contribute to the design of cloud-based data architectures that support analytics and reporting use cases.
- Build and maintain data models that adhere to industry best practices and support business needs.
- Work with Delta Lake, Bronze/Silver/Gold data architecture patterns, and metadata management.

Cloud Integration (Azure):
- Integrate and orchestrate data workflows using Azure Data Factory, Azure Blob Storage, and Event Hub where applicable.
- Optimize cloud compute resources and manage cost-effective data processing at scale.

Collaboration & Stakeholder Engagement:
- Partner with data analysts, data scientists, and business users to understand evolving data needs.
- Work with DevOps and platform teams to ensure reliable, secure, and automated data delivery.
- Work in Agile teams and contribute to sprint planning, demos, and retrospectives.

Documentation & Best Practices:
- Maintain clear and comprehensive documentation of code, pipelines, and architectural decisions.
- Adhere to internal data engineering standards and promote best practices for code quality, testing, and CI/CD.
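To illustrate the Bronze/Silver/Gold layering and data quality checks mentioned in the responsibilities, here is a minimal sketch in plain Python. The record layout and field names (policy_id, premium, state) are hypothetical examples; a production pipeline of the kind described would use PySpark DataFrames and Delta Lake tables on Databricks rather than in-memory lists.

```python
# Hypothetical sketch of Bronze/Silver/Gold (medallion) layering with
# simple data quality checks. Field names are illustrative only; a real
# pipeline would use PySpark DataFrames and Delta Lake, not plain lists.

RAW_POLICIES = [  # bronze: raw, untrusted records as ingested
    {"policy_id": "P-001", "premium": "1200.50", "state": "ka"},
    {"policy_id": "P-002", "premium": "not-a-number", "state": "mh"},
    {"policy_id": None, "premium": "900.00", "state": "ka"},
]

def to_silver(rows):
    """Cleanse bronze rows: enforce types, drop records failing quality checks."""
    silver = []
    for row in rows:
        if not row.get("policy_id"):          # quality check: key must be present
            continue
        try:
            premium = float(row["premium"])   # quality check: premium must be numeric
        except ValueError:
            continue
        silver.append({"policy_id": row["policy_id"],
                       "premium": premium,
                       "state": row["state"].upper()})
    return silver

def to_gold(rows):
    """Aggregate silver rows into a reporting-ready total premium per state."""
    totals = {}
    for row in rows:
        totals[row["state"]] = totals.get(row["state"], 0.0) + row["premium"]
    return totals

silver = to_silver(RAW_POLICIES)
gold = to_gold(silver)
print(gold)  # only P-001 passes both checks -> {'KA': 1200.5}
```

The same shape carries over to PySpark: the bronze layer lands data as-is, the silver step applies schema enforcement and filters, and the gold step produces aggregates for reporting.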


Qualifications:

Educational Background:
- Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field.
- A master's degree or relevant certifications are a plus.

Experience:
- 5-7 years of experience in data engineering with a strong focus on Python and PySpark.
- Hands-on experience building and maintaining Azure Databricks workloads in production.
- Background in designing large-scale data processing systems with structured and semi-structured data.

Technical Proficiency:
- Expert-level skills in Python and PySpark for distributed data processing.
- Experience with Delta Lake, Parquet, and other big data file formats.
- Proficiency in Azure data services, especially Databricks, ADF, and Azure Data Lake Storage (ADLS).
- Experience with CI/CD pipelines, source control (Git), and DevOps practices.
- Familiarity with SQL and data warehousing principles.

Domain Experience:
- Prior experience in the Property & Casualty (P&C) insurance industry is preferred.
- Knowledge of insurance datasets (e.g., policy data, claims, exposures) is a strong plus.

Soft Skills:
- Strong problem-solving and critical thinking skills.
- Strong verbal and written communication skills.
- Detail-oriented mindset with the ability to work independently in a fast-paced environment.
- Attention to detail and a strong commitment to data quality.

Certifications (Preferred):
- Microsoft Certified: Azure Data Engineer Associate
- Databricks Certified Data Engineer Associate or Professional
- Demonstrates proficiency in cloud-based big data platforms, Spark-based processing, and modern data engineering practices.


Remote Work:

No


Employment Type:

Full-time


