Data Architect with Informatica, Power BI, and Azure

Job Location: Erie, PA - USA

Monthly Salary: Not Disclosed

Vacancy: 1

Job Description

Position: Data Architect with Informatica, Power BI, and Azure

Location: Erie, PA (On-Site)

Duration: Long Term

 

Job Description:

Must have experience with Informatica, Power BI, and Azure.

 

About the Role:

We are seeking an experienced Data Architect to lead and design modern data solutions for a Property & Casualty (P&C) customer undergoing a major data modernization initiative involving Guidewire Claim Data Access (CDA). The ideal candidate will possess strong technical expertise, hands-on experience, and excellent communication skills to successfully deliver enterprise-grade data solutions in Azure/Informatica. This role requires a proactive problem solver who can troubleshoot and optimize complex data pipelines and workflows for maximum efficiency and reliability.

 

Key Responsibilities:

- Architect and implement enterprise metadata-driven data pipelines using ETL tools such as Azure Data Factory (ADF) and Informatica.

- Design and develop an Operational Data Store (ODS) sourced from Azure Data Lake, ensuring a robust, scalable, and high-performing architecture.

- Collaborate with stakeholders to integrate and optimize Guidewire data (CDA) into the data lake architecture, enabling advanced analytics and reporting.

- Troubleshoot and resolve issues in data pipelines, workflows, and related processes to ensure reliability and data accuracy.

- Continuously monitor and optimize current workflows for performance, scalability, and cost-efficiency, adhering to best practices.

- Develop and maintain custom processes using Python, T-SQL, and Spark, tailored to business requirements.

- Leverage Azure Functions to design serverless compute solutions for event-driven and scheduled data workflows.

- Optimize data workflows and resource usage to ensure cost-efficiency in Azure cloud environments.

- Provide leadership and guidance for implementing Hadoop-based big data solutions where applicable.

- Develop a comprehensive understanding of P&C domain data, ensuring alignment with business objectives and compliance requirements.

- Communicate technical solutions effectively with cross-functional teams, stakeholders, and non-technical audiences.

 

Required Qualifications:

- 13 years of experience in data architecture, data engineering, and/or ETL development roles, with at least 3 years in the P&C insurance domain.

- Proven experience with Azure cloud services, including Azure Data Lake, Azure Data Factory, and SQL Server.

- Leverage Informatica for robust ETL workflows, data integration, and metadata-driven pipeline automation to streamline data processing.

- Build end-to-end metadata-driven frameworks and continuously optimize existing workflows for improved performance, scalability, and efficiency.

- Strong knowledge of Guidewire Claim Data Access (CDA) or similar insurance domain data.

- Expertise in troubleshooting and optimizing data pipelines and workflows for enhanced reliability and performance.

- Proficiency in scripting and programming with Python, T-SQL, and Spark for custom data workflows.

- Hands-on expertise in building and managing ODS systems from data lakes.

- Experience with Azure Functions for serverless architecture.

- Familiarity with Hadoop ecosystems (preferred but not mandatory).

- Demonstrated ability to design solutions for Azure cloud cost optimization.

- Excellent communication skills to engage effectively with technical and business stakeholders.

- Experience with metadata management and data cataloging for large-scale data ecosystems.

 

Preferred Skills:

- Familiarity with Guidewire systems and their integration patterns.

- Experience in implementing Data Governance frameworks.

- Certification in Azure (e.g., Azure Data Engineer Associate or Azure Solutions Architect).

- Experience with other data platforms/tools such as Hadoop, Databricks, etc.

 

Regards,

 

Manoj

Derex Technologies INC

Contact: Ext 206


Additional Information:

All your information will be kept confidential according to EEO guidelines.

Remote Work: No

Employment Type: Full-time
