Data Engineer
Job Location

Princeton, NJ - USA

Hourly Salary

$70 - $80

Vacancy

1 Vacancy

Job Description

6-Month Contract Role

Hybrid: 3 days onsite at the Princeton, NJ site

Experience implementing Microsoft BI/Azure BI solutions such as Azure Data Factory, Azure Databricks, Azure Analysis Services, SQL Server Integration Services, and SQL Server Reporting Services. Strong understanding of Azure big data technologies such as Azure Data Lake Analytics, Azure Data Lake Store, and Azure Data Factory, and of moving data from flat files and SQL Server using U-SQL jobs.

  • Expert in data warehouse development, from inception through implementation and ongoing support, with a strong understanding of BI application design and development principles using normalization and de-normalization techniques. Experience developing staging-zone, bronze, silver, and gold data layers.
  • Good knowledge of implementing business rules for data extraction, transformation, and loading (ETL) between homogeneous and heterogeneous systems using Azure Data Factory (ADF).
  • Experience developing Databricks notebooks that move data from raw to stage and then to curated zones.
  • Experience developing complex Azure Analysis Services tabular databases, deploying them in Microsoft Azure, and scheduling cube processing through Azure Automation runbooks.
  • Extensive experience developing tabular and multidimensional SSAS cubes, aggregations, KPIs, measures, partitions, and data mining models, and deploying and processing SSAS objects.
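For candidates unfamiliar with the bronze/silver/gold ("medallion") layering mentioned above, here is a minimal toy sketch of the idea. Plain Python dicts stand in for Spark DataFrames; in Databricks the same steps would typically be notebook cells reading and writing Delta tables. All field names (`policy_id`, `premium`, `state`) are hypothetical examples, not part of this role's actual schema.

```python
# Toy sketch of bronze -> silver -> gold layering with plain Python.
# Assumption: dicts stand in for Spark DataFrames / Delta tables.

raw_rows = [
    {"policy_id": " P-001 ", "premium": "1200.50", "state": "nj"},
    {"policy_id": "P-002", "premium": "bad", "state": "NY"},
    {"policy_id": "P-001", "premium": "1200.50", "state": "nj"},  # duplicate
]

def to_bronze(rows):
    """Bronze: land raw records as-is, adding only ingestion metadata."""
    return [{**r, "_source": "flat_file"} for r in rows]

def to_silver(bronze):
    """Silver: enforce types, trim strings, drop bad and duplicate rows."""
    seen, silver = set(), []
    for r in bronze:
        pid = r["policy_id"].strip()
        try:
            premium = float(r["premium"])
        except ValueError:
            continue  # quarantine rows that fail type checks
        if pid in seen:
            continue  # de-duplicate on the business key
        seen.add(pid)
        silver.append({"policy_id": pid, "premium": premium,
                       "state": r["state"].strip().upper()})
    return silver

def to_gold(silver):
    """Gold: aggregate to the reporting grain (total premium by state)."""
    totals = {}
    for r in silver:
        totals[r["state"]] = totals.get(r["state"], 0.0) + r["premium"]
    return totals

gold = to_gold(to_silver(to_bronze(raw_rows)))
print(gold)  # {'NJ': 1200.5}
```

The same pattern scales up in ADF/Databricks: bronze preserves the raw feed for auditability, silver applies cleansing and conformance rules once, and gold serves curated aggregates to downstream reporting.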

Domain Knowledge (Preferred)

Experience with actuarial tools or the insurance industry is preferred. The intent is familiarity with data terminology and the hierarchy of data in the insurance domain, specifically in the areas below:

  • Familiarity with reinsurance broking data, including placements, treaty structures, client hierarchies, and renewal workflows.
  • Understanding of actuarial rating inputs and outputs, including exposure and experience data, layers, tags, and program structures.
  • Experience building data pipelines that support actuarial analytics, pricing tools, and downstream reporting for brokers and clients.

Team skills

  • Team builder with strong analytical and interpersonal skills, good knowledge of the Software Development Life Cycle (SDLC), and proficiency in technical writing.
  • Experience with Agile software development and the Scrum methodology.
  • Ability to work independently and as part of a team to accomplish critical business objectives, with good decision-making skills in high-pressure, complex scenarios.

Technical Skills:

Business Intelligence: Azure Data Factory (ADF), Azure Databricks, Azure Analysis Services (SSAS), Azure Data Lake Analytics, Azure Data Lake Store (ADLS), Azure Integration Runtime, Azure Event Hubs, Azure Stream Analytics, dbt

Database Technologies: Azure SQL, MongoDB, PySpark



Employment Type

Full Time
