Sr Data Engineer Azure Snowflake 2 openings TELECOMMUTE
Vision It US



Job Location

- USA

Monthly Salary

Salary Not Disclosed

Vacancy

1 Vacancy

Job Description

Req ID : 2621672

100% telecommute.

Description: You will put your data engineering skills to work as you empower business partners and team members to improve healthcare delivery. You will research cutting-edge big data tools and design innovative solutions to solve business problems that only a Data Engineer can. You'll be in the driver's seat on vital projects of strategic importance to our mission of helping people live healthier lives. Yes, we share a mission that inspires, and we need your organizational talents and business discipline to help fuel that mission.

You will be part of a team focused on building a cutting-edge data analytics platform to support reporting requirements for the business. As a Senior Data Engineer, you will be responsible for developing complex data sources and pipelines into our data platform (i.e., Snowflake), along with other data applications (e.g., Azure, Airflow) and automation.

This is a fully remote role based in the United States. Your counterpart team is located in our Dublin, Ireland office. While there is no requirement to work shift hours, there may be an occasional call with the Dublin team, which can require flexible working.

Responsibilities:
Create & maintain data pipelines using Azure & Snowflake as primary tools
Create SQL stored procedures and macros to perform complex transformations
Create logical & physical data models to ensure data integrity is maintained
Create and automate CI/CD pipelines using Git & GitHub Actions
Tune and optimize data processes
Design and build best-in-class processes to clean and standardize data
Deploy code to the production environment and troubleshoot production data issues
Model high-volume datasets to maximize performance for our BI & Data Science teams
Create Docker images for various applications and deploy them on Kubernetes
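One of the responsibilities above, cleaning and standardizing data, can be sketched in plain Python. The record fields and formats here ("member_id", "state", "visit_date") are hypothetical, since the posting does not describe the actual schema; this is only a minimal sketch of the pattern:

```python
from datetime import datetime

def standardize_record(raw: dict) -> dict:
    """Clean one raw record: trim strings, normalize casing and dates.

    Field names are hypothetical; the posting does not describe the
    actual schema.
    """
    return {
        "member_id": raw["member_id"].strip(),
        "state": raw["state"].strip().upper(),  # e.g. " ny " -> "NY"
        "visit_date": datetime.strptime(raw["visit_date"].strip(), "%m/%d/%Y")
                              .strftime("%Y-%m-%d"),  # normalize to ISO 8601
    }

rows = [{"member_id": " A100 ", "state": " ny ", "visit_date": "01/05/2024 "}]
clean = [standardize_record(r) for r in rows]
```

Keeping standardization in small pure functions like this makes the cleaning step unit-testable before any data is loaded into the warehouse.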

Required:
Bachelor's degree in Computer Science or similar
Min 3-6 years of industry experience as a hands-on data engineer
Excellent communication skills
Excellent knowledge of SQL and Python
Excellent knowledge of Azure services such as Blobs, Functions, Azure Data Factory, Service Principal, Containers, Key Vault, etc.
Excellent knowledge of Snowflake architecture and best practices
Excellent knowledge of data warehousing & BI solutions
Excellent knowledge of change data capture (CDC), ETL, ELT, SCD, etc.
Knowledge of CI/CD pipelines using Git & GitHub Actions
Knowledge of different data modelling techniques such as star schema, dimensional models, and Data Vault
Hands-on experience with the following technologies:
o Developing data pipelines in Azure & Snowflake
o Writing complex SQL queries
o Building ETL/ELT/data pipelines using SCD logic
o Exposure to Kubernetes and Linux containers (e.g., Docker)
o Related/complementary open-source software platforms and languages (e.g., Scala, Python, Java, Linux)
Previous experience with relational (RDBMS) & non-relational databases
Analytical and problem-solving experience applied to big data datasets
Good understanding of access control and data masking
Experience working on projects with agile/scrum methodologies and high-performing team(s)
Exposure to DevOps methodology
Data warehousing principles, architecture, and its implementation in large environments
Very good understanding of integration with Tableau
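The SCD requirement above refers to slowly changing dimensions; a Type 2 merge keeps history by closing the current row and inserting a new version. In a warehouse this is usually a SQL MERGE, but the logic can be sketched in pure Python. Column names and the open-ended sentinel date are assumptions for illustration:

```python
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # open-ended "current row" sentinel (an assumption)

def scd2_merge(dim_rows, incoming, today):
    """Apply one incoming record to a Type 2 dimension (list of dicts).

    Each row carries a business key "cust_id", a tracked attribute
    "address", and a validity window ("valid_from", "valid_to").
    These names are hypothetical.
    """
    current = next((r for r in dim_rows
                    if r["cust_id"] == incoming["cust_id"]
                    and r["valid_to"] == HIGH_DATE), None)
    if current is None:
        # brand-new key: insert an open row
        dim_rows.append({**incoming, "valid_from": today, "valid_to": HIGH_DATE})
    elif current["address"] != incoming["address"]:
        current["valid_to"] = today  # close the old version
        dim_rows.append({**incoming, "valid_from": today, "valid_to": HIGH_DATE})
    return dim_rows

dim = []
scd2_merge(dim, {"cust_id": 1, "address": "12 Elm St"}, date(2024, 1, 1))
scd2_merge(dim, {"cust_id": 1, "address": "99 Oak Ave"}, date(2024, 6, 1))
```

After the second call the dimension holds both versions of the customer: the old address closed on the change date and the new address as the open row.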

Preferred:
Design and build data pipelines (in Spark) to process terabytes of data
Very good understanding of Snowflake integration with data visualization tools such as Tableau
Orchestrate data tasks in Airflow to run on Kubernetes/Hadoop for the ingestion, processing, and cleaning of data
Terraform knowledge and automation
Create real-time analytics pipelines using Kafka / Spark Streaming
Work on proofs of concept for Big Data and Data Science
Understanding of United States healthcare data
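The Kafka / Spark Streaming item above follows a micro-batch pattern: events arrive in small batches and a running aggregate is updated per key on each trigger. Spark itself needs a cluster, so as a stand-in this pure-Python sketch shows only the pattern; the event shape ("claim_type") is hypothetical:

```python
from collections import defaultdict

def process_batch(counts, batch):
    """Fold one micro-batch of events into running per-key counts --
    the same incremental update a streaming job applies per trigger
    interval. The event shape is a hypothetical example."""
    for event in batch:
        counts[event["claim_type"]] += 1
    return counts

running = defaultdict(int)
process_batch(running, [{"claim_type": "pharmacy"}, {"claim_type": "medical"}])
process_batch(running, [{"claim_type": "pharmacy"}])
```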


Required Skills : Snowflake, Azure
Additional Skills : Azure Engineer. This is a high PRIORITY, PROACTIVE requisition.

Employment Type

Full Time

Company Industry

Key Skills

  • Apache Hive
  • S3
  • Hadoop
  • Redshift
  • Spark
  • AWS
  • Apache Pig
  • NoSQL
  • Big Data
  • Data Warehouse
  • Kafka
  • Scala

About Company

Disclaimer: Drjobpro.com is only a platform that connects job seekers and employers. Applicants are advised to conduct their own independent research into the credentials of the prospective employer. We always make certain that our clients do not endorse any request for money payment, and we advise against sharing any personal or bank-related information with any third party. If you suspect fraud or malpractice, please contact us via the Contact Us page.