Sr Data Warehouse Engineer

1 Vacancy

Note: This posting is outdated and the position may have been filled.

Job Location

Denver, CO - USA

Monthly Salary

Not Disclosed

Vacancy

1 Vacancy

Job Description

Position Overview

We are looking for a highly skilled data warehouse engineer with exceptional ETL skills to play a crucial role in designing and implementing cutting-edge data pipeline infrastructures and models, while providing essential support for our data warehouse. Success in this role will be fueled by expertise in data pipelines and data warehousing, along with excellent communication skills. In addition to your core responsibilities, you may take on tasks such as designing and implementing data models, developing and testing data integration processes, and collaborating with fellow data professionals to guarantee data quality.

 

Responsibilities:

  • Develop and maintain data pipelines: API-based or file-based data flows between source systems and the data warehouse
  • Use innovative tools and techniques to automate common data preparation and integration tasks, with the goal of reducing defects and ensuring data quality
  • Implement best practices to ensure the integrity of data with exception-handling routines
  • Provide source-to-target mapping development, support, and maintenance
  • Lead troubleshooting efforts and form interdisciplinary task forces for ETL issues
  • Design, develop, and deploy data structures and data transformations in the enterprise data warehouse using Python, SSIS, and ADF
  • Maintain and extend the Epic Caboodle platform and develop custom Caboodle data model components
  • Build relationships and coordinate with business stakeholders to identify data needs, clarify requirements, and implement solutions
  • Contribute to the department's short-term and long-term strategic plans
  • Make appropriate recommendations on the management of data extraction and analysis
  • Maintain knowledge of current regulations and technologies related to data management
  • Assist with data governance initiatives in the areas of data quality, data security, metadata, and master data management
  • Actively contribute to all aspects of the data project lifecycle, including request intake and acknowledgment, project estimation, time tracking, and task prioritization
  • Be an exemplary team player with excellent collaboration skills
  • Exhibit outstanding customer service skills with stakeholders
  • Perform other duties as required or assigned
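To illustrate the kind of work the "integrity of data with exception-handling routines" responsibility implies, here is a minimal Python sketch. All names (`transform`, `run_pipeline`, the field names) are hypothetical, not taken from the employer's systems: bad source records are quarantined rather than failing the whole batch.

```python
def transform(row: dict) -> dict:
    """Normalize a source row; raises KeyError/ValueError on bad data."""
    return {
        "patient_id": int(row["patient_id"]),   # fails fast on non-numeric IDs
        "visit_date": row["visit_date"].strip(),
    }

def run_pipeline(rows):
    """Apply transform() to each row, routing failures to a quarantine list."""
    loaded, quarantined = [], []
    for row in rows:
        try:
            loaded.append(transform(row))
        except (KeyError, ValueError) as exc:
            # Keep the bad record and its error for later review instead of
            # aborting the batch load
            quarantined.append({"row": row, "error": repr(exc)})
    return loaded, quarantined

loaded, quarantined = run_pipeline([
    {"patient_id": "101", "visit_date": " 2024-01-05 "},
    {"patient_id": "oops", "visit_date": "2024-01-06"},
])
```

In practice the quarantine list would land in a dedicated error table or file so data-quality defects can be triaged without blocking downstream loads.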

Qualifications:

Experience

  • 7 years of experience as a Data Engineer
  • In-depth knowledge of SQL, data warehouses, and data transformation techniques
  • Proven experience designing and building data pipelines
  • Expert knowledge of metadata management and related tools
  • Advanced knowledge of ETL concepts, processes, and tools such as MS SSIS and ADF
  • Advanced knowledge of Python
  • Ability to read and understand various data structures
  • Ability to work independently and as part of a team
  • Strong analytical, technical, and troubleshooting skills
  • Ability to assess requirements from multiple sources and their impact on potential solutions
  • Ability to work in a complex environment
  • Strong organizational skills, with proficiency in tracking tasks, defining next steps, and following project plans
  • Advanced knowledge of database and data warehousing concepts, including data lakes, relational and dimensional database design, and data modeling practices
  • Intermediate knowledge of Jupyter Notebooks
  • Familiarity with Agile project management methods such as Scrum, Lean, and/or Kanban
  • Advanced knowledge of healthcare data structures, workflows, and concepts from Electronic Health Record systems such as Epic
  • Knowledge of the Azure cloud platform, the Fabric data platform, ADF, and DevOps is highly preferred
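As a small illustration of the dimensional database design practices listed above, here is a hypothetical type-1 dimension upsert (overwrite attributes in place, keeping one row per business key) using Python's standard-library `sqlite3`. The `dim_provider` table and its columns are invented for this sketch and are not from the employer's warehouse.

```python
import sqlite3

# In-memory database standing in for a warehouse dimension table
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE dim_provider (provider_id INTEGER PRIMARY KEY, name TEXT)"
)

def upsert_provider(conn, provider_id, name):
    """Type-1 slowly changing dimension: overwrite the attribute on conflict."""
    conn.execute(
        "INSERT INTO dim_provider (provider_id, name) VALUES (?, ?) "
        "ON CONFLICT(provider_id) DO UPDATE SET name = excluded.name",
        (provider_id, name),
    )

upsert_provider(conn, 1, "Dr. Smith")
upsert_provider(conn, 1, "Dr. Smith-Jones")  # name change overwrites history

name = conn.execute(
    "SELECT name FROM dim_provider WHERE provider_id = 1"
).fetchone()[0]
```

A type-2 variant would instead close out the old row and insert a new one with effective dates, preserving history at the cost of extra rows.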

 

Education

  • Bachelor's degree in a technical, scientific, and/or healthcare discipline, or equivalent work experience


Additional Information:

Licensure/Certifications

  • Epic Cogito Clarity and Caboodle certifications are required within 120 days of hire
  • All certifications must be maintained throughout employment

 

Hours: Must be able to accommodate Pacific time zone hours
Location: Remote

Remote Work: Yes

Employment Type: Full-time
