Salary Not Disclosed
1 Vacancy
Inspired by faith. Driven by innovation. Powered by humankindness. CommonSpirit Health is building a healthier future for all through its integrated health services. As one of the nation's largest nonprofit Catholic healthcare organizations, CommonSpirit Health delivers more than 20 million patient encounters annually through more than 2,300 clinics, care sites, and 137 hospital-based locations, in addition to its home-based services and virtual care offerings. CommonSpirit has more than 157,000 employees, 45,000 nurses, and 25,000 physicians and advanced practice providers across 24 states, and contributes more than $4.2 billion annually in charity care, community benefits, and unreimbursed government programs. Together with our patients, physicians, partners, and communities, we are creating a more just, equitable, and innovative healthcare delivery system.
Job Summary / Purpose
A Clinical Data Analytics Engineer is responsible for bridging the gap between data analysis and reporting and their technical implementation, with the goal of improving the delivery of healthcare. This role is an expert at integrating and preparing large, varied datasets; architecting specialized database and computing environments; and communicating results. In addition, this role applies standard statistical and visualization techniques to extract insights.
This position must be highly skilled in leveraging data tools and platforms to provide robust, flexible data to analysts, program managers, and stakeholders in a rapidly changing clinical business environment, in order to improve clinical processes and outcomes. This position will need to understand both clinical data needs and IT requirements, and act as a liaison between clinical data analysts and IT.
Essential Key Job Responsibilities
Minimum Qualifications
Required Education and Experience
Academic:
Bachelor's degree in computer science, software or computer engineering, statistics, mathematics, or data science, or an interdisciplinary degree that combines those disciplines.
Experience:
Minimum 3 years' experience designing and storing/retrieving data with relational database systems (SQL Server, Oracle, DB2)
Minimum 1 year creating or working with analytic data models, such as dimensional, relational, or data vault
Minimum 2 years' experience writing code in data applications or libraries such as Python, Visual Basic, or other equivalent tools
Minimum 2 years' experience preparing statistical work for presentation before large groups
Experience using Agile project management is a plus
Required Minimum Knowledge, Skills, Abilities and Training
Ability to read and write SQL following strong software development practices and standards.
Foundational knowledge of data modeling, including industry standards such as dimensional and relational data warehouse models.
Understanding of master data management, slowly changing dimensions, flexible data warehousing patterns, and data modeling techniques.
Skilled in the use of one or more relational database systems, such as MS SQL Server, MySQL, or Oracle.
Skilled at implementing data-sourcing strategies, including architecting data stores, developing Extract, Transform, Load (ETL) pipelines, and ensuring data quality.
Ability to generate insights through the rigorous application of standard statistical techniques and software.
Ability to use and learn advanced statistical learning techniques and software to create visualizations that concisely convey correlations, trends, and distributions.
Passion for innovating with new data technologies to drive better and more efficient delivery of insightful data.
Ability to complete projects in a timely manner, with evidence of creative and critical thinking.
Ability to write clear, concise reports and presentations, and to communicate effectively orally.
Strong organizational and documentation skills.
Demonstrated ability to work independently and within a team in a fast-changing environment with shifting priorities and time constraints.
Preferred Skills
Experience with GCP suite tools such as BigQuery, Google Cloud Storage, and Cloud Composer.
Understanding of AI tools such as Gemini or ChatGPT.
Understanding of data modeling such as conceptual logical and physical models.
Knowledge of data warehouse designs.
Knowledge of key software and data engineering concepts such as decoupled pipelines, code reusability, ACID transactions, and DataOps.
Knowledge of version-control tools such as GitHub.