Works with the MCP Director of Data Asset Management on executing the MCP data roadmap. Develops and deploys data pipelines and data warehouses to support MCP Data & Analytics operations. Uses various open-source programming languages, Google Cloud technologies, machine learning models, and vended software to meet the data needs of MCP products. The position requires maintaining an understanding of Mayo's data policies and enterprise data warehouse, and regularly requires the application of independent judgment. Demonstrated experience in designing, building, and installing data systems, and in how they are applied to support Mayo Clinic Platform products, is required. Builds data pipelines to handle large volumes of clinical data, including EHR data, DICOM data, and genomic data.
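As a rough illustration of the pipeline work described above, the sketch below loads a hypothetical EHR extract from Cloud Storage into BigQuery using the google-cloud-bigquery client. The bucket, project, dataset, and table names are placeholders, not actual MCP resources.

```python
# Minimal batch-load sketch: EHR extract (CSV) from Cloud Storage into
# a BigQuery staging table. All resource names below are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,  # infer the schema from the extract
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

# Hypothetical source file and destination table.
uri = "gs://example-ehr-bucket/extracts/encounters.csv"
table_id = "example-project.clinical_staging.encounters"

load_job = client.load_table_from_uri(uri, table_id, job_config=job_config)
load_job.result()  # block until the load job completes

print(f"Loaded {client.get_table(table_id).num_rows} rows into {table_id}")
```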
A Bachelor's degree in a relevant field such as engineering, mathematics, computer science, information technology, health science, or another analytical/quantitative field, and a minimum of five years of professional or research experience in data visualization, data engineering, or analytical modeling techniques; OR an Associate's degree in a relevant field such as engineering, mathematics, computer science, information technology, health science, or another analytical/quantitative field, and a minimum of seven years of professional or research experience in data visualization, data engineering, or analytical modeling techniques. In-depth business or practice knowledge will also be considered.
The incumbent must have the ability to manage a varied workload of projects with multiple priorities and stay current on healthcare trends and enterprise changes. Interpersonal skills, time management skills, and demonstrated experience working on cross-functional teams are required, as are strong analytical skills, the ability to identify and recommend solutions, and a commitment to customer service. The position requires excellent verbal and written communication skills, attention to detail, and a high capacity for learning and problem resolution.

Advanced experience in SQL is required. Strong experience in programming languages such as Python, JavaScript, PHP, C, or Java, and with API integration, is required. Experience with data warehouse technologies such as BigQuery, DB2, and other RDBMSs is required. Experience with hybrid data processing methods (batch and streaming) such as Apache Spark, Google Cloud Dataflow, and Google Cloud Data Fusion is required. Experience with big data, statistics, and machine learning is required. The ability to navigate Linux and Windows operating systems is required. Knowledge of workflow scheduling (Apache Airflow, Google Cloud Composer), infrastructure as code (Terraform, Kubernetes, Docker), and CI/CD (Jenkins, GitHub Actions) is preferred; a minimal scheduling sketch follows below. Experience with agile methodologies is preferred. Working knowledge of Tableau, Power BI, SAS, and SSIS is preferred.
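As a small illustration of the workflow-scheduling knowledge listed as preferred, the sketch below defines a minimal Apache Airflow DAG that chains a daily extract step to a BigQuery load. The DAG ID and task bodies are hypothetical placeholders, not an actual MCP workflow.

```python
# Minimal Airflow sketch: a daily DAG chaining an extract step to a load
# step. Task logic and names are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_ehr():
    # Placeholder: pull the day's EHR extract to Cloud Storage.
    pass


def load_to_bigquery():
    # Placeholder: load the extract into a staging table
    # (see the BigQuery load sketch earlier in this posting).
    pass


with DAG(
    dag_id="clinical_batch_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; use schedule_interval on older versions
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_ehr", python_callable=extract_ehr)
    load = PythonOperator(task_id="load_to_bigquery", python_callable=load_to_bigquery)

    extract >> load  # run the load only after the extract succeeds
```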
Required Experience:
Senior IC
Full-Time