Data Engineer Epic Cerner Databricks Azure TELECOMMUTE

Employer Active

1 Vacancy
This posting is no longer available! The position may have been filled.

Monthly Salary

Salary not disclosed

Number of Vacancies

1 Vacancy

Job Description

Job Number: 2565116

NA% telecommute.

Hours: 9am-5pm EST; additional overtime and weekends on an as-needed basis

Team: Architects and 2 data engineers, along with India team members supporting JMH

Client is seeking a Data Engineer with health care experience to implement and manage cross-domain, modular, flexible, scalable, secure, reliable, and quality data solutions that transform data to support analytics and insight generation for our clients. The Data Engineer will implement, test, deploy, monitor, and maintain the delivery of data in a systematic method and will support a wide variety of analytical needs for our customers. The Director of Data Engineering will also partner with the broader OAS Analytics organization to harness the power of client data to facilitate analytical insight and will be responsible for building quality and efficiency into every project.

Responsibilities:

  • Support the full data engineering lifecycle, including research, proof of concepts, design, development, testing, deployment, and maintenance of data management solutions
  • Utilize knowledge of various data management technologies to drive data engineering projects
  • Lead data acquisition efforts to gather data from various structured or semi-structured source systems of record to hydrate the client data warehouse and power analytics across numerous health care domains
  • Leverage a combination of ETL/ELT methodologies to pull complex relational and dimensional data to support loading DataMarts and reporting aggregates (a minimal sketch follows this list)
  • Eliminate unwarranted complexity and unneeded interdependencies
  • Detect data quality issues, identify root causes, implement fixes, and manage data audits to mitigate data challenges
  • Implement, modify, and maintain data integration efforts that improve data efficiency, reliability, and value
  • Leverage and facilitate the evolution of best practices for data acquisition, transformation, storage, and aggregation that solve current challenges and reduce the risk of future challenges
  • Effectively create data transformations that address business requirements and other constraints
  • Partner with the broader analytics organization to make recommendations for changes to data systems and the architecture of data platforms
  • Support the implementation of a modern data framework that facilitates business intelligence reporting and advanced analytics
  • Prepare high-level design documents and detailed technical design documents with best practices to enable efficient data ingestion, transformation, and data movement
  • Leverage DevOps tools to enable code versioning and code deployment
  • Leverage data pipeline monitoring tools to detect data integrity issues before they result in user-visible outages or data quality issues
  • Leverage processes and diagnostic tools to troubleshoot, maintain, and optimize solutions and respond to customer and production issues
  • Continuously support technical debt reduction, process transformation, and overall optimization
  • Leverage and contribute to the evolution of standards for high-quality documentation of data definitions, transformations, and processes to ensure data transparency, governance, and security
  • Ensure that all solutions meet the business needs and requirements for security, scalability, and reliability
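
To illustrate the ETL/ELT work referenced above, the following is a minimal PySpark sketch of the kind of load that might run on Databricks. All names here are hypothetical and not taken from the posting: the storage path, the source columns (encounter_id, admit_ts, facility_id), and the target table datamart.dm_encounter_daily are illustrative assumptions only.

    # Minimal, hypothetical PySpark ELT sketch (Databricks-style environment assumed).
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    # Ingest a semi-structured extract from cloud storage (illustrative path).
    raw = spark.read.json("abfss://raw@examplestorage.dfs.core.windows.net/encounters/")

    # Basic data quality gate: keep only records that carry the primary key.
    valid = raw.filter(F.col("encounter_id").isNotNull())

    # Transform into a simple reporting aggregate: daily encounter counts per facility.
    daily = (
        valid.withColumn("encounter_date", F.to_date("admit_ts"))
             .groupBy("facility_id", "encounter_date")
             .count()
             .withColumnRenamed("count", "encounter_count")
    )

    # Load into a data mart table; Delta is the default table format on Databricks.
    # Assumes the 'datamart' schema already exists.
    daily.write.format("delta").mode("overwrite").saveAsTable("datamart.dm_encounter_daily")

A production pipeline would typically add incremental (merge/upsert) loads, audit logging, and orchestration (for example via Azure Data Factory), but the read-validate-transform-write shape above is the core pattern these responsibilities describe.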

Ideal Background: Healthcare experience, including an Azure ADF and Databricks background.

Required:

  • Bachelor's degree (preferably in information technology, engineering, math, computer science, analytics engineering, or another related field)
  • Minimum of 5 years of combined experience in data engineering, ingestion, normalization, transformation, aggregation, structuring, and storage
  • Minimum of 5 years of combined experience working with industry-standard relational, dimensional, or non-relational data storage systems
  • Minimum of 5 years of experience in designing ETL/ELT solutions using tools like Informatica, DataStage, SSIS, PL/SQL, T-SQL, etc.
  • Minimum of 5 years of experience in managing data assets using SQL, Python, Scala, VB.NET, or another similar querying/coding language
  • Minimum of 3 years of experience working with healthcare data or data to support healthcare organizations

Preferred:

  • 5 years of experience in creating Source to Target Mappings and ETL design for integration of new/modified data streams into the data warehouse/data marts
  • Minimum of 2 years of experience with Epic Clarity and/or Caboodle data models, or with Cerner Millennium / HealtheIntent, and experience using Cerner CCL
  • 2 years of experience working with Health Catalyst product offerings, including data warehousing solutions, knowledgebase, and analytics solutions
  • Epic certifications in one or more of the following modules: Caboodle, EpicCare, Grand Central, Healthy Planet, HIM, Prelude, Resolute, Tapestry, or Reporting Workbench
  • Experience in Unix, PowerShell, or other batch scripting languages
  • Depth of experience and a proven track record creating and maintaining sophisticated data frameworks for healthcare organizations
  • Experience supporting data pipelines that power analytical content within common reporting and business intelligence platforms (e.g., Power BI, Qlik, Tableau, MicroStrategy, etc.)
  • Experience supporting analytical capabilities, inclusive of reporting, dashboards, extracts, BI tools, analytical web applications, and other similar products
  • Exposure to Azure, AWS, or Google Cloud ecosystems
  • Exposure to Amazon Redshift, Amazon S3, Hadoop, HDFS, Azure Blob, or similar big data storage and management components
  • Desire to continuously learn and seek new options and approaches to business challenges
  • A willingness to leverage best practices, share knowledge, and improve the collective work of the team
  • Experience contributing to cross-functional efforts with proven success in creating healthcare insights
  • Ability to effectively communicate concepts verbally and in writing
  • Experience and credibility interacting with analytics and technology leadership teams
  • Willingness to support limited travel, up to 10%


Required Skills: Cerner, Epic
Additional Skills: EHR Analyst. This is a high PRIORITY requisition. This is a PROACTIVE requisition.

Employment Type

Full Time

Required Skills

  • Apache Hive
  • S3
  • Hadoop
  • Redshift
  • Spark
  • AWS
  • Apache Pig
  • NoSQL
  • Big Data
  • Data Warehouse
  • Kafka
  • Scala
