Data Engineer IV

Job Location

Cranberry Township, PA - USA

Monthly Salary

Not Disclosed

Vacancy

1 Vacancy

Job Description

Engineer IV, Software, Big Data

Omnicell is the world leader in pharmacy robotics, and we're expanding beyond inventory management into inventory analytics. The Omnisphere helps hospitals and health systems understand how meds flow through their business, from the loading dock to the nurse's glove, and then apply clinical expertise and advanced machine learning to uncover opportunities to adjust that flow to improve safety, cost efficiency, and patient outcomes. The next step for us is to help busy clinicians act on those opportunities by building efficient, industry-leading workflows.

To do that, we take terabytes of data from thousands of devices and translate it into simple, actionable steps our clients can take to improve their overall performance. This is achieved through a sleek new microservices architecture, primarily composed of Kafka, Spark, PostgreSQL, .NET Core, and Angular, all running in AWS.
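As a minimal sketch of what that kind of pipeline can look like in Scala, the snippet below reads device events from Kafka with Spark Structured Streaming and lands them in a Delta table on S3; the broker address, topic name, event schema, and S3 paths are illustrative assumptions rather than details of the actual platform.

  import org.apache.spark.sql.SparkSession
  import org.apache.spark.sql.functions.{col, from_json}
  import org.apache.spark.sql.types.{StringType, StructType, TimestampType}

  object DeviceEventIngest {
    def main(args: Array[String]): Unit = {
      val spark = SparkSession.builder()
        .appName("device-event-ingest")
        .getOrCreate()

      // Assumed shape of a device event; the real payload is not described here.
      val eventSchema = new StructType()
        .add("device_id", StringType)
        .add("event_type", StringType)
        .add("event_time", TimestampType)

      // Read raw events from a hypothetical Kafka topic as a streaming DataFrame.
      val events = spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "broker-1:9092")
        .option("subscribe", "device-events")
        .load()
        .select(from_json(col("value").cast("string"), eventSchema).as("event"))
        .select("event.*")

      // Land the parsed events in a Delta table on S3 for downstream analytics.
      events.writeStream
        .format("delta")
        .option("checkpointLocation", "s3://example-bucket/checkpoints/device-events")
        .start("s3://example-bucket/delta/device_events")
        .awaitTermination()
    }
  }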

Responsibilities:

  • Translate business requirements into effective technology solutions
  • Help lead the design, architecture, and development of the Omnicell Data Platform
  • Conduct design and code reviews
  • Resolve defects/bugs during QA testing, pre-production, production, and post-release patches
  • Analyze and improve the efficiency, scalability, and stability of various system resources once deployed
  • Provide technical leadership to agile teams onshore and offshore: mentor junior engineers and new team members, and apply technical expertise to challenging programming and design problems
  • Help define the technology roadmap that will support the product development roadmap
  • Continue to improve code quality by tracking, reducing, and avoiding technical debt
  • Analyze the performance and cost of data processing
  • Integrate new technologies and new Databricks/Spark features into the data platform
  • Focus on always putting the customer first

Required Knowledge and Skills

  • Deep development experience with distributed/scalable systems and high-volume transaction applications, including participation in architecting big data projects
  • Hands-on programming experience in Scala, Python, and other object-oriented programming languages
  • Expert in using Big Data technologies such as Apache Kafka, Apache Spark, real-time streaming, Structured Streaming, and Delta Lake
  • Excellent analytical and problem-solving skills.
  • Energetic, motivated self-starter who is eager to excel, with excellent interpersonal skills
  • Expert at balancing driving the right architecture with the realities of having customers and the need to ship software

Basic Requirements:

  • Education:
    • Bachelor's degree preferred; may consider relevant experience in lieu of a degree
    • 8 years of experience in software engineering with a degree; 12 years of experience in software engineering in lieu of a degree
  • Experience developing ETL processing flows with MapReduce technologies like Spark and Hadoop
  • Experience developing with ingestion and clustering frameworks such as Kafka, ZooKeeper, and YARN

Preferred Knowledge and Skills:

  • Master's degree
  • Hands-on working experience with cloud infrastructure such as AWS. Able to scale, code, and deploy applications in the public cloud using technologies like AWS Lambda, Docker, and Kubernetes.
  • Experience with Big Data ML toolkits such as Mahout, Spark ML, or H2O
  • Experience working with healthcare-specific data exchange formats, including HL7 and FHIR.
  • Experience building stream-processing systems using solutions such as Storm or Spark Streaming
  • Experience with various messaging systems such as Kafka or RabbitMQ
  • Working knowledge of Databricks, Team Foundation Server, TeamCity, Octopus Deploy, and Datadog

Work Conditions:

  • Team collaboration hours are between 8am and 4pm EST.
  • Remote environment.
  • Ability to travel 10% of the time.

Additional Requirements

  • Legally authorized to work in the United States without Omnicell sponsorship now or in the future
  • Ability to pass background and employment verification checks


Employment Type

Full-Time

Department / Functional Area

Engineering
