We are hiring a Data Lead / Data Engineer to lead data integration pipeline development and migration activities for a large-scale public health platform modernization. The role involves building and managing data flows, ensuring data quality, supporting system interoperability, and enabling real-time analytics across a complex ecosystem.
Responsibilities
Design and implement ETL/ELT pipelines for ingesting and transforming structured health data.
Lead migration from legacy Oracle databases to PostgreSQL or Oracle.
Build and maintain workflows using Databricks, Elasticsearch, and Kibana.
Manage integrations with external systems via REST APIs, SFTP, and event streaming (Kafka).
Implement data deduplication, cleansing, and validation processes.
Support rules-based automation through integration with a rule engine.
Ensure compliance with data retention, archival, and deletion requirements.
Collaborate with cross-functional Agile teams, including engineering, QA, and analysts.
Requirements
Strong experience with Databricks, Elasticsearch, Kibana, Oracle, and PostgreSQL.
Proficiency in REST API integration, Kafka, and SFTP-based file processing.
Hands-on experience with data deduplication, data quality validation, and large-scale migrations.
Familiarity with healthcare data formats such as HL7 and FHIR, and with person matching via master person indexes.
Understanding of rules engine integration for error correction and data transformation.
Experience working in Agile environments with large-scale systems.
Full-time