Data/AI/ML Integration Developer
Location: New York, NY
Role Summary:
Create secure, scalable integration pipelines and APIs connecting clinical systems, Snowflake data stores, and AI model endpoints.
Responsibilities:
- Develop APIs and microservices in Python (FastAPI)
- Build ETL/ELT pipelines with Airflow/dbt that move data from Epic EHR into Snowflake or similar platforms
- Integrate with real-time streaming tools (Kafka, GCP Pub/Sub) and FHIR APIs
- Ensure encryption in transit, data lineage, and access-control compliance (OAuth2, RBAC)
- Build hybrid search and RAG workflows across clinical corpora
- Manage vector database operations (e.g., Pinecone, Weaviate) for GenAI search augmentation
- Integrate with orchestration layers for multi-agent systems using LangChain and CrewAI
Required Qualifications:
- 4 years of experience in backend/data integration roles
- Hands-on experience with FHIR/HL7 APIs, dbt, Snowflake, and containerized services
Preferred Qualifications:
- Healthcare interoperability experience
- Experience integrating LLM endpoints (OpenAI, Vertex AI) and model-serving APIs