Job Title: Data Science & Research Engineer, ML Ops & Research Integration (Contract)
Location (On-site, Remote, or Hybrid): Fairfax, VA (Remote)
Contract Duration: Through 12/31/2026
Job Summary:
Senior BI/analytics engineering role supporting predictive model delivery and research integrations. The role blends a Senior BI Developer profile with modern tooling: building governed data products on Databricks, developing DBT transformation layers, and delivering Power BI semantic models/dashboards, alongside reusable integration components for ML Ops workflows. Work is focused on development-stage build and the initial validation required for a controlled go-live.
Key Responsibilities:
- Develop and enhance data/analytics products that operationalize model inputs and outputs using DBT and Databricks (notebooks/jobs, Delta patterns).
- Build Power BI datasets/semantic models and dashboards tied to approved deliverables; implement refresh, security, and performance best practices.
- Deliver predictive model and integration enhancements (Python/SQL) within the defined build scope; support initial validation/testing evidence.
- Align models for ingestion into Signal1 when applicable (outputs, metadata, versioning) and coordinate ingestion requirements with Signal1.
- Build monitoring enablement as part of defined enhancements (monitor definitions, thresholds, alerting hooks, baseline calculations) and complete initial validation/sign-off.
- Build or enhance ML Ops workflows (deployment automation, reproducible pipelines, connector patterns) and provide the initial validation/testing evidence required for release readiness.
- Develop APIs and connectors that integrate research platforms and datasets as reusable assets.
- Partner with Data Science and platform owners to ensure integration designs meet governance, security, and release readiness requirements.
- Produce technical documentation (interface definitions, pipeline specs, metric definitions, deployment notes) supporting traceability and controlled release.
Required Qualifications:
- 5+ years of experience delivering BI/analytics solutions and/or data engineering pipelines in production.
- Strong Python and SQL skills, including experience building reliable, testable data transformations, working with APIs and data services, and validating data outputs against acceptance criteria.
- Hands-on experience with Databricks (notebooks/jobs) and Lakehouse concepts (e.g., Delta).
- Hands-on experience with DBT (models, tests, documentation) and governed transformation patterns.
- Hands-on experience with Power BI (semantic models/datasets, dashboard development, refresh/security patterns).
- Experience with version control, code review, and CI/CD-aligned development practices.
Preferred Qualifications:
- Experience with the ML lifecycle or analytics governance (feature store concepts, model monitoring inputs, KPI catalogs).
- Experience integrating with managed ML platforms (e.g., DataRobot) or similar model deployment environments.
- Experience designing APIs/integration patterns in enterprise environments.
- Familiarity with regulated or governed environments requiring documentation, change control, and controlled release processes.
Deliverables / Success Measures:
- Databricks/DBT data products delivered and aligned to scope.
- Power BI semantic models/dashboards delivered with documentation and security patterns.
- Reusable APIs/connectors for research integrations.
- Initial validation/testing evidence supporting go-live.
- Reduced cycle time for model and research integrations.