Senior Data Engineer

Hinge Health


Job Location:

San Francisco, CA - USA

Salary: $164K - $247K
Posted on: 20 hours ago
Vacancies: 1 Vacancy

Job Summary

About the Role

As a Senior Data Engineer, you will own the data infrastructure that powers real-time experiences for our members. You will build and scale pipelines that move data from dozens of upstream services, across Kafka event streams and transactional databases, into a unified data platform that serves both real-time APIs and analytical workloads. Your work will directly enable AI-powered coaching assistants and physical therapy tools that use live member data, including engagement logs and clinical data, to generate personalized recommendations. You will work at the intersection of data engineering and AI, building the reliable, low-latency data foundation that these systems depend on.

You will work in a modern stack: Python, Flink, and PySpark for pipeline development; Kafka for event streaming; Delta Lake for scalable storage; and Aurora PostgreSQL for operational data.

This is a high-ownership role. You will work closely with application engineers, data scientists, and AI teams across the organization, defining how data flows from the moment it is created to the moment it is consumed. You will also help establish the standards and practices that enable product teams to take ownership of their own data in a HIPAA-compliant environment. If you are excited about building the data infrastructure behind AI systems that have a direct impact on people's health, this role is for you.

Our tech stack: Python, SQL, dbt, Airflow, PostgreSQL, MySQL, REST, Aptible, Docker, Terraform, Spark, Kafka, Flink, Fivetran, Databricks, and AWS (S3, Lambda, Kinesis, RDS, Glue).

WHAT YOU'LL ACCOMPLISH

  • Design and build the data foundation for AI-powered health experiences and decision sciences - Build and own data pipelines across both streaming and batch paradigms, from ingestion to serving. You will build ingestion layers that consume from Kafka event streams and transactional databases, write transformations using Flink, Databricks, and dbt, and make deliberate decisions about where transformations belong and why. You will own pipelines across the full stack: raw ingestion, normalized staging, aggregated analytical models, and the serving layer that downstream consumers, including real-time APIs, BI tools, and AI systems, depend on. You will define data contracts, own schema evolution, and make sure the data you deliver is well-documented and reliable enough for others to build on.
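By way of illustration, the data-contract ownership this bullet describes might start as simple schema validation on events consumed from a topic before they enter the raw layer. This is a minimal sketch, not Hinge Health's actual implementation; the field names and contract are invented for the example.

```python
# Hypothetical sketch of a data contract check for incoming events.
# The required fields below are illustrative, not a real Hinge Health schema.

REQUIRED_FIELDS = {
    "event_id": str,
    "member_id": str,
    "event_type": str,
    "occurred_at": str,  # ISO-8601 timestamp string
}

def validate_event(event: dict) -> list[str]:
    """Return a list of contract violations; an empty list means the event conforms."""
    errors = []
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in event:
            errors.append(f"missing field: {field}")
        elif not isinstance(event[field], expected_type):
            errors.append(f"wrong type for {field}: {type(event[field]).__name__}")
    return errors
```

In practice a team in this stack would likely enforce contracts with a schema registry or dbt tests rather than hand-rolled checks; the sketch only shows the idea of rejecting non-conforming events at the ingestion boundary.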

  • Keep the platform reliable and the data trustworthy - Define and track SLAs (Service Level Agreements) and SLOs (Service Level Objectives) for the pipelines you own, and hold yourself accountable to them. Partner with the SRE (Site Reliability Engineering) team on monitoring, alerting, lineage, and observability best practices. When pipelines fail or data quality degrades, you will lead the response, communicate proactively with stakeholders about impact and timelines, and drive the systemic fixes, not just the immediate patch.
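A freshness SLO of the kind this bullet mentions can be reduced to a small check that compares the last successful load against a threshold. The 30-minute SLO below is an assumed value for illustration, not one stated in the posting.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical freshness SLO check; the 30-minute threshold is illustrative.
FRESHNESS_SLO = timedelta(minutes=30)

def freshness_breached(last_loaded_at: datetime, now=None) -> bool:
    """True if the time since the last successful load exceeds the SLO."""
    now = now or datetime.now(timezone.utc)
    return (now - last_loaded_at) > FRESHNESS_SLO
```

A check like this would typically run from the orchestrator (e.g., an Airflow sensor or SLA callback) and page the on-call engineer when it trips.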

  • Deliver with ownership and grit - Take projects from ambiguous requirements to production, working through technical blockers, cross-functional dependencies, and competing priorities without losing momentum. Keep stakeholders informed of progress, blockers, and expected timelines throughout, not just when things go wrong. Build a track record of delivering high-quality data assets on time that teams can trust and depend on.

  • Make it easy for teams to own their data - Build tooling for service and application teams that helps them effectively manage their data and data processes. Coach teams on compliance strategies, performance tuning, event-driven design for data consumption, and schema evolution so they can take ownership of their data without creating a bottleneck on the data team. Translate business requirements into durable, scalable technical solutions, and communicate tradeoffs clearly to both technical and non-technical stakeholders.

  • Build with compliance and trust at the center - Implement data handling practices that meet HIPAA, GDPR, and CCPA requirements, including PII (Personally Identifiable Information) handling, access controls, data retention, and making production data safely available in non-production environments.
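One common technique for making production data safely available in non-production environments, as this bullet describes, is deterministic pseudonymization of identifier fields. The sketch below is an assumption about how that might look, not a documented Hinge Health practice; the key and field names are invented, and a real deployment would source the key from a secrets manager and follow the organization's HIPAA policies.

```python
import hashlib
import hmac

# Hypothetical pseudonymization pass for PII fields before data leaves production.
# SECRET_KEY and PII_FIELDS are illustrative placeholders.
SECRET_KEY = b"replace-with-managed-secret"
PII_FIELDS = {"member_id", "email"}

def pseudonymize(record: dict) -> dict:
    """Return a copy of the record with PII fields replaced by keyed HMAC digests."""
    out = dict(record)
    for field in PII_FIELDS & record.keys():
        digest = hmac.new(SECRET_KEY, str(record[field]).encode(), hashlib.sha256)
        out[field] = digest.hexdigest()[:16]
    return out
```

Because the mapping is keyed and deterministic, joins across tables still work in non-production copies, while the original identifiers cannot be recovered without the key.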

  • Raise the bar for the team around you - Participate in hiring and mentor junior engineers. Review designs, give feedback on code, contribute to standards and best practices, and help teammates work through complex technical problems. Be a continuous learner who brings new ideas and uplifts the people around you.

BASIC QUALIFICATIONS

  • Bachelor's degree in Computer Science or a related technical field

  • 5 years of data engineering experience with a proven track record of building and owning reliable production pipelines

  • 5 years of strong proficiency in Python and SQL

  • 3 years of experience processing and storing large-scale data using distributed systems, as well as mastery of database design and data modeling, including star and snowflake schemas

  • 2 years of experience working with a broad spectrum of data stores such as PostgreSQL, MySQL, MongoDB, Redis, Databricks, and Redshift

  • 3 years of experience deploying and operating pipelines in the cloud, including CI/CD, monitoring, and incident response

  • 3 years of experience building streaming and batch pipelines using tools such as Spark, Kafka, Flink, and Airflow

  • 1 year of experience working with dbt

  • 1 year of experience with AI tools for code generation (Cursor, Claude Code)

  • 1 year of experience with orchestration tools (Airflow, Prefect, or Dagster), including scheduling, retries, alerting, and SLAs

Preferred Qualifications

  • Track record of improving data reliability: handling schema drift, late data, backfills, and operational incidents

  • Experience with big data technologies such as Hadoop, Hive, Spark, and EMR

  • Proven success in communicating with users, other technical teams, and senior management to collect requirements and explain data modeling decisions and data engineering strategy

  • Understanding of MLOps/LLMOps principles to ensure the scalable and reliable deployment of text processing and embedding pipelines

  • Background in analytics engineering (dbt, SQLMesh, Dataform) or a strong understanding of data consumer needs

About Hinge Health

At Hinge Health, we're using technology to scale and automate the delivery of healthcare, starting with musculoskeletal (MSK) conditions, which affect over 1.7 billion people worldwide. With an AI-powered, human-centered care model, Hinge Health leverages cutting-edge technology to improve outcomes, experiences, and costs, and to help people move beyond their pain. The platform addresses a broad spectrum of MSK care, from acute injury to chronic pain to post-surgical rehabilitation, through personalized, evidence-based care.

As the preferred partner to 50 health plans, PBMs, and other ecosystem partners, Hinge Health is available to over 20 million people across more than 2,550 employers. The company is headquartered in San Francisco with additional offices in Montreal and Bangalore. Learn more at

What You'll Love About Us

  • Inclusive healthcare and benefits: On top of comprehensive medical, dental, and vision coverage, we offer employees and their family members help with gender-affirming care, tools for family and fertility planning, and travel reimbursements if healthcare isn't available where you live.

  • Planning for the future: Start saving for the future with our traditional or Roth 401(k) retirement plan options, which include a 2% company match.

  • Modern life stipends: Manage your own learning and development

  • Grow with us: Purchase discounted company stock through our ESPP with easy payroll deductions.

Culture & Engagement

Hinge Health is an equal opportunity employer and prohibits discrimination and harassment of any kind. We make employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, veteran status, disability status, pregnancy, or any other basis protected by federal, state, or local law. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. We provide reasonable accommodations for candidates with disabilities. If you feel you need assistance or an accommodation due to a disability, let us know by reaching out to your recruiter.

By submitting your application, you acknowledge that we will use your personal data as outlined in our personnel and candidate privacy policy.


Beware of Phishing Attempts: We've noticed an increase in phishing, where fraudsters impersonate employees and send fake job offers to steal sensitive information. We'll never ask for financial details during the hiring process, and we only use @ emails. If you receive a suspicious offer, stop communication and report it to the US FBI Internet Crime Complaint Center. To verify an email from our recruiting team, forward it to .


Required Experience:

Senior IC
