DataOps Engineer

CoreWeave Europe


Job Location:

London - UK

Monthly Salary: Not Disclosed
Posted on: 2 days ago
Vacancies: 1 Vacancy

Job Summary

CoreWeave is The Essential Cloud for AI. Built for pioneers, by pioneers, CoreWeave delivers a platform of technology, tools, and teams that enables innovators to build and scale AI with confidence. Trusted by leading AI labs, startups, and global enterprises, CoreWeave combines superior infrastructure performance with deep technical expertise to accelerate breakthroughs and turn compute into capability. Founded in 2017, CoreWeave became a publicly traded company (Nasdaq: CRWV) in March 2025. Learn more at.
We're proud to be a Living Wage accredited Employer.

What You'll Do:

The Monolith AI Platform Engineering Team at CoreWeave is responsible for building and scaling the data and workflow backbone that powers the world's most advanced engineering simulation and AI workflows. Our ambition is to become the superintelligent AI test lab for the engineering industry, helping customers ship science faster. From high-throughput data ingestion and feature pipelines to model training and real-time inference, our platform delivers the performant, reliable, and trustworthy data foundation trusted by the world's largest engineering companies.

The Senior DataOps Engineer II will own and drive all things data observability and operations across our client estate, building the practices, tooling, and culture that make Monolith's data flows debuggable, auditable, and safe to evolve. You'll sit at the intersection of platform engineering, data engineering, and reliability, implementing end-to-end lineage and DataOps practices while mentoring data producers and consumers on how to manage data as a first-class product.

You'll partner closely with Monolith's Product Engineering and forward-deployed teams, as well as with CoreWeave's infrastructure and AI platform groups, to turn fragmented real-world engineering data into well-governed, observable, and operationally robust pipelines powering our SaaS platform and client-specific deployments.

About the Role:

We're seeking a Senior DataOps Engineer II who can act as the hands-on owner for Monolith's data observability and operational surface: from batch and streaming pipelines running on our platform through to the lineage, quality, and runbooks that keep customer environments healthy.

You'll define and roll out DataOps practices (CI/CD, infra-as-code, data SLOs, incident response) across the Monolith estate, implement end-to-end data lineage and observability, and serve as the go-to mentor for engineering teams and client-facing colleagues on best-practice data management.

In this role you will:

  • Own Monolith's Data Observability & Operations Surface
    • Design and implement the end-to-end observability stack for data workloads (metrics, logs, traces, and data-quality signals) across batch and streaming pipelines.
    • Define and maintain operational SLOs/SLAs for critical data flows powering training, inference, and analytics, and ensure they are measurable and actionable.
    • Build dashboards, alerts, and runbooks that allow engineers and on-call responders to quickly detect, triage, and remediate data incidents.
    • Standardise golden paths for how teams instrument pipelines, expose health signals, and respond to data-related failures.
  • Implement Data Lineage, Quality & Governance
    • Deploy and maintain end-to-end data lineage for key domains, from client sources through transformations to features, models, and downstream analytics, so teams can debug, audit, and reason about change.
    • Define and roll out data quality checks (schema, freshness, completeness, distribution drift) and ensure failures integrate cleanly into alerting and incident workflows.
    • Partner with Security, Compliance, and customer-facing teams to encode data governance requirements (e.g. retention, residency, access controls) into our pipelines and tooling.
    • Help shape metadata models and catalog conventions so that producers and consumers can reliably discover, understand, and use shared datasets.
  • Enable DataOps Practices Across Teams
    • Establish CI/CD patterns for data pipelines and related infrastructure, including testing strategies, promotion workflows, and change-management guardrails.
    • Drive adoption of infra-as-code for data infrastructure (e.g. pipeline orchestration, storage, observability components), reducing manual drift across environments.
    • Define and continuously improve DataOps processes (incident response, post-incident review, change review, on-call rotations) with a focus on learning rather than blame.
    • Evaluate and integrate best-of-breed DataOps and observability tooling where it accelerates our teams, balancing build vs. buy pragmatically.
  • Partner Across Monolith, CoreWeave & Clients
    • Work with Monolith platform, data, agent, and reliability teams to expose observability and lineage as shared services and patterns other engineers can build on.
    • Collaborate with CoreWeave infrastructure and AI platform teams to leverage underlying storage, compute, networking, and observability in service of robust data flows.
    • Serve as a technical escalation point for forward-deployed and customer-facing engineers when data issues cross service boundaries or require deeper architectural insight.
    • Mentor data producers (product teams, integrations, forward-deployed engineers) and data consumers (data scientists, analysts, client engineers) on resilient schemas, contracts, and operational practices.

Who You Are:

  • Experience & Level
    • Typically 5–6 years of experience in DataOps, Data Engineering, DevOps/SRE for data platforms, or similar roles, including end-to-end ownership of production data pipelines and their operations.
    • Proven track record of operating at Senior IC scope: leading cross-team initiatives, introducing new practices/tooling, and improving reliability at the platform level.
  • DataOps, Pipelines & Tooling
    • Strong hands-on experience designing, deploying, and operating data pipelines in production (batch and/or streaming), including failure modes, retries, and backfills.
    • Practical experience with data orchestration and ETL/ELT tooling (e.g. Airflow, Dagster, dbt, Temporal, or similar) and comfort evaluating and integrating new tools where appropriate.
    • Solid SQL and/or Spark skills and experience with at least one major analytical database or warehouse; familiarity with time-series / telemetry data is a plus.
  • Observability, Lineage & Data Quality
    • Extensive experience implementing data observability (metrics, logging, tracing, dashboards, and alerting) for data-centric workloads.
    • Hands-on work with data quality frameworks and/or observability platforms to monitor freshness, completeness, schema changes, and anomalies.
    • Experience deploying and using data lineage or metadata/catalog solutions, and applying them to debugging, compliance, and change-impact analysis.
  • Platform, Infrastructure & Automation
    • Comfortable working in containerised, cloud-native environments (Kubernetes plus at least one major cloud provider); experience with GPU or compute-intensive workloads is a bonus.
    • Strong automation mindset: infra-as-code, CI/CD, and configuration management for data infrastructure and observability components.
    • Proficient in Python for building tooling, pipeline glue, and platform integrations; additional languages are a plus.
  • Collaboration, Mentorship & Communication
    • Clear communicator who can explain complex data flows and failure modes to both deeply technical and non-specialist audiences.
    • Experience mentoring engineers and data practitioners on better data management, observability, and operational hygiene through documentation, examples, reviews, and office hours.
    • Comfortable working in a fast-moving, high-ambiguity environment where we balance rapid iteration with the safety and reliability demanded by enterprise engineering clients.

Preferred:

  • Experience in ML/AI platforms or MLOps environments where data pipelines power experimentation, training, and inference at scale.
  • Background with test, simulation, or time-series data (e.g. physical test benches, battery labs, automotive/aerospace R&D).
  • Familiarity with feature stores, experiment tracking, or model registries, and their interaction with upstream data pipelines.
  • Prior work in multi-tenant SaaS platforms, especially those with strong compliance, observability, and uptime requirements.
  • Experience supporting or partnering closely with forward-deployed / professional services teams in complex customer environments.

Wondering if you're a good fit?

We believe in investing in our people and value candidates who bring diverse experiences, even if you don't tick every single box. Here are a few qualities we've found compatible with our team. If some of this sounds like you, we'd love to talk:

  • Data-obsessed operator: You care deeply about making data systems observable, predictable, and easy to reason about, not just working most of the time.
  • Systems thinker: You enjoy mapping complex data flows across services, understanding failure modes, and designing for graceful degradation and rapid recovery.
  • Pragmatic: You know when to build the ideal abstraction and when to ship the smallest change that meaningfully reduces risk or toil.
  • Collaborative mentor: You get energy from helping other teams level up their data practices, and you can influence without heavy process or authority.
  • Owner's mindset: You feel responsible for the outcomes of the platform as a whole, not just the code you write, and you follow issues across boundaries until they're truly resolved.

Why CoreWeave

At CoreWeave, we work hard, have fun, and move fast! We're in an exciting stage of hypergrowth that you will not want to miss out on. We're not afraid of a little chaos, and we're constantly learning. Our team cares deeply about how we build our product and how we work together, which is represented through our core values:

  • Be Curious at Your Core
  • Act Like an Owner
  • Empower Employees
  • Deliver Best-in-Class Client Experiences
  • Achieve More Together

We support and encourage an entrepreneurial outlook and independent thinking, and foster an environment that encourages collaboration and innovative solutions to complex problems. As we get set for takeoff, the organization's growth opportunities are constantly expanding. You will be surrounded by some of the best talent in the industry, who will want to learn from you too. Come join us!

To fulfil our obligation to protect client data, successful applicants offered employment with CoreWeave will be required to complete a basic criminal record check, conducted in compliance with GDPR. Employment offers are conditional upon receiving satisfactory check results.

What We Offer

In addition to a competitive salary, we offer a variety of benefits to support your needs, including:

  • Family-level Medical Insurance
  • Family-level Dental Insurance
  • Generous Pension Contribution
  • Life Assurance at 4x Salary
  • Critical Illness Cover
  • Employee Assistance Programme
  • Tuition Reimbursement
  • Work culture focused on innovative disruption

Benefits may vary by location.

Our Workplace

While we prioritize a hybrid work environment, remote work may be considered for candidates located more than 30 miles from an office, based on role requirements for specialized skill sets. New hires will be invited to attend onboarding at one of our hubs within their first month. Teams also gather quarterly to support collaboration.

CoreWeave is an equal opportunity employer committed to fostering an inclusive and supportive workplace. All qualified applicants and candidates will receive consideration for employment without regard to race, color, religion, sex, disability, age, sexual orientation, gender identity, national origin, veteran status, or genetic information.

CoreWeave does not accept speculative CVs. Any unsolicited CVs received will be treated as the property of CoreWeave, and your Terms & Conditions associated with the use of CVs will be considered null and void.

Any unsolicited CVs sent by your company to us (that is to say, in any situation where we have not directly engaged your company in writing to supply candidates for a specific vacancy) will be considered by us to be a free gift, leaving us liable for no fees whatsoever should we choose to contact the candidate directly and engage the candidate's services, and will in no way establish any prior claim by your company to representation of that candidate should the candidate's details also be submitted by any other party.

Export Control Compliance

This position requires access to export controlled information. To conform to U.S. Government export regulations applicable to that information, applicants must either be (A) a U.S. person, defined as (i) a U.S. citizen or national, (ii) a U.S. lawful permanent resident (green card holder), (iii) a refugee under 8 U.S.C. 1157, or (iv) an asylee under 8 U.S.C. 1158; (B) eligible to access the export controlled information without a required export authorization; or (C) eligible and reasonably likely to obtain the required export authorization from the applicable U.S. government agency. CoreWeave may, for legitimate business reasons, decline to pursue any export licensing process.

Updated privacy notice - UK and EU Job Applications

When you apply to a job on this site, the personal data contained in your application will be collected by CoreWeave UK Ltd. ("Controller"), which is located at

Phosphor (6th Floor) 133 Park Street London SE1 9EA

and can be contacted by emailing . The Controller's data protection officer can be contacted at . Your personal data will be processed for the purposes of managing the Controller's recruitment-related activities, which include setting up and conducting interviews and tests for applicants, evaluating and assessing the results thereof, and as is otherwise needed in the recruitment and hiring processes. Such processing is legally permissible under Art. 6(1)(f) of (i) Regulation (EU) 2016/679 (the General Data Protection Regulation, "GDPR") and (ii) the GDPR as it forms part of the laws of the UK ("UK GDPR"), as necessary for the purposes of the legitimate interests pursued by the Controller, which are the solicitation, evaluation, and selection of applicants for employment. Your personal data will be shared with Greenhouse Software Inc., a cloud services provider located in the United States of America and engaged by the Controller to help manage its recruitment and hiring process on the Controller's behalf. With respect to transfers originating from the UK or the European Economic Area (EEA) to a country outside the UK or the EEA, we implement the appropriate transfer mechanism(s) and other appropriate solutions to address cross-border transfers as required by applicable law. You may request a copy of the suitable mechanisms we have in place by contacting us at

Your personal data will be retained by the Controller as long as the Controller determines it is necessary to evaluate your application for employment. Where permitted by applicable law, we may also retain your personal data for a limited period after the recruitment process ends in order to consider you for future job opportunities, respond to legal claims, or comply with record-keeping obligations. Under the GDPR and the UK GDPR, you have the right to request access to your personal data, to request that your personal data be rectified or erased, and to request that processing of your personal data be restricted. You also have the right to data portability. In addition, you may lodge a complaint with the relevant supervisory authority: (i) a list of Europe's data protection authorities can be found here; and (ii) for the UK, this is the Information Commissioner's Office.

For additional information please see our .


Required Experience:

IC

