Senior Staff Software Developer - FinOps Cloud Development Platform

ServiceNow


Job Location:

Pleasanton, CA - USA

Monthly Salary: Not Disclosed
Posted on: 30+ days ago
Vacancies: 1 Vacancy

Job Summary

Team

Join the Global Cloud Services organization's FinOps Tools team, which is building ServiceNow's next-generation analytics and financial governance platform. Our team owns the full modern data stack: Trino for distributed queries, dbt for transformations, Iceberg for lakehouse architecture, Lightdash for business intelligence, and Argo Workflows for orchestration. You will be the founding engineer dedicated to building the Cloud Development Platform that empowers our 30 data practitioners (data scientists, analysts, and FinOps engineers) to collaborate and productionize analytics at scale.

Role

We are building a cloud-native data development platform that bridges the gap between exploratory analysis and production-grade workflows. As our founding Staff Software Developer focused on Cloud Development Infrastructure, you will design, architect, and rapidly implement a platform built on VS Code, Coder, and Jupyter that seamlessly integrates with our existing data stack (Trino, dbt, Iceberg, Lightdash, Argo Workflows).

You will establish opinionated, automated pathways from notebook experimentation to production pipelines, moving at startup speed within an enterprise environment. This role demands aggressive execution: a working prototype in 3 months, a production-ready platform in 6 months.

This is a unique opportunity to build from the ground up and define how data development happens at ServiceNow's scale.

What you get to do in this role: 

Technical Leadership & Architecture

  • Design and architect the foundational cloud development platform for notebook-based data workflows
  • Lead technical decision-making on workspace provisioning, developer experience, and productionization pathways
  • Establish best practices for notebook-to-production workflows, including git integration, parameterization, validation, and automated deployment
  • Drive innovation in data development platforms, leveraging AI/ML tools for enhanced developer productivity
  • Move fast: deliver a working MVP in 3 months and a production-scale system in 6 months

Hands-On Development

  • Build and customize cloud workspace infrastructure using Coder (open source) on Kubernetes
  • Develop VS Code extensions (TypeScript) for productionization workflows: notebook validation, parameterization, and Argo Workflow generation
  • Implement opinionated notebook templates and validation rules for production-ready data pipelines
  • Create seamless integrations between notebooks and ServiceNow's data stack: Trino queries, Iceberg table outputs, Lightdash previews, and dbt transformations
  • Build backend services (Python) for workflow orchestration, notebook parsing, and metadata management (a minimal sketch follows this list)
  • Deploy JupyterHub initially, then progressively replace components with custom platform features based on user feedback
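
To make the notebook-to-pipeline pathway concrete, the following minimal sketch (an illustration under stated assumptions, not ServiceNow's actual implementation) shows one shape such backend tooling could take: it uses nbformat to locate a Papermill-style "parameters" cell and emits a skeletal Argo Workflow manifest. The container image, namespace, and file paths are hypothetical placeholders.

```python
# Minimal notebook -> Argo Workflow sketch. The registry image, namespace, and
# output path are hypothetical placeholders, not ServiceNow infrastructure.
import nbformat
import yaml

def find_parameters_cell(notebook_path: str) -> str:
    """Return the source of the cell tagged 'parameters' (Papermill's convention)."""
    nb = nbformat.read(notebook_path, as_version=4)
    for cell in nb.cells:
        if "parameters" in cell.get("metadata", {}).get("tags", []):
            return cell.source
    raise ValueError(f"No 'parameters' cell found in {notebook_path}")

def build_argo_manifest(notebook_path: str, run_name: str) -> dict:
    """Wrap a papermill execution of the notebook in a one-step Argo Workflow."""
    return {
        "apiVersion": "argoproj.io/v1alpha1",
        "kind": "Workflow",
        "metadata": {"generateName": f"{run_name}-", "namespace": "finops-pipelines"},
        "spec": {
            "entrypoint": "run-notebook",
            "templates": [{
                "name": "run-notebook",
                "container": {
                    "image": "example.registry/notebook-runner:latest",  # hypothetical image
                    "command": ["papermill", notebook_path, "/tmp/output.ipynb"],
                },
            }],
        },
    }

if __name__ == "__main__":
    print(find_parameters_cell("analysis.ipynb"))  # fails fast if the template rule is broken
    print(yaml.safe_dump(build_argo_manifest("analysis.ipynb", "daily-cost-report")))
```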

Platform Foundation

  • Design container images with embedded security policies, pre-configured data access to Trino/Iceberg tables (illustrated in the sketch after this list), and optimized dependencies
  • Implement git-native workflows with automated notebook versioning, code review integration, and CI/CD pipelines
  • Build observability and monitoring for workspace health, user activity, and pipeline success rates
  • Establish an infrastructure foundation that scales from 5 early adopters to 30 practitioners within the first year
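
As a sketch of what "pre-configured data access" could mean in practice, a helper baked into the workspace image might wrap the Trino Python client so practitioners never handle connection details themselves. Everything specific here is assumed: the host, environment-variable names, catalog, schema, and table are hypothetical.

```python
# Illustrative workspace helper for pre-baked Trino/Iceberg access; the host,
# environment-variable names, catalog, schema, and table are hypothetical.
import os
import trino

def get_connection():
    """Open a Trino connection from configuration injected at workspace provisioning."""
    return trino.dbapi.connect(
        host=os.environ.get("TRINO_HOST", "trino.internal.example.com"),
        port=int(os.environ.get("TRINO_PORT", "443")),
        user=os.environ.get("WORKSPACE_USER", "analyst"),
        catalog="iceberg",
        schema="finops",
        http_scheme="https",
    )

# Inside a notebook, a practitioner queries Iceberg tables directly, with
# credentials and network policy enforced by the image rather than the user.
conn = get_connection()
cur = conn.cursor()
cur.execute("SELECT service, SUM(cost_usd) AS spend FROM daily_spend GROUP BY service")
for service, spend in cur.fetchall():
    print(service, spend)
conn.close()
```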

Developer Experience & Automation

  • Create template-based notebook workflows with an opinionated structure: parameterization (Papermill-style), Iceberg table outputs, and validation checkpoints
  • Build CLI and UI tooling for one-click productionization, turning a notebook into an Argo Workflow with minimal manual intervention (see the CLI sketch after this list)
  • Establish developer guardrails: credential management, data access policies, and resource quotas
  • Collaborate closely with early adopter data scientists to rapidly iterate on workflows and validate usability
  • Prioritize platform stability and clear productionization paths over feature breadth in the first 6 months
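
One way to picture "one-click productionization" is a small CLI. The sketch below is hypothetical throughout: the nb2argo command name, its flags, the container image, and the single guardrail rule are invented for illustration. It enforces one template rule, a tagged Papermill "parameters" cell, then prints the generated Argo Workflow manifest.

```python
# Hypothetical 'nb2argo' CLI sketch: validate a notebook against a template
# rule, then emit an Argo Workflow manifest. Names and image are illustrative.
import click
import nbformat
import yaml

def has_parameters_cell(notebook: str) -> bool:
    """Guardrail: the production template requires a Papermill 'parameters' cell."""
    nb = nbformat.read(notebook, as_version=4)
    return any("parameters" in c.get("metadata", {}).get("tags", []) for c in nb.cells)

@click.command()
@click.argument("notebook", type=click.Path(exists=True))
@click.option("--name", default="notebook-run", help="Workflow name prefix.")
def nb2argo(notebook: str, name: str) -> None:
    """Validate NOTEBOOK and emit an Argo Workflow manifest for it."""
    if not has_parameters_cell(notebook):
        raise click.ClickException("Notebook is missing a tagged 'parameters' cell.")
    manifest = {
        "apiVersion": "argoproj.io/v1alpha1",
        "kind": "Workflow",
        "metadata": {"generateName": f"{name}-"},
        "spec": {
            "entrypoint": "main",
            "templates": [{
                "name": "main",
                "container": {
                    "image": "example.registry/notebook-runner:latest",  # hypothetical
                    "command": ["papermill", notebook, "/tmp/out.ipynb"],
                },
            }],
        },
    }
    click.echo(yaml.safe_dump(manifest))

if __name__ == "__main__":
    nb2argo()
```

Keeping manifest generation separate from submission lets the emitted workflow flow through code review and CI/CD rather than bypassing them.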

AI-Driven Development

  • Leverage cutting-edge AI development tools (e.g., Cursor, Windsurf, ChatGPT, GitHub Copilot) to accelerate development velocity
  • Establish AI-augmented development practices and mentor future team members on effective AI tool utilization
  • Drive innovation in AI-assisted code generation, testing, and platform optimization

Collaboration & Integration

  • Work autonomously with guidance from Engineering and FinOps leadership
  • Collaborate with the DevOps team on Kubernetes infrastructure, CI/CD pipelines, and security policies
  • Partner with FinOps Tools team members working on Trino, dbt, Lightdash, and Iceberg to ensure seamless integrations
  • Contribute to open-source projects in the notebook and developer tooling ecosystem

Qualifications:

To be successful in this role, you have:

  • Experience in leveraging or critically thinking about how to integrate AI into work processes, decision-making, or problem-solving. This may include using AI-powered tools, automating workflows, analyzing AI-driven insights, or exploring AI's potential impact on the function or industry.
  • 12 years of experience in software engineering, with deep expertise in full-stack development and cloud-native architecture, and a Bachelor's degree; or 8 years and a Master's degree; or a PhD with 5 years of experience, in Computer Science, Engineering, or a related technical field; or equivalent experience.
  • Strong Python skills for backend services, API development, and data tooling (notebook parsing, workflow generation)
  • Proven track record of rapid execution in greenfield environments with evolving requirements
  • Hands-on experience building and scaling developer platforms or internal tools at enterprise scale
  • Deep understanding of cloud development environments (Coder, GitHub Codespaces, Gitpod, or similar)
  • Strong Kubernetes and containerization expertise for cloud-native application deployment
  • Experience with data workflows and tooling: Jupyter notebooks, orchestration systems (Airflow/Argo), and data catalogs
  • Full professional proficiency in English

Technical Expertise

  • VS Code ecosystem: Extension API, webview development, command palette, language servers, debugger protocols
  • Coder or similar platforms: workspace provisioning, remote development environments, infrastructure customization
  • Jupyter ecosystem: JupyterHub, Jupyter Server, Papermill, nbconvert, or similar notebook tooling
  • Kubernetes & containerization: pod management, custom resource definitions, Helm charts, image security
  • Infrastructure as Code: Terraform, Kubernetes operators, GitOps workflows
  • Git workflows: branching strategies, code review automation, CI/CD integration
  • Modern data stack: familiarity with Trino, dbt, Iceberg, Argo Workflows, or similar technologies
  • API design: RESTful services, authentication (OAuth/SAML), webhook integrations

Platform Engineering & Developer Experience

  • Proven track record building internal developer platforms or productivity tools from scratch
  • Experience designing opinionated workflows that balance flexibility with guardrails
  • Strong understanding of developer personas: data scientists, analysts, and engineers
  • Ability to iterate rapidly with early adopters and incorporate feedback without over-engineering
  • Experience with workspace security: secrets management, network policies, image scanning
  • Comfort operating at startup velocity within enterprise constraints

Leadership & Communication

  • Proven ability to work autonomously and drive technical decisions in ambiguous greenfield environments
  • Strong bias toward action: prototype quickly, gather feedback, iterate aggressively
  • Strong technical writing and documentation skills for developer-facing content
  • Excellent collaboration skills across engineering, DevOps, and data teams
  • Ability to establish technical foundations for new products with long-term vision while delivering short-term results

Nice to Have

  • Open-source contributions to the Jupyter ecosystem or developer tooling
  • Experience with Argo Workflows, Tekton, or Kubernetes-native CI/CD systems
  • Familiarity with data validation frameworks (Great Expectations, dbt tests, etc.)
  • Experience with Apache Iceberg or lakehouse architectures
  • Conference speaking or technical blogging on developer platforms or data tooling

GCS-23

 

 

 

For positions in this location, we offer a base pay of $187,600 - $328,300, plus equity (when applicable), variable/incentive compensation, and benefits. Sales positions generally offer a competitive On Target Earnings (OTE) incentive compensation structure. Please note that the base pay shown is a guideline, and individual total compensation will vary based on factors such as qualifications, skill level, competencies, and work location. We also offer health plans (including flexible spending accounts), a 401(k) Plan with company match, ESPP, matching donations, a flexible time away plan, and family leave programs. Compensation is based on the geographic location in which the role is located and is subject to change based on work location.


Additional Information:

Work Personas

We approach our distributed world of work with flexibility and trust. Work personas (flexible, remote, or required in office) are categories assigned to ServiceNow employees depending on the nature of their work and their assigned work location. Learn more here. To determine eligibility for a work persona, ServiceNow may confirm the distance between your primary residence and the closest ServiceNow office using a third-party service.

Equal Opportunity Employer

ServiceNow is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, creed, religion, sex, sexual orientation, national origin or nationality, ancestry, age, disability, gender identity or expression, marital status, veteran status, or any other category protected by law. In addition, all qualified applicants with arrest or conviction records will be considered for employment in accordance with legal requirements.

Accommodations

We strive to create an accessible and inclusive experience for all candidates. If you require a reasonable accommodation to complete any part of the application process, or are unable to use this online application and need an alternative method to apply, please contact for assistance.

Export Control Regulations

For positions requiring access to controlled technology subject to export control regulations, including the U.S. Export Administration Regulations (EAR), ServiceNow may be required to obtain export control approval from government authorities for certain individuals. All employment is contingent upon ServiceNow obtaining any export license or other approval that may be required by relevant export control authorities.

From Fortune. ©2025 Fortune Media IP Limited. All rights reserved. Used under license.


Remote Work:

No


Employment Type:

Full-time

