Staff Software Developer, FinOps Cloud Development Platform

ServiceNow


Job Location:

Stamford, CT - USA

Monthly Salary: Not Disclosed
Posted on: 8 hours ago
Vacancies: 1 Vacancy

Job Summary

Team

Join the Global Cloud Services organization's FinOps Tools team, which is building ServiceNow's next-generation analytics and financial governance platform. Our team owns the full modern data stack: Trino for distributed queries, dbt for transformations, Iceberg for lakehouse architecture, Lightdash for business intelligence, and Argo Workflows for orchestration. You will be the founding engineer dedicated to building the Cloud Development Platform that empowers our 30 data practitioners (data scientists, analysts, and FinOps engineers) to collaborate and productionize analytics at scale.

Role

We are building a cloud-native data development platform that bridges the gap between exploratory analysis and production-grade workflows. As our founding Staff Software Developer focused on Cloud Development Infrastructure, you will design, architect, and rapidly implement a platform built on VS Code, Coder, and Jupyter that seamlessly integrates with our existing data stack (Trino, dbt, Iceberg, Lightdash, Argo Workflows).

You will establish opinionated, automated pathways from notebook experimentation to production pipelines, moving at startup speed within an enterprise environment. This role demands aggressive execution: a working prototype in 3 months and a production-ready platform in 6 months.

This is a unique opportunity to build from the ground up and define how data development happens at ServiceNow's scale.

What You'll Do: Core Responsibilities

  • Design and develop scalable, maintainable, and reusable software components with a strong emphasis on performance and reliability.
  • Collaborate with product managers to translate requirements into well-architected solutions, owning features from design through delivery.
  • Build intuitive and extensible user experiences using modern UI frameworks, ensuring flexibility for customer-specific needs.
  • Contribute to the design and implementation of new products and features while enhancing existing product capabilities.
  • Integrate automated testing into development workflows to ensure consistent quality across releases.
  • Participate in design and code reviews, ensuring best practices in performance, maintainability, and testability.
  • Develop comprehensive test strategies covering functional, regression, integration, and performance aspects.
  • Foster a culture of engineering craftsmanship, continuous learning, and knowledge-sharing by promoting best practices in engineering and quality across the team.

Technical Leadership & Architecture

  • Design and architect the foundational cloud development platform for notebook-based data workflows.
  • Lead technical decision-making on workspace provisioning, developer experience, and productionization pathways.
  • Establish best practices for notebook-to-production workflows, including git integration, parameterization, validation, and automated deployment.
  • Drive innovation in data development platforms, leveraging AI/ML tools for enhanced developer productivity.
  • Move fast: deliver a working MVP in 3 months and a production system at scale in 6 months.

Hands-On Development

  • Build and customize cloud workspace infrastructure using Coder (open source) on Kubernetes.
  • Develop VS Code extensions (TypeScript) for productionization workflows: notebook validation, parameterization, and Argo Workflow generation.
  • Implement opinionated notebook templates and validation rules for production-ready data pipelines.
  • Create seamless integrations between notebooks and ServiceNow's data stack: Trino queries, Iceberg table outputs, Lightdash previews, and dbt transformations.
  • Build backend services (Python) for workflow orchestration, notebook parsing, and metadata management.
  • Deploy JupyterHub initially, then progressively replace components with custom platform features based on user feedback.
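
To make the notebook-parsing side of such a backend service concrete, here is a minimal sketch of locating the Papermill-style `parameters` cell in a notebook's nbformat v4 JSON. The function name `find_parameters_cell` and the sample notebook are purely illustrative assumptions, not part of any existing ServiceNow codebase.

```python
import json

def find_parameters_cell(notebook_json: str) -> dict:
    """Return the index and source of the Papermill-style 'parameters'
    cell in a Jupyter notebook (nbformat v4 JSON), or an empty dict."""
    nb = json.loads(notebook_json)
    for idx, cell in enumerate(nb.get("cells", [])):
        tags = cell.get("metadata", {}).get("tags", [])
        if cell.get("cell_type") == "code" and "parameters" in tags:
            return {"index": idx, "source": "".join(cell["source"])}
    return {}

# Minimal notebook with one cell tagged "parameters".
nb = json.dumps({
    "nbformat": 4,
    "nbformat_minor": 5,
    "metadata": {},
    "cells": [
        {"cell_type": "markdown", "metadata": {}, "source": ["# Demo"]},
        {"cell_type": "code", "metadata": {"tags": ["parameters"]},
         "outputs": [], "execution_count": None,
         "source": ["run_date = \"2024-01-01\"\n", "table = \"finops.costs\""]},
    ],
})

print(find_parameters_cell(nb))
```

A real service would layer validation rules on top of this, for example rejecting notebooks with no tagged parameters cell before productionization.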

Platform Foundation

  • Design container images with embedded security policies, pre-configured data access to Trino/Iceberg tables, and optimized dependencies.
  • Implement git-native workflows with automated notebook versioning, code review integration, and CI/CD pipelines.
  • Build observability and monitoring for workspace health, user activity, and pipeline success rates.
  • Establish an infrastructure foundation that scales from 5 early adopters to 30 practitioners within the first year.

Developer Experience & Automation

  • Create template-based notebook workflows with an opinionated structure: parameterization (Papermill-style), Iceberg table outputs, and validation checkpoints.
  • Build CLI and UI tooling for one-click productionization: notebook-to-Argo-Workflow conversion with minimal manual intervention.
  • Establish developer guardrails: credential management, data access policies, and resource quotas.
  • Collaborate closely with early-adopter data scientists to rapidly iterate on workflows and validate usability.
  • Prioritize platform stability and clear productionization paths over feature breadth in the first 6 months.
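
To make "one-click productionization" concrete, here is a minimal sketch of turning a notebook reference into an Argo Workflow manifest that runs it via Papermill. The helper name, container image, and parameter-passing convention are all hypothetical assumptions for illustration.

```python
def notebook_to_argo_workflow(notebook_path: str, image: str,
                              parameters: dict) -> dict:
    """Build an Argo Workflow manifest that executes a notebook with
    Papermill. Image name and output path are illustrative only."""
    pm_args = ["papermill", notebook_path, "/tmp/out.ipynb"]
    for key, value in parameters.items():
        # Papermill's -p flag injects values into the 'parameters' cell.
        pm_args += ["-p", key, str(value)]
    return {
        "apiVersion": "argoproj.io/v1alpha1",
        "kind": "Workflow",
        "metadata": {"generateName": "notebook-run-"},
        "spec": {
            "entrypoint": "run-notebook",
            "templates": [{
                "name": "run-notebook",
                "container": {"image": image, "command": pm_args},
            }],
        },
    }

wf = notebook_to_argo_workflow(
    "reports/costs.ipynb",
    "registry.example.com/finops-notebook:latest",
    {"run_date": "2024-01-01"},
)
print(wf["spec"]["templates"][0]["container"]["command"])
```

In practice the CLI or UI tooling would serialize this manifest to YAML and submit it to the cluster, with credentials and resource quotas applied by the platform's guardrails.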

AI-Driven Development

  • Leverage cutting-edge AI development tools (e.g., Cursor, Windsurf, ChatGPT, GitHub Copilot) to accelerate development velocity.
  • Establish AI-augmented development practices and mentor future team members on effective AI tool utilization.
  • Drive innovation in AI-assisted code generation, testing, and platform optimization.

Collaboration & Integration

  • Work autonomously with guidance from Engineering and FinOps leadership
  • Collaborate with the DevOps team on Kubernetes infrastructure, CI/CD pipelines, and security policies.
  • Partner with FinOps Tools team members working on Trino, dbt, Lightdash, and Iceberg to ensure seamless integrations.
  • Contribute to open-source projects in the notebook and developer tooling ecosystem

Qualifications

Required Experience

  • Experience in leveraging or critically thinking about how to integrate AI into work processes, decision-making, or problem-solving. This may include using AI-powered tools, automating workflows, analyzing AI-driven insights, or exploring AI's potential impact on the function or industry.
  • 8 years of experience in software engineering, with a track record of delivering high-quality products and deep expertise in full-stack development and cloud-native architecture, plus a Bachelor's degree; or 6 years and a Master's degree; or a PhD with 3 years of experience; in Computer Science, Engineering, or a related technical field; or equivalent experience.
  • Strong Python skills for backend services, API development, and data tooling (notebook parsing, workflow generation).
  • Proven track record of rapid execution in greenfield environments with evolving requirements.
  • Hands-on experience building and scaling developer platforms or internal tools at enterprise scale.
  • Deep understanding of cloud development environments (Coder, GitHub Codespaces, Gitpod, or similar).
  • Strong Kubernetes and containerization expertise for cloud-native application deployment.
  • Experience with data workflows and tooling: Jupyter notebooks, orchestration systems (Airflow/Argo), and data catalogs.
  • Full professional proficiency in English.
  • Proficiency in Python, Java, or similar object-oriented languages.
  • Experience with modern front-end frameworks such as Angular, React, or Vue.
  • Strong knowledge of data structures, algorithms, object-oriented design, design patterns, and performance optimization.
  • Familiarity with automated testing frameworks (e.g., JUnit, Selenium, TestNG) and integrating tests into CI/CD pipelines.
  • Understanding of software quality principles, including reliability, observability, and production readiness.
  • Ability to troubleshoot complex systems and optimize performance across the stack.
  • Experience with AI-powered tools or workflows, including validation of datasets, model predictions, and inference consistency.
  • Comfort with development tools such as IDEs, debuggers, profilers, source control, and Unix-based systems.

Technical Expertise

  • VS Code ecosystem: Extension API, webview development, command palette, language servers, debugger protocols.
  • Coder or similar platforms: workspace provisioning, remote development environments, infrastructure customization.
  • Jupyter ecosystem: JupyterHub, Jupyter Server, Papermill, nbconvert, or similar notebook tooling.
  • Kubernetes & containerization: pod management, custom resource definitions, Helm charts, image security.
  • Infrastructure as Code: Terraform, Kubernetes operators, GitOps workflows.
  • Git workflows: branching strategies, code review automation, CI/CD integration.
  • Modern data stack: familiarity with Trino, dbt, Iceberg, Argo Workflows, or similar technologies.
  • API design: RESTful services, authentication (OAuth/SAML), webhook integrations.

Platform Engineering & Developer Experience

  • Proven track record building internal developer platforms or productivity tools from scratch
  • Experience designing opinionated workflows that balance flexibility with guardrails
  • Strong understanding of developer personas: data scientists, analysts, engineers.
  • Ability to iterate rapidly with early adopters and incorporate feedback without over-engineering.
  • Experience with workspace security: secrets management, network policies, image scanning.
  • Comfort operating at startup velocity within enterprise constraints

Leadership & Communication

  • Proven ability to work autonomously and drive technical decisions in ambiguous greenfield environments
  • Strong bias toward action: prototype quickly, gather feedback, iterate aggressively.
  • Strong technical writing and documentation skills for developer-facing content.
  • Excellent collaboration skills across engineering, DevOps, and data teams.
  • Ability to establish technical foundations for new products with long-term vision while delivering short-term results

Nice to Have

  • Open-source contributions to the Jupyter ecosystem or developer tooling.
  • Experience with Argo Workflows, Tekton, or Kubernetes-native CI/CD systems.
  • Familiarity with data validation frameworks (Great Expectations, dbt tests, etc.).
  • Experience with Apache Iceberg or lakehouse architectures
  • Conference speaking or technical blogging on developer platforms or data tooling

Why Join Us

  • Build and deliver high-impact software that powers digital experiences for millions of users.
  • Collaborate in a culture that values craftsmanship, quality, and innovation.
  • Work symbiotically with AI and automation tools that enhance engineering excellence and drive product reliability.
  • Be part of a culture that encourages innovation, continuous learning, and shared success.

GCS-23


Additional Information:

Work Personas

We approach our distributed world of work with flexibility and trust. Work personas (flexible, remote, or required in office) are categories assigned to ServiceNow employees depending on the nature of their work and their assigned work location. To determine eligibility for a work persona, ServiceNow may confirm the distance between your primary residence and the closest ServiceNow office using a third-party service.

Equal Opportunity Employer

ServiceNow is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, creed, religion, sex, sexual orientation, national origin or nationality, ancestry, age, disability, gender identity or expression, marital status, veteran status, or any other protected category. In addition, all qualified applicants with arrest or conviction records will be considered for employment in accordance with legal requirements.

Accommodations

We strive to create an accessible and inclusive experience for all candidates. If you require a reasonable accommodation to complete any part of the application process, or are unable to use this online application and need an alternative method to apply, please contact for assistance.

Export Control Regulations

For positions requiring access to controlled technology subject to export control regulations, including the U.S. Export Administration Regulations (EAR), ServiceNow may be required to obtain export control approval from government authorities for certain individuals. All employment is contingent upon ServiceNow obtaining any export license or other approval that may be required by relevant export control authorities.

From Fortune. 2025 Fortune Media IP Limited. All rights reserved. Used under license. 


Remote Work:

Yes


Employment Type:

Full-time



About Company

Learn here. Grow here. Make a difference here. At ServiceNow, our cloud-based platform and solutions deliver digital workflows that create great experiences and unlock productivity for employees and enterprises. We're growing fast, innovating even faster, and making an impact on our c…