Data Architect, Data Center Digital Infrastructure


Job Location:

Marshall County, WV - USA

Monthly Salary: Not Disclosed
Posted on: 2 hours ago
Vacancies: 1 Vacancy

Job Summary

Data Architect, Data Center Digital Infrastructure
Location: Dallas, TX

Overview

This organization is backed by dedicated leadership and investment with a clear mission as it operates at the bleeding edge of technology. Its goal is to scale and enhance high-performance computing (HPC) and cloud infrastructure that supports clients' research, production, and delivery, enabling breakthroughs that shape the industries of tomorrow. Its engineers build critical infrastructure to eliminate friction in scientific research, simulations, analysis, and decision-making, accelerating discovery and driving faster innovation.

We are seeking an experienced Data Architect to own the data strategy, architecture, and governance of the data platform underpinning a next-generation data center, which serves as the foundation for broader compute, cloud, and digital application offerings. This role is key in structuring, securing, and optimizing data flows from OT/IT systems, facility and environmental sensors, compute platforms, cloud services, developer and application platforms, and business applications into a unified analytics- and AI-ready data environment. Working closely with digital and platform architecture partners, this role ensures a cohesive digital and data ecosystem across physical infrastructure and higher-level platform capabilities.

A core focus of this role is the development of enterprise semantic models and data ontologies that establish the authoritative layer of meaning across the organization's environments. These semantic foundations power digital twins, operational and platform intelligence, reusable data products, and simulation-driven workflows. The role defines how data flows, transforms, and connects across data center infrastructure, HPC clusters, GPU workloads, job schedulers, observability systems, performance and environmental telemetry, and application-facing platforms, enabling consistent measurement, optimization, and planning across the full compute stack.

Key Responsibilities

Data Strategy & Architecture
- Define the enterprise data strategy and target-state architecture for data center and operational domains, ensuring alignment with analytics, AI, digital twin, and simulation use cases.
- Architect scalable data models and data flows supporting HPC clusters, GPU workloads, job schedulers, observability systems, and data center infrastructure.
- Design and maintain unified data architectures that connect compute telemetry, environmental data, power and cooling systems, network performance metrics, and enterprise datasets.
- Define standards for data ingestion, transformation, API integration, and schema management to ensure interoperability across distributed systems.

Ontology & Semantic Modeling
- Define and govern the enterprise data ontology, establishing the authoritative semantic layer that unifies meaning, relationships, and context across data center and business domains.
- Lead the development of semantic models and ontologies (e.g., Palantir Foundry Ontology) to ensure consistent representation of data, workflows, assets, operational processes, and digital systems.
- Design ontology-enabled data abstractions that support digital twins, simulations, scenario modeling, and predictive analytics across compute and infrastructure environments.
- Define canonical data models, semantic contracts, and interoperability standards that unify heterogeneous OT, IT, HPC, and business data sources.

Data Governance & Mesh
- Establish data domain boundaries and ownership models aligned with data mesh principles, enabling decentralized data ownership with shared semantics and federated governance.
- Establish governance frameworks for data quality, lineage, metadata, access controls, lifecycle management, and compliance, balancing control with domain autonomy.
- Enable a data mesh operating model by defining standards for discoverability, data products, semantic consistency, and federated governance.
- Implement and guide adoption of data cataloging, metadata management, and semantic discovery tools to enable self-service analytics and operational transparency.

Collaboration & Innovation
- Partner with big data, data center architecture, and product teams to translate platform and product requirements into scalable, governed data and semantic architectures.
- Evaluate and integrate emerging technologies in data management, knowledge graphs, observability, simulation tooling, and HPC analytics.

Required Experience

- 7 years of experience defining enterprise data architecture and data strategy, including ownership of semantic models, governance frameworks, and cross-domain data integration in complex, large-scale environments.
- Strong expertise in semantic modeling and data ontology design, with the ability to translate complex technical and operational domains into governed semantic layers.
- Demonstrated experience designing or governing enterprise data ontologies, including canonical entities, semantic relationships, metadata, and business meaning across domains.
- Experience building or governing enterprise ontologies or semantic data layers using platforms such as Palantir Foundry Ontology or enterprise knowledge graphs.
- Experience designing data mesh-aligned architectures, including domain-oriented data products, federated governance, and shared semantic standards.
- Deep understanding of conceptual, logical, and physical data modeling, taxonomy design, reference data management, and canonical data models.
- Ability to architect digital twin and simulation data models connecting physical infrastructure, compute systems, operational workflows, and predictive modeling outputs.
- Solid understanding of HPC and distributed compute environments, including GPU workloads, job schedulers (e.g., Slurm, Kubernetes), telemetry, and operational metadata.
- Familiarity with cloud-scale analytics and data platforms (e.g., Snowflake, Databricks, BigQuery) and cloud ecosystems (AWS, GCP, Azure) from an architectural and evaluation perspective rather than a data engineering role.
- Experience defining interoperability standards, semantic APIs, schema governance, and cross-system integration patterns.
- Strong grounding in data governance, including metadata, lineage, access control (RBAC/ABAC), lifecycle management, and compliance.
- Systems-thinking mindset with the ability to reason across physical data center environments, HPC compute ecosystems, cloud platforms, and analytical layers.
- Excellent communication and cross-functional leadership skills, able to align engineering, operations, platform, and product teams around a shared data and semantic strategy.
