Role Summary:
Agentic AI Platform Engineer: This role is a senior, high-autonomy individual contributor responsible for architecting and deploying an end-to-end Intelligence Layer spanning multi-agent orchestration (LangGraph), knowledge graph design (TigerGraph/OWL ontologies), and trusted data materialization (dbt/Snowflake). The engineer must bridge AI systems with enterprise data infrastructure, building production-grade agents for both internal data-ops automation and external governed insights. The role demands deep expertise across agentic frameworks, graph databases, analytics engineering, and AI-optimized CI/CD.
Line Work Summary (Agentic Data Product Focus)
The AI @ Sales BPR initiative is actively building and deploying an agentic intelligence layer, anchored by Endor Insights, an AI-powered conversational platform that autonomously queries, analyzes, and visualizes A Sales data using A-specific domain knowledge and custom logical frameworks. The program operates a structured PoC-to-production pipeline (via Wrike, Tableau), supporting teams in ideating, prototyping, and scaling AI solutions, from automated Wrike planning agents to Claude Code coding assistants. The knowledge layer is enriched with context-rich, governed, explainable insights, reflecting the hallmarks of a knowledge-graph-powered data product: domain awareness, rule-based reasoning, and structured intelligence delivery at scale.
We are seeking a Senior Agentic Platform Engineer to serve as the lead architect and implementer for our Intelligence Layer. This role is designed for a senior individual contributor who can operate with high autonomy to design, develop, and deploy production-ready solutions that optimize both internal data operations and external-facing intelligence.
You will be responsible for the end-to-end delivery of the connective tissue between our multi-agent architecture and our knowledge graph, supporting the existing team by accelerating technical milestones and hardening the CI/CD processes required for reliable AI deployment.
Core Delivery Categories
Agentic Systems & Orchestration: Lead the build-out of a multi-agent architecture using LangGraph. You will design cyclic, stateful workflows, implement persistence, and manage Human-in-the-Loop (HITL) checkpoints.
Graph-Native Intelligence: Autonomously design the Ontology-to-Schema pipeline, mapping OWL/SKOS enterprise ontologies into TigerGraph. You will develop high-performance GSQL queries and architect multigraph memory systems for agentic reasoning.
Trusted Data Materialization: Own the materialization of data products in Snowflake using dbt, focusing specifically on the DQ rules engine and automated Trust Score computation.
Internal & External Skill Development: Design and deploy agents tailored for Internal Data Ops (automating metadata harvesting and DQ remediation) as well as External-Facing Skills that provide governed, high-trust insights to end users.
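The orchestration pattern described above (a cyclic, stateful loop with persisted state and a HITL gate) can be sketched in plain Python. This is a conceptual sketch only, not the actual LangGraph implementation; the node names, `AgentState` fields, and checkpoint path are all illustrative assumptions.

```python
import json
from dataclasses import dataclass, field, asdict

@dataclass
class AgentState:
    """Shared state threaded through every node in the workflow."""
    question: str
    draft: str = ""
    attempts: int = 0
    approved: bool = False
    history: list = field(default_factory=list)

def plan(state: AgentState) -> AgentState:
    # Draft a response to the user's question (stand-in for an LLM call)
    state.draft = f"query plan for: {state.question}"
    state.history.append("plan")
    return state

def critique(state: AgentState) -> AgentState:
    # Cyclic edge: the workflow loops back to `plan` until review passes
    state.attempts += 1
    state.history.append("critique")
    return state

def hitl_checkpoint(state: AgentState) -> AgentState:
    # In production this would pause for a human reviewer; auto-approved here
    state.approved = True
    state.history.append("hitl")
    return state

def persist(state: AgentState, path: str) -> None:
    # Persistence: snapshot state so the workflow can resume after a failure
    with open(path, "w") as f:
        json.dump(asdict(state), f)

def run(question: str, max_loops: int = 3) -> AgentState:
    state = AgentState(question=question)
    while state.attempts < max_loops:   # cyclic plan -> critique loop
        state = plan(state)
        state = critique(state)
    state = hitl_checkpoint(state)      # human gate before delivery
    persist(state, "/tmp/agent_state.json")
    return state
```

In LangGraph itself, the loop would be expressed as conditional edges on a `StateGraph`, with a checkpointer providing the persistence and an interrupt providing the HITL pause.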
Technical Expertise & Experience Requirements
10 years of senior software and data engineering experience, with a proven track record of production delivery within a global enterprise environment.
Advanced Agentic Orchestration (2 years): Deep, hands-on mastery of LangGraph (StateGraph, Command, and Persistence) and LangChain.
Multi-LLM Mastery: Expert implementation of frontier models (including Anthropic Claude, OpenAI GPT, and Llama) and the Model Context Protocol (MCP) for standardized tool-calling and context injection across model providers.
TigerGraph & GSQL Specialist (5 years): Expert-level proficiency in GSQL development, including writing distributed graph algorithms and optimizing complex sub-queries.
Knowledge Modeling: Direct experience modeling enterprise ontologies using OWL, SKOS, or RDF, and successfully mapping them to Labeled Property Graph (LPG) schemas.
Analytics Engineering Mastery (5 years): Expert-level dbt (Core/Cloud) and Snowflake architecture, with specific experience building automated Data Quality (DQ) monitors and trust-score pipelines.
Development Stack: High proficiency in Python (specifically asynchronous programming, FastAPI, and Pydantic) and advanced SQL.
Internal Data Ops Optimization: Demonstrated experience building agents and skills specifically designed to automate Data Governance and Data Operations (e.g. automated glossary curation, schema discovery, and policy enforcement).
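As a rough illustration of the asynchronous Python style the stack above calls for, the sketch below fans out metadata-harvest calls concurrently with `asyncio.gather`. The source names and simulated latency are invented for the example; a real harvester would call catalog or warehouse APIs.

```python
import asyncio

async def harvest_metadata(source: str, delay: float) -> dict:
    """Stand-in for an async call to a catalog or warehouse API."""
    await asyncio.sleep(delay)  # simulates network latency
    return {"source": source, "tables_found": len(source)}

async def harvest_all(sources: list[str]) -> list[dict]:
    # Fan out all harvest calls concurrently instead of serially,
    # so total wall time is ~max(delay) rather than sum(delay)
    tasks = [harvest_metadata(s, 0.01) for s in sources]
    return await asyncio.gather(*tasks)

results = asyncio.run(harvest_all(["snowflake", "tigergraph", "dbt"]))
```

The same fan-out shape applies when an agent must call several tools at once; FastAPI endpoints would simply `await harvest_all(...)` inside an `async def` route.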
CI/CD, DevOps & Process Optimization
Spec-Driven Development: Champion a Spec-First approach to AI development, ensuring agent behaviors, tool contracts, and data schemas are defined via rigorous specifications (e.g. OpenAPI, AsyncAPI, or custom DSLs) before implementation.
AI-Optimized CI/CD: Support the team in designing and implementing robust CI/CD pipelines tailored for GenAI, focusing on model-agnostic deployment patterns and high-frequency delivery cycles.
Process Engineering: Optimize team development workflows to support iterative AI loops, including the implementation of specialized observability for agentic traces and automated feedback loops for data quality.
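One lightweight way to get the agentic-trace observability mentioned above is a decorator that records each tool call as a structured span. This is a stdlib-only sketch, not a specific tracing product; the tool functions and the in-memory `TRACE` sink are illustrative.

```python
import time
import functools

TRACE: list[dict] = []  # in production, spans would ship to a trace backend

def traced(fn):
    """Record each agent/tool invocation as a structured span."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        TRACE.append({
            "span": fn.__name__,
            "duration_ms": (time.perf_counter() - start) * 1000,
            "ok": True,
        })
        return result
    return wrapper

@traced
def lookup_account(name: str) -> str:
    return f"account:{name}"

@traced
def score_trust(record: str) -> float:
    return 0.95 if record.startswith("account:") else 0.0

score_trust(lookup_account("acme"))
```

Replaying `TRACE` after a run makes it possible to see which tools an agent invoked, in what order, and how long each took, which is the raw material for the automated feedback loops described above.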
Preferred Experience
Unstructured Data & Vectors: Experience with unstructured data management and the implementation of vector databases (e.g. Pinecone, Weaviate, or Snowflake Cortex Search) within RAG architectures.
Enterprise Metadata Management: Hands-on experience with DataHub or similar data catalog and metadata management solutions to drive automated discovery.
Domain Expertise: Familiarity with Sales B2B and B2C data processes and associated tooling (e.g. Salesforce), including experience navigating CRM schemas for agentic tool-calling.
Governance & Security: Familiarity with data privacy and security frameworks (GDPR, SOC 2) as they apply to autonomous agents and Large Language Models.
Community Engagement: Contributions to open-source agentic frameworks or participation in the development of the Model Context Protocol (MCP) ecosystem.
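As a toy illustration of the retrieval pattern behind the RAG architectures listed above, the snippet below ranks documents by cosine similarity over hand-made embeddings. The document names and vectors are invented; a production system would use learned embeddings and a vector store such as those named above.

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two dense vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# Toy "embeddings"; in practice these come from an embedding model
DOCS = {
    "pipeline_review.md": [0.9, 0.1, 0.0],
    "dq_rules.md":        [0.1, 0.8, 0.3],
    "territory_plan.md":  [0.2, 0.2, 0.9],
}

def retrieve(query_vec: list[float], k: int = 2) -> list[str]:
    # Rank all documents by similarity to the query and keep the top k
    ranked = sorted(DOCS, key=lambda d: cosine(query_vec, DOCS[d]), reverse=True)
    return ranked[:k]

top = retrieve([1.0, 0.0, 0.1])
```

The retrieved documents would then be injected into the agent's context window before generation, which is the core of the RAG loop.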
Role Expectations for Contractors
Autonomous Execution: You are expected to take high-level architectural goals and drive them through to a deployed, documented, and production-tested state without daily supervision.
Team Support & Force Multiplication: Act as a technical anchor for the internal team, removing blockers in the agent-graph interface and ensuring architectural consistency.
Stability & Observability: Your focus is on building resilient systems that are observable, scalable, and governed, prioritizing long-term system health over simple prototyping.