We're building foundational infrastructure to secure AI agents, including their identities, access patterns, and interactions with sensitive systems and data. This includes designing intelligent, dynamic mechanisms for ephemeral access control, secrets management, and agent/user identity, tailored to modern agent frameworks such as LangChain, LangGraph, Semantic Kernel, AutoGen, and beyond.
In this role, you will be responsible for developing and maintaining high-quality work procedures, executing both manual and automated tests, and leading the team toward higher quality.
This is a rare opportunity to help shape a new security infrastructure layer from the ground up, on a team with the agility of a startup and the backing of a mature organization. You'll play a pivotal role in shaping how AI agents operate safely and effectively across real-world enterprise environments.
Responsibilities:
- Define the overall testing strategy and tooling stack across integration, regression, and load/performance testing.
- Drive an automation-first mindset; collaborate with developers to ensure code is testable and observable.
- Partner with product and security to define quality gates for features like agent identity, access delegation, and secrets handling.
- Lead root-cause analysis for escaped bugs and define improvement actions.
- Own CI quality metrics and dashboards (e.g., build stability, test flakiness, regression coverage).
- Influence decisions about test environments, mocks, test data, and production parity.
- Collaborate with the centralized quality group to push for E2E coverage.
#LI-Hybrid
#LI-MS1
Qualifications:
- At least 5 years of experience in QA positions
- Experience building QA practices for cloud-native distributed systems
- Deep expertise in defining quality strategy in early-stage teams or 0-to-1 products
- Hands-on experience with CI/CD pipelines (e.g., GitHub Actions, CircleCI, or similar)
- Experience testing Go-based systems or equivalent backend APIs, including performance and concurrency validation
- Familiarity with AWS services, including API Gateway, Lambda, DynamoDB, and CloudWatch
- Knowledge of observability tools like OpenTelemetry or Datadog - an advantage
- Background or experience in software development and/or security testing - an advantage
- Knowledge of security principles (e.g., access control, secrets exposure, identity testing) - a strong advantage
- Experience building automated testing gates in CI/CD-based products
Remote Work:
No
Employment Type:
Full-time