Benchmark Engineer - AI Infrastructure / Vector Search
Location: Remote
Employment Type: Full-time
We are recruiting on behalf of a fast-growing AI infrastructure company that develops a high-performance, open-source vector database used in semantic search, recommendation systems, and RAG pipelines.
We are seeking a Benchmark Engineer to design, build, and maintain performance benchmarks that help validate and improve large-scale AI and database systems.
This role sits at the intersection of engineering, performance optimization, and developer experience, and offers an opportunity to work on core infrastructure used in modern AI applications.
About the Role
The Benchmark Engineer will own the benchmarking strategy, ensuring performance testing reflects real-world workloads and produces reliable, reproducible insights.
You will collaborate closely with engineering teams to identify bottlenecks, validate improvements, and communicate performance results clearly to both technical and non-technical audiences.
Key Responsibilities
- Design and maintain reproducible benchmarks for:
  - Vector search workloads
  - Indexing and filtering performance
  - Distributed system scenarios
- Measure and analyze:
  - Latency
  - Throughput
  - Recall and accuracy
  - Memory usage and infrastructure cost
- Build and maintain benchmarking tools, datasets, and automation pipelines
- Compare performance against alternative solutions using fair and transparent methodologies
- Identify regressions, bottlenecks, and optimization opportunities
- Translate benchmark results into clear reports, documentation, or technical content
- Ensure testing reflects realistic production workloads
Requirements
- Strong software engineering background (Python, Rust, Go, or similar)
- Understanding of:
  - Databases or distributed systems
  - Search engines or data infrastructure
- Experience with:
  - Performance testing and benchmarking
  - Profiling tools and performance analysis
  - Automation pipelines and large datasets
- Ability to evaluate technical trade-offs, such as:
  - Speed vs. accuracy
  - Memory vs. latency
- Strong communication skills and ability to present technical findings
Preferred Qualifications
- Experience with vector search or ANN algorithms
- Familiarity with machine learning infrastructure or AI systems
- Experience with cloud environments and containerized workloads
- Open-source project experience
- Knowledge of observability and monitoring tools
What's Offered
- Opportunity to work on cutting-edge AI infrastructure
- Fully remote work environment
- Collaborative engineering-driven culture
- High technical ownership and impact
- Professional development opportunities
Interested?
If this role sounds like a good fit, please apply with your resume.
Full client details will be shared with qualified candidates before submission.