Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Operations
Management Level: Senior Associate
Job Description & Summary
At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences, and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Job Description & Summary: We are seeking an experienced Cloud AI/ML/GenAI QA Engineer with specialized expertise in load testing and performance validation of AI-powered systems. This role combines traditional QA expertise with deep knowledge of AI/ML testing, generative AI validation, and advanced performance testing strategies for high-scale AI applications. The ideal candidate will have 4-9 years of experience in quality assurance, performance testing, and AI/ML systems, with a proven ability to test AI applications handling thousands of concurrent users.
Responsibilities:
- Design load testing strategies for ML inference endpoints with varying model complexities
- Test model performance degradation under concurrent requests and high-throughput scenarios
- Validate batch processing performance for large-scale ML workloads
- Implement automated performance regression testing for ML model deployments
- Conduct GPU utilization and resource optimization testing for AI workloads
- Test model serving platforms (TensorFlow Serving, TorchServe, SageMaker) under load
- Execute high-scale load testing for LLM inference endpoints and token generation performance (an illustrative sketch follows this list)
- Test conversational AI systems with thousands of simultaneous multi-turn dialogues
- Validate streaming response performance and real-time AI interactions under stress
- Perform capacity planning for generative AI workloads and cost optimization
- Test RAG systems and vector database performance under concurrent query loads
- Validate fine-tuned model performance and custom AI workflow scalability
- Design and execute load tests for AI applications handling 50K concurrent users
- Implement distributed load testing across multiple cloud regions
- Conduct stress testing, spike testing, and endurance testing for AI systems
- Perform bottleneck analysis and provide performance optimization recommendations
- Test auto-scaling behavior and failover scenarios for AI infrastructure
- Validate disaster recovery performance and system resilience under load
- Execute performance testing for AWS/Azure/GCP-based AI/ML applications
- Test serverless AI applications for cold start performance and concurrency limits
- Validate container orchestration performance for ML workloads on Kubernetes
- Conduct database performance testing, including vector databases under load
- Test CDN performance and edge caching for AI-powered applications
- Implement monitoring integration with load testing for comprehensive performance analysis
- Build enterprise-scale automated load testing frameworks and custom solutions
- Integrate performance testing gates into CI/CD pipelines (an illustrative sketch follows this list)
- Create performance monitoring dashboards and alerting systems
- Develop load testing infrastructure using Infrastructure as Code
- Implement chaos engineering and fault injection testing
- Design performance benchmarking and capacity modeling frameworks
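To make the LLM inference load-testing responsibility concrete, here is a minimal Python sketch that drives concurrent virtual users against an inference endpoint and reports mean and p95 latency. The endpoint URL, payload shape, and user counts are hypothetical, aiohttp is assumed to be available, and a real engagement would typically use the tools named in the skill sets below (JMeter, LoadRunner, K6, BlazeMeter) rather than a hand-rolled script.

```python
"""Minimal async load-test sketch for an ML/LLM inference endpoint (illustrative only)."""
import asyncio
import statistics
import time

import aiohttp  # assumed available; any async HTTP client would do

ENDPOINT = "https://example.com/v1/models/chat:predict"  # hypothetical endpoint
CONCURRENT_USERS = 100   # scale toward larger targets with distributed workers
REQUESTS_PER_USER = 20

async def virtual_user(session: aiohttp.ClientSession, latencies: list[float]) -> None:
    """Send a fixed number of inference requests and record wall-clock latency."""
    payload = {"prompt": "Summarise our Q3 performance report.", "max_tokens": 256}
    for _ in range(REQUESTS_PER_USER):
        start = time.perf_counter()
        async with session.post(ENDPOINT, json=payload) as resp:
            await resp.read()  # drain the body so timing covers the full response
        latencies.append(time.perf_counter() - start)

async def main() -> None:
    latencies: list[float] = []
    async with aiohttp.ClientSession() as session:
        await asyncio.gather(
            *(virtual_user(session, latencies) for _ in range(CONCURRENT_USERS))
        )
    latencies.sort()
    p95 = latencies[int(len(latencies) * 0.95) - 1]
    print(f"requests: {len(latencies)}")
    print(f"mean latency: {statistics.mean(latencies):.3f}s  p95: {p95:.3f}s")

if __name__ == "__main__":
    asyncio.run(main())
```

Reaching the 50K-concurrent-user target mentioned above would mean running many such workers distributed across regions and aggregating their metrics centrally, which is exactly what the listed commercial and open-source tools provide out of the box.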
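Similarly, the CI/CD performance-gate responsibility can be illustrated with a small sketch that compares the latest load-test results against an accepted baseline and fails the pipeline stage on regression. The file names, metric keys, and tolerances here are assumptions for illustration, not an existing convention.

```python
"""Sketch of a performance-regression gate for a CI/CD pipeline (illustrative only)."""
import json
import sys

BASELINE_FILE = "perf_baseline.json"  # e.g. {"p95_latency_s": 1.2, "error_rate": 0.01}
CURRENT_FILE = "perf_current.json"    # produced by the load-test job in the pipeline
LATENCY_TOLERANCE = 1.10              # allow up to 10% p95 latency regression
ERROR_RATE_CEILING = 0.02             # hard ceiling regardless of baseline

def load(path: str) -> dict:
    with open(path) as fh:
        return json.load(fh)

def main() -> int:
    baseline, current = load(BASELINE_FILE), load(CURRENT_FILE)
    failures = []
    if current["p95_latency_s"] > baseline["p95_latency_s"] * LATENCY_TOLERANCE:
        failures.append(
            f"p95 latency {current['p95_latency_s']:.2f}s exceeds "
            f"{LATENCY_TOLERANCE:.0%} of baseline {baseline['p95_latency_s']:.2f}s"
        )
    if current["error_rate"] > ERROR_RATE_CEILING:
        failures.append(
            f"error rate {current['error_rate']:.2%} above ceiling {ERROR_RATE_CEILING:.2%}"
        )
    for msg in failures:
        print(f"PERF GATE FAILED: {msg}")
    return 1 if failures else 0  # non-zero exit code fails the pipeline stage

if __name__ == "__main__":
    sys.exit(main())
```

Wired in as a step after the load-test job, a non-zero exit code from such a script blocks the deployment stage, which is the essence of a performance gate.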
Mandatory skill sets:
Load Testing & Performance Engineering: Load Testing Tools (JMeter/LoadRunner/K6/BlazeMeter - Advanced), Performance Metrics Analysis, Capacity Planning
AI/ML Performance Testing: ML Model Inference Testing, LLM Performance Validation, GPU Workload Testing, Model Serving Platform Testing
Cloud Performance Testing: AWS/Azure/GCP Load Testing Services, Serverless Performance Testing, Auto-scaling Validation
Programming & Automation: Python (Advanced), JavaScript, SQL, Performance Test Script Development
Performance Analysis & Monitoring: APM Tools (New Relic/Datadog/AppDynamics), Performance Bottleneck Analysis, Resource Utilization Monitoring
Enterprise Testing Experience: High-Scale Testing (10K concurrent users), Distributed Load Testing, Performance Benchmarking
Preferred skill sets:
Advanced Load Testing Tools: Gatling, Artillery, Custom Load Testing Framework Development, Cloud-native Testing Platforms
Specialized AI Performance Testing: Vector Database Performance Testing, Embedding Search Optimization, Multi-modal AI Load Testing, Edge AI Performance
Advanced Performance Engineering: Chaos Engineering (Chaos Monkey/Gremlin), Fault Injection Testing, Performance Modeling, Statistical Analysis
Cloud & Infrastructure Testing: Kubernetes Load Testing, Container Performance Testing, Multi-cloud Performance Validation, CDN Testing
Data Engineering Performance: ETL/ELT Pipeline Performance Testing, Stream Processing Load Testing, Big Data Performance Validation
Advanced Monitoring & Observability: Prometheus/Grafana, ELK Stack, Custom Performance Dashboards, Real-time Performance Analytics
DevOps & Infrastructure: Infrastructure as Code Testing, CI/CD Performance Gates, Container Orchestration Testing
Enterprise & Security: Enterprise-scale Testing (100K users), Security Performance Testing, Compliance Testing, Network Performance
Years of experience required: 4-9 Years
Education qualification: B.E./B.Tech
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: MBA (Master of Business Administration), Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: AI Programming, Machine Learning
Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis, and 16 more
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship: No
Government Clearance Required: No
Job Posting End Date:
Required Experience: Senior IC
Full-Time