Job Summary: This position is for an Onshore Test Lead to support the client's Insight suite of products. The Test Lead will be responsible for leading testing efforts for the Insight products, which include gathering requirements, test case creation, test execution, defect logging, and retesting, while coordinating with various stakeholders, including technical teams, business teams, product owners, project managers, external vendors, and offshore teams.
This is a data-centric product, and we are looking for a candidate who is a data enthusiast with a zeal to work with data and to analyze and understand the various metrics derived from it. Experience with testing AI prompting tools is a must.
Essential Job Functions:
- Test various dashboards and certify that the metrics on these dashboards are correct by comparing them with the underlying backend data.
- Test prompt-based AI tools to make sure the prompts return the correct values to the UI.
- Focus on testing to verify the accuracy of this data as various business rules are applied.
- Understand the data flow, validate the content on UI screens, and understand and test the business rules involved in data transformation and data aggregation.
- Create and execute scenarios to test various APIs, preparing request blocks and analyzing the responses in JSON/XML formats.
- Validate the flow of data from disparate sources ingested into multiple databases inside Databricks, after which the data is transformed by pipelines and workflows built within Azure Databricks and Azure Data Factory (the ETL process).
- Thoroughly test the ETL rules built for data transformation and the complex business rules built for data aggregation.
- Strong SQL skills; should possess the ability to understand and write complex queries.
- Execute tests using SQL, Python, or PySpark, per the user stories, to validate the data inside various databases within the Databricks environment.
- Test the various source and target tables in Azure Databricks whose data is sourced, cleansed, transformed, joined, aggregated, and sent to downstream applications.
- Automate recurring QA processes using languages such as Python, PySpark, or Java as needed.
- Design and build an automation framework using PyTest to validate different scenarios and their data. This includes both automating new tests and updating existing scripts.
- Prior exposure to code repository tools: creating branches, raising pull requests, and performing code merges.
- Prior exposure to SonarQube: maintaining code quality, fixing code smells, etc.
- Create and execute detailed manual test cases, as needed, from functional requirements and technical specifications within Jira to ensure quality and accuracy.
- Log appropriate defects within Jira when the product does not conform to specifications.
- Participate in daily stand-ups with the project team as part of the agile methodology.
- Coordinate with development team members on defect validation and assist them in re-creating defects.
- Create appropriate test cases within the TestRail test management tool.
- Update task information in Jira as appropriate to communicate progress to the onshore test lead.
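The dashboard-certification and PyTest-framework duties above can be sketched as a minimal, hypothetical example. All names here (the `sales` table, its columns, and the hard-coded dashboard value) are illustrative assumptions, not part of the actual Insight product; the pattern is simply recomputing a metric from backend data and asserting it matches what the UI shows.

```python
# Hypothetical sketch: certify a dashboard metric against backend data.
# Table/column names (sales, region, amount) and values are illustrative only.
import sqlite3


def load_sample_backend():
    """Build an in-memory stand-in for the backend database."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO sales VALUES (?, ?)",
        [("east", 100.0), ("east", 50.0), ("west", 75.0)],
    )
    return conn


def backend_total(conn, region):
    """Recompute the metric the same way the ETL/business rule would."""
    row = conn.execute(
        "SELECT SUM(amount) FROM sales WHERE region = ?", (region,)
    ).fetchone()
    return row[0]


def test_dashboard_metric_matches_backend():
    """PyTest-style check: UI value must equal the recomputed backend value."""
    conn = load_sample_backend()
    dashboard_value = 150.0  # value read from the UI (hypothetical)
    assert backend_total(conn, "east") == dashboard_value
```

In a real framework the dashboard value would come from an API or UI layer and the backend query would run against Databricks rather than SQLite, but the assertion pattern is the same.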
Minimum Qualifications and Job Requirements:
- 3 years of strong experience writing complex SQL queries.
- 3 years of experience building test automation for data processing within data-intensive projects.
- 3 years of experience in Python data-management programming or PySpark is a must.
- 2 years working with Apache Delta Lake or Databricks; Azure Databricks preferred.
- 1 year of experience testing AI chat/prompting tools.
- Experience with code repository tools: creating branches, raising pull requests, and performing code merges.
- Good understanding of file formats, including JSON, Parquet, Avro, and others.
- Ability to learn new technologies quickly
- Excellent problem-solving skills
- Working understanding of clean-code software development principles.
- Knowledge of Jira
- Ability to handle multiple tasks/projects concurrently and meet deadlines.
- Ability to work in a fast-paced team environment. Expectations include a high level of initiative and a strong commitment to job knowledge, productivity, and attention to detail.
- Solid software engineering skills; has participated in full-lifecycle development on large projects.
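The "complex SQL queries" qualification typically means validation queries like the one below: recompute an aggregate from row-level source data and join it against the transformed target table to surface mismatches. This is a hypothetical sketch using SQLite in-memory tables; the table names (`src`, `tgt`) and the deliberately wrong value are invented for illustration.

```python
# Hypothetical example of an ETL-validation query: compare an aggregated
# target table against totals recomputed from the row-level source.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src (id INTEGER, category TEXT, value REAL);
    CREATE TABLE tgt (category TEXT, total REAL);
    INSERT INTO src VALUES (1, 'a', 10), (2, 'a', 20), (3, 'b', 5);
    INSERT INTO tgt VALUES ('a', 30), ('b', 7);  -- 'b' is deliberately wrong
""")

# Rows where the aggregated target disagrees with the recomputed source total.
mismatches = conn.execute("""
    SELECT t.category, t.total, s.expected
    FROM tgt t
    JOIN (SELECT category, SUM(value) AS expected
          FROM src GROUP BY category) s
      ON s.category = t.category
    WHERE t.total <> s.expected
""").fetchall()

print(mismatches)  # -> [('b', 7.0, 5.0)]
```

Against Databricks the same query would run via Spark SQL over Delta tables, but the validation pattern (aggregate, join, diff) carries over directly.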
Other Responsibilities:
- Maintain technology expertise, keeping current with evolving testing tools, techniques, and strategies to improve overall testing efficiency, processes, and best practices.
- Maintain a focus on customer service, efficiency, quality, and growth.
- Safeguard the company's assets.
- Adhere to the company's compliance program.
- Maintain comprehensive knowledge of industry standards, methodologies, processes, and best practices.