We are seeking a highly motivated and skilled Quality Engineer to join our Data Engineering team in Hyderabad, India. This individual will play a critical role in ensuring the accuracy, reliability, and integrity of our central data lake and reporting products. The ideal candidate will be passionate about data quality, possess a strong understanding of data engineering principles, and be adept at developing and implementing robust quality controls and testing strategies for data pipelines and reporting solutions.
Key Responsibilities
- Design and Execute Quality Strategy: Develop and implement comprehensive test strategies and quality control processes specifically for data ingestion, transformation, storage, and consumption within the data lake and reporting ecosystem.
- Data Quality Testing: Create, execute, and maintain automated and manual data quality tests (including unit, integration, and end-to-end tests) to validate data accuracy, completeness, consistency, and conformity with business rules (a minimal sketch of such checks appears after this list).
- Pipeline Validation: Validate ETL/ELT data pipelines to ensure data is processed correctly and efficiently, focusing on error handling, performance, and scalability.
- Reporting and Dashboard Verification: Thoroughly test and validate the data displayed in business intelligence (BI) reports and dashboards against source data to guarantee accuracy and reliability for business decision-making.
- Quality Gates & Monitoring: Implement quality gates within the CI/CD pipeline and establish data quality monitoring dashboards to proactively identify and alert on data anomalies and pipeline failures.
- Collaboration: Work closely with Data Engineers, Data Scientists, and Product Managers to understand data requirements, define quality metrics, and integrate quality assurance throughout the data development lifecycle.
- Documentation: Document test plans, test cases, and quality control processes, and track and report on data quality metrics and issues.
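As an illustration of the kind of automated data quality checks described under Data Quality Testing and Pipeline Validation, the sketch below uses Python with pytest. The table names, columns, and the in-memory SQLite connection are illustrative assumptions, not details of our actual stack; a real suite would run against the warehouse the pipelines load.

```python
# Minimal sketch of automated data quality checks (completeness, consistency,
# conformity). Table names, columns, and the SQLite in-memory database are
# illustrative placeholders so the example is self-contained and runnable.
import sqlite3

import pytest


@pytest.fixture()
def conn():
    # A real suite would connect to the target warehouse; SQLite stands in here.
    con = sqlite3.connect(":memory:")
    con.executescript(
        """
        CREATE TABLE raw_orders (order_id INTEGER, amount REAL);
        CREATE TABLE curated_orders (order_id INTEGER, amount REAL);
        INSERT INTO raw_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
        INSERT INTO curated_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
        """
    )
    yield con
    con.close()


def test_row_count_parity(conn):
    # Completeness: no rows dropped between ingestion and the curated layer.
    raw = conn.execute("SELECT COUNT(*) FROM raw_orders").fetchone()[0]
    curated = conn.execute("SELECT COUNT(*) FROM curated_orders").fetchone()[0]
    assert raw == curated


def test_no_null_business_keys(conn):
    # Conformity: the business key must always be populated.
    nulls = conn.execute(
        "SELECT COUNT(*) FROM curated_orders WHERE order_id IS NULL"
    ).fetchone()[0]
    assert nulls == 0


def test_amount_totals_match(conn):
    # Consistency: aggregate totals agree between source and curated layers.
    raw_total = conn.execute("SELECT SUM(amount) FROM raw_orders").fetchone()[0]
    cur_total = conn.execute("SELECT SUM(amount) FROM curated_orders").fetchone()[0]
    assert raw_total == pytest.approx(cur_total)
```

Checks like these can also back the quality gates mentioned above: wiring the suite into the CI/CD pipeline lets a failed assertion block a deployment before bad data reaches reports.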
Required Qualifications
- 5 years of experience in Quality Assurance or Quality Engineering with a focus on data-centric testing or data warehousing environments.
- Strong proficiency in SQL for data querying and validation (see the reconciliation sketch after this list).
- Experience in developing test automation frameworks and scripts for data pipelines using tools/languages like Python, PHP, or JavaScript.
- Solid understanding of cloud-based data warehousing and data lake concepts (e.g., AWS S3/Glue, Azure Data Lake/Synapse, Google Cloud Storage/BigQuery).
- Familiarity with ETL/ELT tools and data pipeline orchestration (e.g., Apache Airflow, dbt, Talend).
- Experience with BI tools (e.g., Tableau, Looker, Power BI) for report validation and testing.
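To illustrate the SQL-driven validation this role relies on, here is a hedged sketch of a source-to-target reconciliation that could run as a CI/CD quality gate: it finds rows that differ between two layers and exits non-zero so a pipeline step can fail fast. The table names, sample data, and SQLite connection are assumptions made for the sake of a runnable example, not a description of our platform.

```python
"""Source-to-target reconciliation sketch, usable as a CI/CD quality gate.

The connection, table names, and seed data are illustrative placeholders;
a real implementation would target the team's warehouse and be invoked from
the pipeline's gate stage.
"""
import sqlite3
import sys

# Rows present in one layer but not the other (symmetric difference).
RECONCILIATION_SQL = """
SELECT * FROM (
    SELECT order_id, amount FROM raw_orders
    EXCEPT
    SELECT order_id, amount FROM curated_orders
)
UNION ALL
SELECT * FROM (
    SELECT order_id, amount FROM curated_orders
    EXCEPT
    SELECT order_id, amount FROM raw_orders
)
"""


def main() -> int:
    con = sqlite3.connect(":memory:")
    # Seed both layers with sample data (one deliberate mismatch) so the
    # sketch runs end to end without a real warehouse.
    con.executescript(
        """
        CREATE TABLE raw_orders (order_id INTEGER, amount REAL);
        CREATE TABLE curated_orders (order_id INTEGER, amount REAL);
        INSERT INTO raw_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
        INSERT INTO curated_orders VALUES (1, 10.0), (2, 25.5), (3, 9.99);
        """
    )
    mismatches = con.execute(RECONCILIATION_SQL).fetchall()
    if mismatches:
        print(f"Reconciliation failed: {len(mismatches)} mismatched row(s)")
        for row in mismatches:
            print(f"  {row}")
        return 1  # Non-zero exit fails the quality gate.
    print("Reconciliation passed: source and curated layers agree")
    return 0


if __name__ == "__main__":
    sys.exit(main())
```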
Preferred Qualifications
- Experience with big data technologies.
- Knowledge of data governance, metadata management, and master data management principles.
- Familiarity with performance testing and optimization techniques for large-scale data systems.
- Experience working in an Agile/Scrum development environment.
- Demonstrated technical leadership in steering testing efforts.
Required Experience:
IC (Individual Contributor)