Job Overview
The Data Quality Engineer will ensure the accuracy, consistency, and reliability of data across the Microsoft Fabric Data Warehouse, its integrations, and connected services. The role focuses on testing and validating data pipelines, event-driven processes, and Azure-based integrations such as Logic Apps and Service Bus. The engineer will collaborate closely with data engineers, architects, and business units to maintain high-quality, trustworthy data for analytics and reporting.
Key Responsibilities
- Design, develop, and execute data quality and validation tests for Microsoft Fabric Data Warehouse objects (tables, views, semantic models).
- Validate ETL/ELT pipelines, including Fabric Dataflows and integrated source systems, for accuracy and completeness.
- Test and monitor event-driven data processes, including Azure Service Bus messages, queues, and topics.
- Validate Azure Logic Apps workflows and integrations that move or transform data between systems.
- Perform end-to-end testing of data ingestion, transformations, and delivery to downstream consumers or BI systems.
- Develop and maintain automated data quality checks, reconciliation scripts, and test frameworks.
- Identify, analyze, and report data quality issues, including root cause analysis and impact assessment.
- Collaborate with Data Engineers, BI teams, and business units to define data quality rules, metrics, and acceptance criteria.
- Monitor data quality KPIs, implement controls to prevent defects, and ensure reliability in production pipelines.
- Document test cases, workflows, data quality rules, and testing outcomes clearly for technical and business stakeholders.
- Apply unit testing, automated data validation, and integration testing tools such as dbt, Great Expectations, tSQLt, pytest, Azure Logic Apps testing, and Service Bus testing scripts.
- Perform unit testing and validation of Azure Logic Apps workflows and Service Bus message flows (see the Service Bus sketch after this list).
- Develop and maintain automated data quality checks and integration tests using tools such as dbt, Great Expectations, or custom scripts (a minimal reconciliation sketch follows this list).
- Validate and test API endpoints and integration workflows using tools such as Postman to ensure data is accurately transmitted between systems.
- Perform end-to-end testing of event-driven processes and service integrations using Postman and automated scripts.
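To make the automated-checks responsibility concrete, here is a minimal sketch of a reconciliation test written with pytest and pyodbc. The connection strings (read from environment variables), the table pair, and the driver setup are illustrative assumptions, not details from this posting:

```python
# Minimal reconciliation sketch: compare row counts between a source
# system and the Fabric warehouse over ODBC. All connection details
# and table names are placeholders, not values from this posting.
import os

import pyodbc
import pytest

SOURCE_DSN = os.environ["SOURCE_DSN"]   # assumed env var: source ODBC string
FABRIC_DSN = os.environ["FABRIC_DSN"]   # assumed env var: Fabric SQL endpoint


def row_count(dsn: str, table: str) -> int:
    """Return COUNT(*) for a table over an ODBC connection."""
    with pyodbc.connect(dsn) as conn:
        cursor = conn.cursor()
        # Table names here come from trusted test parameters below.
        cursor.execute(f"SELECT COUNT(*) FROM {table}")
        return cursor.fetchone()[0]


@pytest.mark.parametrize("source_table, warehouse_table", [
    ("sales.orders", "dbo.fact_orders"),   # hypothetical table pair
])
def test_row_counts_reconcile(source_table, warehouse_table):
    # Completeness check: every source row should land in the warehouse.
    assert row_count(SOURCE_DSN, source_table) == row_count(FABRIC_DSN, warehouse_table)
```

Similarly, a sketch of how Service Bus message flows might be validated with the azure-servicebus SDK (v7 API). The connection string, queue name, and required payload fields are assumptions:

```python
# Illustrative Service Bus validation: receive messages from a queue and
# assert each JSON payload carries the fields downstream consumers expect.
# Queue name, connection string, and field list are assumptions.
import json
import os

from azure.servicebus import ServiceBusClient

CONN_STR = os.environ["SERVICEBUS_CONN_STR"]            # assumed env var
QUEUE_NAME = "data-events"                              # hypothetical queue
REQUIRED_FIELDS = {"event_id", "entity", "timestamp"}   # assumed schema


def validate_queue_messages(max_messages: int = 10) -> list[str]:
    """Return a list of validation errors found in received messages."""
    errors = []
    with ServiceBusClient.from_connection_string(CONN_STR) as client:
        receiver = client.get_queue_receiver(queue_name=QUEUE_NAME, max_wait_time=5)
        with receiver:
            for msg in receiver.receive_messages(max_message_count=max_messages):
                payload = json.loads(str(msg))          # assumes JSON bodies
                missing = REQUIRED_FIELDS - payload.keys()
                if missing:
                    errors.append(f"{payload.get('event_id')}: missing {missing}")
                receiver.complete_message(msg)
    return errors


if __name__ == "__main__":
    for err in validate_queue_messages():
        print(err)
```

Receiving and completing messages is destructive, so a check like this would normally run against a dedicated test queue, or use the receiver's peek_messages for a read-only pass.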
Qualifications:
Required Skills & Qualifications
- Strong experience in data quality testing and validation for data warehouses or large-scale analytics platforms.
- Hands-on experience with Microsoft Fabric (Data Warehouse, Lakehouse, Pipelines, Dataflows).
- Proficiency in SQL and experience validating ETL/ELT pipelines.
- Experience with event-driven architectures and Azure integrations:
- Azure Service Bus (queues, topics, subscriptions)
- Azure Logic Apps for workflow automation
- Experience testing data integrations across multiple source systems.
- Familiarity with data quality concepts: completeness, accuracy, consistency, timeliness, uniqueness (a minimal sketch of such checks follows this list).
- Experience with automation and scripting (Python, PySpark, PowerShell).
- Understanding of data modeling concepts: star schema, snowflake, fact and dimension tables.
- Strong analytical, problem-solving, and troubleshooting skills.
- Excellent documentation and communication skills.
- Familiarity with unit testing, automated data validation, and integration testing tools such as dbt, Great Expectations, tSQLt, pytest, Postman, Azure Logic Apps testing, and Service Bus testing scripts.
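As a quick illustration of those data quality dimensions, and of the fact/dimension integrity implied by the star-schema requirement, a small pandas sketch; the tables and column names are hypothetical:

```python
# Illustrative checks for completeness, uniqueness, and referential
# integrity between a hypothetical fact table and its dimension.
import pandas as pd

# Hypothetical dimension and fact tables (star schema).
dim_customer = pd.DataFrame({"customer_id": [1, 2, 3]})
fact_orders = pd.DataFrame({
    "order_id":    [10, 11, 12, 12],   # 12 duplicated: uniqueness defect
    "customer_id": [1, 2, None, 4],    # None: completeness defect; 4: orphan key
})

# Completeness: share of missing foreign keys.
missing_ratio = fact_orders["customer_id"].isna().mean()

# Uniqueness: duplicate primary keys in the fact table.
duplicate_keys = fact_orders["order_id"].duplicated().sum()

# Consistency / referential integrity: fact keys must exist in the dimension.
orphans = ~fact_orders["customer_id"].dropna().isin(dim_customer["customer_id"])

print(f"missing FK ratio: {missing_ratio:.0%}")
print(f"duplicate order_ids: {duplicate_keys}")
print(f"orphan fact rows: {int(orphans.sum())}")
```

In practice the same assertions would run against warehouse extracts rather than in-memory frames, and their results would feed the data quality KPIs described under the responsibilities.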
Preferred / Nice to Have:
- Experience with data quality frameworks (e.g., Great Expectations).
- Knowledge of Azure Data Factory, Synapse Analytics, and OneLake.
- Familiarity with CI/CD pipelines for data platforms.
- Understanding of data governance, metadata management, and auditing standards.
Additional Information:
Soft skills
- Strong analytical skills and the capacity to challenge the financial information received
- Highly organised and able to manage multiple tasks with strong attention to detail
- Excellent communication skills with the ability to interact with international stakeholders
- Curious, proactive, keen to learn, and ready for new challenges
- Ability to work independently while also having a team-oriented mindset.
Languages
- Excellent knowledge of English (written and verbal communication)
- Knowledge of any other language, particularly French, is a plus
Remote Work:
No
Employment Type:
Full-time