HM Note: This onsite contract role is in office every day at the manager's discretion. Candidate resumes must include first and last name, email, and telephone contact information.
Description
Scope:
- The Office of the Public Guardian and Trustee (OPGT) requires a Senior Software Developer ETL to validate the data migration from the existing OPGT legacy applications to the new Dynamics 365 OPGT solution.
Assignment Deliverables:
As a member of the data migration team, you will be responsible for validating the quality of the data migration from the legacy application to Dynamics in preparation for go-live. A high-level list of deliverables follows:
- Data Analysis: analyze the existing data in the legacy applications to understand its structure, quality, and relationships.
- Data Mapping and Transformation: read the existing data migration code to understand the mapping of data elements from the legacy application to the corresponding entities and fields in Dynamics 365 CE.
- Data Testing and Quality Assurance: conduct thorough testing to verify the accuracy and integrity of the migrated data; define test cases, perform data reconciliation, and address any issues or discrepancies that arise during the testing phase; develop KPIs to report on the progress, completeness, and quality of the data migration effort (see the reconciliation sketch after this list).
- Data Migration Test Plans: develop comprehensive test plans outlining the testing approach, scope, objectives, and the resources required for data migration quality assurance.
- Data Migration Test Cases: create detailed test cases and test scripts covering all aspects of data migration, including data extraction, transformation, loading, and validation.
- Data Fix Development: modify the data migration code to address issues identified during migration.
- Documentation: maintain detailed documentation of test cases, test results, and any modifications made to the test plan during the project.
- Ongoing Support: provide post-migration support; analyze and address data-related issues or questions; help optimize data management processes in the new environment.
- Other duties as assigned.
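The data reconciliation work described in the Data Testing and Quality Assurance deliverable is typically scripted. Below is a minimal, hypothetical Python sketch of a row-count reconciliation between a legacy SQL Server database and a migration staging database; the table names, mapping, and connection strings are illustrative placeholders, not details from this assignment.

```python
# Row-count reconciliation sketch (hypothetical table names and connections).
# Compares counts between legacy tables and their staging equivalents and
# reports any mismatch for follow-up by the data migration team.
import pyodbc

# Placeholder mapping of legacy tables to staging tables.
TABLE_MAP = {
    "dbo.Clients": "stg.Clients",
    "dbo.Accounts": "stg.Accounts",
}

def row_count(conn, table):
    """Return the number of rows in the given table."""
    cursor = conn.cursor()
    cursor.execute(f"SELECT COUNT(*) FROM {table}")
    return cursor.fetchone()[0]

def reconcile(legacy_conn_str, staging_conn_str):
    """Compare counts per table and return a list of discrepancy messages."""
    legacy = pyodbc.connect(legacy_conn_str)
    staging = pyodbc.connect(staging_conn_str)
    issues = []
    try:
        for legacy_table, staging_table in TABLE_MAP.items():
            src = row_count(legacy, legacy_table)
            dst = row_count(staging, staging_table)
            if src != dst:
                issues.append(f"{legacy_table}: legacy={src}, staging={dst}, diff={src - dst}")
    finally:
        legacy.close()
        staging.close()
    return issues

if __name__ == "__main__":
    # In practice the connection strings would come from configuration or Key Vault.
    legacy_cs = "DRIVER={ODBC Driver 18 for SQL Server};SERVER=legacy-host;DATABASE=LegacyDB;Trusted_Connection=yes"
    staging_cs = "DRIVER={ODBC Driver 18 for SQL Server};SERVER=staging-host;DATABASE=StagingDB;Trusted_Connection=yes"
    for issue in reconcile(legacy_cs, staging_cs):
        print(issue)
```

In practice, row counts would be supplemented with field-level checksums or sampled record comparisons, and the results fed into the data migration KPIs noted above.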
The Vendor's Personnel will also be required to:
- Complete work and achieve milestones within the assigned deadlines.
- Notify the Cluster/Ministry Project Manager in writing of any issues or other material concerns related to the Assignment Deliverables as soon as they become aware of them.
- Submit Deliverables for the Cluster/Ministry approval as they are completed.
- Comply with the Ontario Government and the Cluster/Ministry security procedures and practices.
- Comply with the Ontario Government and the Cluster/Ministry architecture/technology standards and best practices.
- Comply with the Ontario Government and the Cluster/Ministry Conflict of Interest and Confidentiality Guidelines.
- Provide knowledge and skill transfer to designated Cluster/Ministry staff; and comply with the Ontario Government I&IT Directive, the Operational Policy on the I&IT Project Gateway Process, and other applicable Guidelines, Standards and Procedures.
Skills
Experience and Skillset Requirements
Mandatory Requirements
- 5 years of proven working experience in an ETL role; strong understanding of ETL principles, including data extraction, transformation, and loading processes; knowledge of common ETL design patterns; understanding of data pipeline architectures, Azure workflow orchestration tools, and concepts related to data ingestion, transformation, and movement.
- Proficiency in Azure Data Factory, Azure Synapse workspaces, and PolyBase, including knowledge of pipeline creation, data flows, integration runtimes, triggers, and monitoring.
- Knowledge of integration technologies commonly used with Dynamics, such as Dataverse / Common Data Service (CDS), Data Entities, and APIs (a minimal query sketch follows this list).
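For context on the Dataverse integration point above, this is a minimal, hypothetical Python sketch of querying migrated records through the Dataverse Web API; the environment URL, entity set name (accounts), and bearer token are assumptions for illustration only, not details from this posting.

```python
# Dataverse Web API query sketch (hypothetical environment, entity, and token).
# Retrieves one page of records plus the @odata.count annotation, which can be
# compared against legacy source counts during reconciliation.
import requests

ENV_URL = "https://yourorg.crm.dynamics.com"  # placeholder environment URL
TOKEN = "placeholder-azure-ad-bearer-token"   # acquired via Azure AD in practice

headers = {
    "Authorization": f"Bearer {TOKEN}",
    "Accept": "application/json",
    "OData-MaxVersion": "4.0",
    "OData-Version": "4.0",
}

# $count=true adds an @odata.count annotation (Dataverse caps it at 5,000;
# larger sets need paging or an aggregate query).
response = requests.get(
    f"{ENV_URL}/api/data/v9.2/accounts?$select=name&$count=true",
    headers=headers,
    timeout=30,
)
response.raise_for_status()
payload = response.json()
print("records on this page:", len(payload["value"]))
print("count annotation:", payload.get("@odata.count"))
```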
Nice to Have Requirements
- Azure cloud certifications (e.g., Azure Fundamentals, Azure Data Engineer Associate, Azure Database Administrator Associate)
- Experience with the PowerApps platform, Power Automate, Dynamics CE and F&O
Desired Skills and Experience
- 5 years of proven working experience in an ETL role; strong understanding of ETL principles, including data extraction, transformation, and loading processes; knowledge of common ETL design patterns; understanding of data pipeline architectures, Azure workflow orchestration tools, and concepts related to data ingestion, transformation, and movement.
- Experience in integrating various data sources and systems, both on-premises and in the cloud, using Azure ETL services or other ETL tools
- Knowledge of integration technologies commonly used with Dynamics, such as Dataverse / Common Data Service (CDS), Data Entities, and APIs.
- Expertise in data transformation techniques such as data cleansing, aggregation, enrichment, and normalization using Azure cloud technologies
- Understanding of data quality management practices, including data profiling, data validation, and error handling within ETL processes (see the profiling sketch after this list).
- Understanding of data governance principles and data privacy regulations; experience working with high-sensitivity data; knowledge of best practices for data security and compliance in Azure.
- Ability to monitor and troubleshoot ETL processes, optimize query performance, and implement efficient data processing techniques in Azure.
- Proficiency in Azure Data Factory, Azure Synapse workspaces, and PolyBase, including knowledge of pipeline creation, data flows, integration runtimes, triggers, and monitoring.
- Strong SQL skills and experience working with Azure SQL Databases and Dataverse; good understanding of Azure storage concepts and technologies.
- Proficiency in scripting languages like Python, and experience with Azure-specific scripting using PowerShell or Azure CLI.
- Expert in data manipulation languages (T-SQL, PL/SQL), data definition languages, physical database design, data modelling, and query performance analysis & tuning
- Familiarity with version control systems (e.g., Azure Repos) and collaboration tools (e.g., Azure DevOps) for managing code, tracking changes, and collaborating with team members.
- Experience with continuous integration/continuous deployment (CI/CD) processes around DevOps data workflows and Synapse workspaces.
- Experience with SQL Server Management Studio, Azure data management tools, XrmToolBox, and data modeling tools (preferably ERWIN).
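To make the data profiling and validation items above concrete, here is a minimal, hypothetical Python sketch using pandas; the column names and rules are placeholders chosen for illustration rather than anything specified by the project.

```python
# Data profiling / validation sketch (hypothetical columns and rules).
# Profiles a migrated extract and applies a few simple validation checks,
# collecting failures so they can be reported as data quality KPIs.
import pandas as pd

def profile(df):
    """Return per-column dtypes, null counts, and distinct counts."""
    return pd.DataFrame({
        "dtype": df.dtypes.astype(str),
        "nulls": df.isna().sum(),
        "distinct": df.nunique(),
    })

def validate(df):
    """Apply simple row-level rules and return a list of failure messages."""
    failures = []
    # Rule 1: primary identifier must be present and unique.
    if df["client_id"].isna().any():
        failures.append("client_id has null values")
    if df["client_id"].duplicated().any():
        failures.append("client_id has duplicate values")
    # Rule 2: dates must parse and must not be in the future.
    opened = pd.to_datetime(df["file_opened_date"], errors="coerce")
    if opened.isna().any():
        failures.append("file_opened_date has unparseable values")
    if (opened > pd.Timestamp.today()).any():
        failures.append("file_opened_date has future-dated values")
    return failures

if __name__ == "__main__":
    # In practice this extract would come from the staging database or Dataverse.
    extract = pd.read_csv("migrated_clients_extract.csv")
    print(profile(extract))
    for message in validate(extract):
        print("FAIL:", message)
```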
Resume Evaluation Criteria:
Criteria 1: Data Migration ETL - 40 Points
- Demonstrated experience with ETL development, data pipelines, workflow orchestration, and data ingestion, transformation, and movement
- Demonstrated experience in integrating various data sources and systems, both on-premises and in the cloud, using Azure ETL services or other ETL tools
- Demonstrated experience working with Azure Data Factory, Azure Synapse workspaces, and PolyBase, including knowledge of pipeline creation, data flows, integration runtimes, triggers, and monitoring.
- Demonstrated experience with data manipulation languages (T-SQL, PL/SQL), data definition languages, and query performance analysis & tuning
- Demonstrated experience with SQL Server, Oracle, and Azure SQL Databases
- Demonstrated experience with data modeling tools (preferably ERWIN)
- Demonstrated experience in scripting languages like Python and with Azure-specific scripting using PowerShell or Azure CLI.
- Experience with the software development lifecycle
- Experience with data modeling, physical database design, and data flow diagrams
Criteria 2: Azure Platform - 20 Points
- Experience with Azure Data Factory (ADF) and Synapse Workspaces
- Demonstrated experience with Azure data management tools, DevOps, and Synapse Studio
- Experience in Azure resource configuration and administration, such as Azure Data Lake, Blob Storage, Key Vault, Application Insights resources, resource groups, and subscriptions.
- Familiarity with the Azure cloud platform
- Azure cloud certifications
Criteria 3: Dynamics 365 - 20 Points
- Demonstrated experience working with integration technologies commonly used with Dynamics, such as Dataverse / Common Data Service (CDS), Data Entities, and APIs.
- Demonstrated experience with the PowerApps platform, Power Automate, Dynamics CE and F&O
Criteria 4: DevOps and CI/CD - 20 Points
- Demonstrated experience with continuous integration/continuous deployment (CI/CD) tools and processes around DevOps data workflows and Synapse workspaces.
Knowledge Transfer
What needs to be transferred:
- Document the tasks executed and those in progress as a member of the data migration team, using the approach and tools required by the Project Manager.
To whom:
- Project Manager / Team members
When:
- 1:1 meetings / team meetings / documentation in DevOps Wiki and boards throughout the duration of the project life cycle.
Must-haves:
- 5 years of proven working experience in an ETL role; strong understanding of ETL principles, including data extraction, transformation, and loading processes; knowledge of common ETL design patterns; understanding of data pipeline architectures, Azure workflow orchestration tools, and concepts related to data ingestion, transformation, and movement.
- Proficiency in Azure Data Factory, Azure Synapse workspaces, and PolyBase, including knowledge of pipeline creation, data flows, integration runtimes, triggers, and monitoring.
- Knowledge of integration technologies commonly used with Dynamics, such as Dataverse / Common Data Service (CDS), Data Entities, and APIs.
Nice to have:
- Azure cloud certifications (e.g., Azure Fundamentals, Azure Data Engineer Associate, Azure Database Administrator Associate)
- Experience with the PowerApps platform, Power Automate, Dynamics CE and F&O