HM Note: This onsite contract role is in office every day at the manager's discretion. Candidate resumes must include first and last name, email, and telephone contact information. This role commences April 1, 2026.
Description
The work will initially focus on the Early Child Development (ECD) program; however, resources engaged under this contract will support multiple MCCSS programs and applications over the term of the engagement. Work may include the development of new data and reporting products as required to address ministry priorities and emerging needs.
General Responsibilities
Design, develop, and implement an ingestion framework from the Oracle data source to Azure Data Lake - initial load and incremental ETL. Tools used are:
Azure Data Factory (good knowledge required) to maintain pipeline from Oracle to Azure Data Lake
Azure Databricks/PySpark (good Python/PySpark knowledge required) to build transformations of raw data into curated zone in the data lake
Azure Databricks/PySpark/SQL (good SQL knowledge required) to develop and/or troubleshoot transformations of curated data into datamart model
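To illustrate the raw-to-curated step described above (this is a sketch only, not code prescribed by the posting), a curation job typically deduplicates landed records by business key, keeps the latest version, and cleanses fields. The logic is shown in plain Python with hypothetical column names (`id`, `name`, `updated_at`); on Databricks the same logic would be expressed in PySpark.

```python
from datetime import datetime

def curate(raw_rows):
    """Deduplicate raw records by business key, keeping the most recently
    updated version, and trim string fields - the kind of transformation a
    raw-to-curated step performs (column names are hypothetical)."""
    latest = {}
    for row in raw_rows:
        key = row["id"]
        # keep only the most recently updated version of each record
        if key not in latest or row["updated_at"] > latest[key]["updated_at"]:
            latest[key] = row
    # cleanse: strip surrounding whitespace from string fields
    return [
        {k: v.strip() if isinstance(v, str) else v for k, v in row.items()}
        for row in latest.values()
    ]

raw = [
    {"id": 1, "name": " Alice ", "updated_at": datetime(2026, 1, 1)},
    {"id": 1, "name": "Alice B", "updated_at": datetime(2026, 2, 1)},
    {"id": 2, "name": "Bob",     "updated_at": datetime(2026, 1, 5)},
]
curated = curate(raw)  # two rows survive: latest version of id 1, plus id 2
```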
Review the requirements, database tables, and database relationships; identify gaps and inefficiencies in the current production reporting environment and provide recommendations to address them in the new platform.
Design the ingestion framework and CDC - tools used are Oracle GoldenGate and Azure Data Factory
Prepare design artifacts
Work with the IT partner on configuration of GoldenGate; responsible for providing direction and how-to guidance.
Maintain dynamic pipeline for ETL ingestion to add new tables and data elements
Data design - physical model mapping from data source to reporting destination.
Understand the requirements. Recommend changes to the physical model to support ETL design.
Reverse engineer and document existing SQL logic to improve design effort
Assist with data modelling and updates of source-to-target mapping documentation
Develop scripts for the physical model and update database and/or data lake structure.
Access Oracle DB, SQL Server, and Azure environments using SSIS, SQL Developer, Azure Data Studio, Azure Data Factory, Databricks, and other tools to develop the solution.
Proactively communicate with business and IT experts on any changes required to conceptual, logical, and physical models; communicate and review timelines, dependencies, and risks.
Development of ETL strategy and solution for different sets of data modules
Understand the Tables and Relationships in the data model.
Create low-level design documents and test cases for ETL development.
Create the workflows and pipeline design
Development and testing of data pipelines with Incremental and Full Load.
Develop high quality ETL mappings/scripts/notebooks
Develop and maintain the pipeline from the Oracle data source to Azure Data Lake and Databricks SQL Warehouse
Develop ETL to update data marts built in Databricks SQL Warehouse
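The initial-load and incremental-load pattern this posting refers to is commonly driven by a watermark (a high-water-mark timestamp or sequence persisted between pipeline runs). The sketch below shows that control logic in plain Python with hypothetical field names; in Azure Data Factory the watermark would typically live in a control table and drive the copy activity's source query.

```python
def extract(source_rows, watermark=None):
    """Full load when no watermark exists yet; otherwise an incremental
    load of rows modified after the stored watermark. Returns the batch
    and the new watermark to persist for the next run."""
    if watermark is None:
        batch = list(source_rows)                       # initial full load
    else:
        batch = [r for r in source_rows if r["modified"] > watermark]
    # advance the watermark to the newest change seen in this batch
    new_watermark = max((r["modified"] for r in batch), default=watermark)
    return batch, new_watermark

rows = [{"pk": i, "modified": i} for i in range(5)]
full, wm = extract(rows)                                # first run: all rows
delta, wm2 = extract(rows + [{"pk": 5, "modified": 5}], watermark=wm)
```

On the next run, only the one newly modified row is extracted, which is what keeps incremental ETL cheap relative to a full reload.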
Perform unit testing
Ensure performance monitoring and improvement
Conduct performance reviews and data consistency checks
Troubleshoot performance and ETL issues; log activity for each pipeline and transformation.
Review and optimize overall ETL performance.
End-to-end integrated testing for Full Load and Incremental Load
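A basic form of the data consistency checking mentioned above is source-to-target reconciliation after each load. The sketch below (an illustration with hypothetical key names, not a prescribed tool) compares row counts and key sets and reports discrepancies.

```python
def reconcile(source_rows, target_rows, key="pk"):
    """Compare row counts and business-key sets between source and target
    after a load; returns a dict of discrepancies (empty means consistent)."""
    src_keys = {r[key] for r in source_rows}
    tgt_keys = {r[key] for r in target_rows}
    issues = {}
    if len(source_rows) != len(target_rows):
        issues["row_count"] = (len(source_rows), len(target_rows))
    if src_keys - tgt_keys:
        issues["missing_in_target"] = sorted(src_keys - tgt_keys)
    if tgt_keys - src_keys:
        issues["extra_in_target"] = sorted(tgt_keys - src_keys)
    return issues

src = [{"pk": 1}, {"pk": 2}, {"pk": 3}]
tgt = [{"pk": 1}, {"pk": 2}]
report = reconcile(src, tgt)  # flags the row lost in transit
```

In practice the same comparison would run as queries against Oracle and the Databricks SQL Warehouse, but the pass/fail logic is the same.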
Plan for Go Live Production Deployment.
Create production deployment steps.
Configure parameters and scripts for go-live. Test and review the instructions.
Create release documents and help build and deploy code across servers.
Go Live Support and Review after Go Live.
Review existing ETL processes and tools and provide recommendations on improving performance and reducing ETL timelines.
Review infrastructure and remediate issues for overall process improvement
Knowledge transfer to Ministry staff; development of documentation on the work completed.
Document the work and share the end-to-end ETL design, troubleshooting steps, configuration, and scripts for review.
Transfer documents, scripts, and document reviews to the Ministry.
Skills
Experience and Skill Set Requirements
Experience:
7 years of experience working with SQL Server T-SQL, Oracle PL/SQL development, or similar relational databases (must-have)
2 years of experience working with Azure Data Factory, Databricks, and Python development (must-have)
Experience building data ingestion and change data capture using Oracle GoldenGate (nice-to-have)
Experience building databases, data warehouses, and dimensional data marts, and working with delta and full loads (must-have)
Experience with data modeling and tools, e.g., SAP PowerDesigner, Visio, or similar (must-have)
Experience with dimensional modeling. Experience in designing data warehouse solutions using slowly changing dimensions (must-have)
Experience working with SQL Server SSIS or other ETL tools; solid knowledge of and experience with SQL scripting (must-have)
Experience developing in an Agile environment
Understanding of data warehouse architecture with a delta lake and dimensional model (must-have)
Ability to analyze, design, develop, test, and document ETL pipelines from detailed and high-level specifications, and assist in troubleshooting.
Ability to utilize SQL to perform DDL tasks and complex queries
Good knowledge of database performance optimization techniques
Ability to assist in the requirements analysis and subsequent developments
Ability to conduct unit testing and assist in test preparations to ensure data integrity
Work closely with Designers Business Analysts and other Developers
Liaise with Project Managers Quality Assurance Analysts and Business Intelligence Consultants
Design and implement technical enhancements of Data Warehouse as required.
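The slowly-changing-dimensions experience listed above usually means SCD Type 2: when a tracked attribute changes, the current dimension row is expired and a new version is inserted, preserving history. A minimal sketch with hypothetical columns (`key`, `attr`, `effective_from`/`effective_to`, `is_current`) - in Databricks this would typically be a Delta `MERGE`, but the versioning logic is the same:

```python
def scd2_apply(dimension, incoming, today):
    """Apply one incoming record to an SCD Type 2 dimension (a list of
    dicts). Attribute changes expire the current row and append a new
    current version; unchanged records are a no-op."""
    for row in dimension:
        if row["key"] == incoming["key"] and row["is_current"]:
            if row["attr"] == incoming["attr"]:
                return dimension                 # no change: nothing to do
            row["effective_to"] = today          # expire the current version
            row["is_current"] = False
            break
    dimension.append({
        "key": incoming["key"], "attr": incoming["attr"],
        "effective_from": today, "effective_to": None, "is_current": True,
    })
    return dimension

dim = [{"key": 10, "attr": "North", "effective_from": "2025-01-01",
        "effective_to": None, "is_current": True}]
dim = scd2_apply(dim, {"key": 10, "attr": "East"}, today="2026-04-01")
```

After the change, the dimension holds two versions of key 10: the expired "North" row and the current "East" row, so reports can join facts to the version in effect at the time.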
Skills:
7 years using ETL tools such as Microsoft SSIS, stored procedures, and T-SQL (Must Have)
2 years with Azure Data Lake and Databricks, building Azure Data Factory and Azure Databricks pipelines (Must Have)
2 years Python and PySpark (Must Have)
Oracle GoldenGate
SQL Server
Oracle
Ability to present technical requirements to the business
Assets:
Knowledge and experience building data ingestion and historical change data capture using Oracle GoldenGate is an asset.
Evaluation Criteria
Design, Documentation, and Analysis Skills (30 points)
Demonstrated experience in creating both Functional Design Documents (FDD) and Detailed Design Documents (DDD).
Experience in fit-gap analysis, system use case reviews, requirements reviews, and coding exercises and reviews.
Experience in developing and maintaining a plan to address contract deliverables through the identification of significant milestones and expected results, with weekly status reporting.
Work with the Client and Developer(s) assigned to refine/confirm business requirements.
Participate in defect fixing, testing support, and development activities for ETL and reporting.
Analyze and document solution complexity and interdependencies by function, including providing support for data validation.
Development, Database, and ETL Experience (60 points)
Demonstrated experience in database and ETL development (7 years)
Experience in developing in an agile Azure DevOps environment
Experience in application mapping to populate data warehouse and dimensional data mart schemas
Demonstrated experience in Extract, Transform, and Load software development (7 years)
Experience in providing ongoing support for Azure pipeline/configuration and SQL Server SSIS development
Experience building data ingestion and change data capture using GoldenGate
Assist in the development of pre-defined and ad hoc reports, meeting coding and accessibility requirements.
Demonstrated experience with Oracle and SQL Server databases
Proficient in SQL and Python
Implementing logical and physical data models
Knowledge Transfer (10 points)
The Developer must have previous work experience conducting knowledge transfer and training sessions, ensuring that resources receive the required knowledge to support the system. The resource must develop learning activities using a review-watch-do methodology and demonstrate the ability to prepare and present.
Development of documentation and materials as part of a review and knowledge transfer to other team members
Development and facilitation of classroom-based or virtual instructor-led demo sessions for Developers
Monitor identified milestones and submission of status reports to ensure Knowledge Transfer is fully completed
Must Have:
7 years using ETL tools such as Microsoft SSIS, stored procedures, and T-SQL (Must Have)
2 years with Azure Data Lake and Databricks, building Azure Data Factory and Azure Databricks pipelines (Must Have)
2 years Python and PySpark (Must Have)
Oracle GoldenGate
SQL Server
Oracle
Ability to present technical requirements to the business