Overview
The Client is undertaking a major digital transformation to deliver simpler, more efficient, and citizen-focused services. The DDD Division serves as the province's center of excellence for modern digital delivery, combining human-centered design, Agile delivery, and modern data practices to ensure high-quality, high-value digital services.
DDD works closely with Ministries across the Government of Alberta (GoA) to support service innovation, program review, and digital transformation initiatives. To support this work, the province is seeking up to two (2) Data Engineers to contribute to cross-functional delivery teams working on multiple concurrent projects.
Responsibilities
- Design, build, and maintain end-to-end data pipelines across on-premises and cloud platforms (Azure, AWS, GCP).
- Develop and optimize dimensional data models (star and snowflake schemas) to support analytics and reporting.
- Integrate data from SQL and NoSQL databases, APIs, and flat files, ensuring accuracy, completeness, and consistency.
- Enhance ETL/ELT workflows for performance, scalability, and reliability.
- Build and maintain pipelines using SSIS and Azure Data Factory (ADF) with proper error handling, logging, and scheduling.
- Implement CI/CD pipelines to automate deployment, testing, and monitoring of data workflows.
- Collaborate with stakeholders to translate business requirements into curated data marts, fact tables, and dimension tables that support self-service analytics.
- Analyze datasets to identify trends, patterns, and anomalies using statistical techniques, DAX, Python, and R.
- Design and deliver interactive Power BI dashboards and reports, including calculated measures and KPIs.
- Deliver analytics solutions iteratively within Agile teams, supporting continuous improvement.
- Mentor team members and stakeholders to improve data literacy and self-service analytics capabilities.
- Provide data-driven evidence to inform corporate priorities, policies, and service improvements.
Must-Haves
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 8 years of experience as a Data Engineer, Data Analyst, or in a similar role.
- 3 years designing and optimizing dimensional data models (star/snowflake).
- 3 years ensuring data quality, governance, and security.
- 3 years working with Microsoft Tabular Models and DAX.
- 3 years developing dashboards and reports, including Power BI.
- 5 years manipulating, extracting, and transforming data from multiple data sources.
- 3 years building ETL/ELT solutions using SSIS and Azure Data Factory (ADF).
- 2 years using Git, collaborative development workflows, and CI/CD pipelines.
- 2 years performing data migrations across on-premises, cloud, and cross-database environments.
Nice-to-Haves
- 2 years leading or contributing to custom software or data platform development projects using formal delivery methodologies.
- 2 years of experience with database technologies and data integration tools beyond the core requirements.
- 1 year of familiarity with GoA IT infrastructure standards or the business environment.
- 1 year of exposure to AI/ML tools, workflows, or advanced analytics techniques relevant to public-sector use cases.