Responsibilities:
- Develops and implements the data architecture for application development in a complex and distributed environment, including determining the flow and distribution of data, the location of databases, and data access methods.
Requirements
Experience and Skill Set Requirements:
Must Haves:
- 7 years in data modelling and data warehouse design (Must Have)
- 2 years Azure Data Lake and Azure Databricks SQL Warehouse (Must Have)
- 5 years SQL (Must Have)
Skill Set Requirements:
Data Modeler Requirements:
- 7 years of BI Data Architect experience in enterprise application and solution design/development, related to data warehousing, data lake implementations, and dimensional modelling.
- Collect business-level questions and propose approaches to address business needs and provide data insights.
- Expand documentation and knowledge of business processes relative to available data to provide contextual guidance for operation/project reporting and insights generation.
- Ability to design complex technical concepts and articulate them as executable development work packages.
- Knowledge of BI tools for metadata modeling and report design (e.g. Power BI)
- MS SQL Server technology, Azure Data Lake, Azure Databricks
- Expert knowledge of developing data warehouse solutions on the Microsoft stack (Azure Data Lake, SQL, ADF, Databricks, Power BI) to store and retrieve centralized information. Experience designing the data warehouse using dimensional and Delta Lake concepts (an illustrative sketch follows this list).
- Create/maintain the enterprise data model and data dictionary. Help the development team optimize database performance. Coordinate with the Integration department to identify future needs and requirements.
- Extensive knowledge of data modelling tools (e.g. SAP PowerDesigner, Visio)
- Review, install, and configure information systems to ensure functionality and security. Analyze structural requirements for new data warehouses and applications.
- Experience using Oracle Database server and tools (PL/SQL) for the development of Business Intelligence applications.
- Demonstrated skills in writing SQL stored procedures and packages for datamarts and reporting.
- Demonstrated experience in Azure DevOps
- Demonstrated experience in performance tuning of Business Intelligence applications including data model and schema optimization
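For illustration only, the sketch below shows the kind of dimensional model on Delta Lake referenced in the requirements above; it assumes Azure Databricks SQL, and all table and column names are hypothetical rather than taken from the posting.

-- Illustrative Azure Databricks SQL sketch; dim_client, fact_case_activity and all columns are hypothetical.
-- A conformed dimension and a partitioned fact table stored as Delta tables.
CREATE TABLE IF NOT EXISTS dim_client (
  client_key   BIGINT GENERATED ALWAYS AS IDENTITY,  -- surrogate key
  client_id    STRING,                                -- natural/business key
  client_name  STRING,
  effective_dt DATE,                                  -- slowly changing dimension validity window
  expiry_dt    DATE,
  is_current   BOOLEAN
) USING DELTA;

CREATE TABLE IF NOT EXISTS fact_case_activity (
  client_key   BIGINT,                                -- FK to dim_client.client_key
  date_key     INT,                                   -- FK to a date dimension
  case_count   INT,
  amount_paid  DECIMAL(18,2)
) USING DELTA
PARTITIONED BY (date_key);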
Skills:
- 7 years in data modelling and data warehouse design (Must Have)
- 2 years Azure Data Lake and Azure Databricks SQL Warehouse (Must Have)
- 5 years SQL (Must Have)
Assets:
- Knowledge of IBM Curam COTS solutions (Social Assistance Management System)
- ETL design concepts
- Knowledge of Enterprise Architecture tools and frameworks (ArchiMate, TOGAF, Zachman)
Design Documentation and Analysis Skills:
- Demonstrated experience in creating both Functional Design Documents (FDD) & Detailed Design Documents (DDD).
- Experience in Fit-Gap analysis, system use case reviews, requirements reviews, and coding exercises and reviews.
- Experience in developing and maintaining a plan to address contract deliverables through the identification of significant milestones and expected results, with weekly status reporting.
- Work with the Client & Developer(s) assigned to refine/confirm Business Requirements
- Participate in defect fixing, testing support, and development activities for ETL pipelines. Assist with defect fixing and testing support for Power BI reports.
- Analyze and document solution complexity and interdependencies
BI Data Modelling and Technical Skills:
- Understanding of Data Modelling for Business Intelligence, including:
a. Expert knowledge of data warehouse design methodologies, Delta Lake and dimensional modeling in particular
b. Understanding of Extract/Transform/Load processes to transform data for reporting/BI purposes
c. Ability to define schema for reporting databases
d. Experience with advanced modeling tools
- Knowledge of BI tools for metadata modeling and report design (e.g. Power BI, Cognos)
- Good knowledge of and experience with MS SQL Server technology, Azure Databricks SQL Warehouse, and Azure Data Lake
- Experience using T-SQL and PL/SQL for the development of Business Intelligence applications. Demonstrated skills in writing and reverse engineering SQL stored procedures and packages for datamarts and reporting (a stored procedure sketch follows this list).
- Demonstrated experience in performance tuning of Business Intelligence applications including data model and schema optimization
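As a purely illustrative sketch, not drawn from the posting, a T-SQL stored procedure of the kind referenced above might refresh a datamart summary table consumed by Power BI reports; the mart/dw schemas and all object names below are hypothetical.

-- Illustrative T-SQL sketch; all schema, table, and column names are hypothetical.
CREATE OR ALTER PROCEDURE mart.usp_refresh_case_summary
    @as_of_date DATE
AS
BEGIN
    SET NOCOUNT ON;

    -- Replace the snapshot for the requested date so the load stays idempotent.
    DELETE FROM mart.case_summary
    WHERE snapshot_date = @as_of_date;

    INSERT INTO mart.case_summary (snapshot_date, region_key, open_cases, amount_paid)
    SELECT @as_of_date,
           d.region_key,
           COUNT(*)           AS open_cases,
           SUM(f.amount_paid) AS amount_paid
    FROM dw.fact_case  AS f
    JOIN dw.dim_region AS d ON d.region_key = f.region_key
    WHERE f.snapshot_date = @as_of_date
    GROUP BY d.region_key;
END;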
Quality Assurance:
- Demonstrated experience in defining and executing tests across the development lifecycle (unit testing, system testing, user acceptance testing) and using results to refine database design
Knowledge Transfer:
- The Architect/Modeler must have previous work experience conducting Knowledge Transfer and training sessions, ensuring that resources receive the knowledge required to support the system. The resource must develop learning activities using the review-watch-do methodology and demonstrate the ability to prepare and present.
- Development of documentation and materials as part of a review and knowledge transfer to other members
- Development of specific activities as part of a review (handover to ministry staff) and a building-block approach that builds on knowledge transfer and skills development from the previous stage to the next
- Development and facilitation of classroom-based or virtual instructor-led demo sessions for developers
- Monitor identified milestones and submit status reports to ensure Knowledge Transfer is fully completed