General Skills:
Technical Experience (30%)
- Proficiency in SQL and Python, with hands-on experience using Databricks and Spark SQL for data modeling and transformation tasks.
- Experience with at least two different platforms, operating systems, environments, database technologies, languages, and communications protocols.
- Knowledge of performance considerations for different database designs in different environments.
- Knowledge and experience in information resource management tools and techniques.
Data Architecture & Modeling (50%)
- Experience in the design, development, and implementation of data models for analytics and business intelligence.
- Knowledgeable in BI modelling methodologies (Inmon, Kimball, Data Vault), data mapping, data warehouse, data lake, and data lakehouse for the enterprise.
- Strong understanding of data quality principles, with the ability to design and implement automated data quality checks using tools such as Python and SQL, ensuring data integrity across pipelines and models.
- Experience in structured methodologies for the design, development, and implementation of applications.
- Experience in systems analysis and design in large or medium systems environments.
- Experience in the use of data modelling methods and tools (e.g., ERWIN, VISIO, PowerDesigner), including a working knowledge of metadata structures, repository functions, and data dictionaries.
- Experience in monitoring and enforcing data modelling/normalization standards.
- Experience in developing enterprise architecture deliverables (e.g. models).
Agile Product Development (20%)
- Experience working in an agile, sprint-based development environment.
- Understanding and working knowledge of iterative product development cycles (Discovery, Agile, Beta, Live).
- Experience collaborating and sharing tasks with multiple developers on complex data product deliveries.
- Experience contributing to version-controlled shared codebases using Git (Azure DevOps, GitHub, Bitbucket) and participating in pull request code reviews.
Desirable Skills:
- Experience with middleware and gateways.
- Experience in designing/developing an automated data distribution mechanism.
- Knowledge and understanding of object-oriented analysis and design techniques.
- Experience in developing enterprise architecture deliverables (e.g., models) based on Ontario Government Enterprise Architecture processes and practices.
- Knowledge and understanding of Information Management principles, concepts, policies, and practices.
- Experience creating detailed data standards to enable integration with other systems.
- Experience reviewing conceptual, logical, and physical data models for quality and adherence to standards.
- Knowledge and understanding of dimensional and relational data models.
- Knowledge and experience in information resource management tools and techniques.
Must Have:
- Proficiency in SQL and Python, with hands-on experience using Databricks and Spark SQL for data modeling and transformation tasks.
- Experience with at least two different platforms, operating systems, environments, database technologies, languages, and communications protocols.
- Experience in the design, development, and implementation of data models for analytics and business intelligence.
- Knowledgeable in BI modelling methodologies (Inmon, Kimball, Data Vault), data mapping, data warehouse, data lake, and data lakehouse for the enterprise.
- Strong understanding of data quality principles, with the ability to design and implement automated data quality checks using tools such as Python and SQL, ensuring data integrity across pipelines and models.
- Experience with middleware and gateways.
- Experience in designing/developing an automated data distribution mechanism.
Full-time