Consulting Senior Data Architect (Microsoft Fabric Focus)

Omm IT Solutions


Job Location:

Montreal - Canada

Monthly Salary: Not Disclosed
Experience Required: 10 years
Posted on: 3 hours ago
Vacancies: 1 Vacancy

Job Summary

PLEASE NOTE:

  • This is a 100% remote position based in Canada.
  • Candidates in the EST/CST time zones are preferred.

Key Responsibilities:

Microsoft Fabric Enablement (Hands-on Delivery & Standardization)

  • Architect and implement Fabric solutions for data engineering (Spark), Data Factory pipelines, and real-time analytics / event streams, aligned to digital product needs.
  • Build and standardize Fabric patterns for ingestion, transformation, and serving across workloads, including operational analytics, near-real-time, batch, and data science/ML.
  • Create repeatable reference implementations for common digital product scenarios (IoT telemetry, time series, transactional event fusion, documents, geospatial).

Unified Data Platform Architecture (Target State & Roadmap)

  • Produce and maintain the target-state architecture for Fabric-based data platform capabilities.
  • Define domain-oriented data product patterns including how shared/enterprise datasets are curated and reused.
  • Establish architectural boundaries and integration guidance for shared datasets vs. product-owned datasets.

Data Modeling Standards (Conceptual / Logical / Physical)

  • Define and enforce data modeling standards and templates appropriate to Fabric Lakehouse/Warehouse patterns and product analytics needs.
  • Provide modeling guidance for high-variance data types (telemetry, geospatial, documents) and hybrid operational-analytics use cases.
  • Define standards for schema evolution, versioning, and contract-first data interfaces (where applicable).
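For illustration only (this sketch is not part of the role's deliverables), a contract-first backward-compatibility check of the kind implied by the schema evolution standard above could look like the following. The schema representation and all names are hypothetical:

```python
# Illustrative sketch: a backward-compatibility check between two schema
# versions, each modeled as a simple {column_name: type_name} dict.
# All names here are hypothetical, not part of the posting.

def is_backward_compatible(old_schema: dict, new_schema: dict) -> bool:
    """A new schema version is backward compatible if every existing
    column is preserved with the same type; adding columns is allowed,
    while dropping or retyping a column is a breaking change."""
    return all(
        col in new_schema and new_schema[col] == typ
        for col, typ in old_schema.items()
    )

v1 = {"device_id": "string", "ts": "timestamp", "temp_c": "double"}
v2 = {**v1, "humidity": "double"}                 # additive change: compatible
v3 = {"device_id": "string", "ts": "timestamp"}   # dropped column: breaking

print(is_backward_compatible(v1, v2))  # True
print(is_backward_compatible(v1, v3))  # False
```

In a contract-first setup, a check like this would run in the delivery pipeline and block promotion of a breaking schema change until consumers have migrated.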

Governance, Security, and Compliance by Design

  • Design and implement a governance model covering classification, retention, lineage, and auditability.
  • Ensure compliance guardrails are built into delivery patterns and operational processes to meet GDPR, ISO 27001, and data residency requirements.
  • Define and enforce Fabric access controls using Entra ID, RBAC, and workspace-level controls (including guidance for separation of duties and least privilege).

CI/CD & Infrastructure as Code (IaC) for Fabric

  • Define and implement a CI/CD approach for Fabric artifacts as the enterprise source of truth.
  • Establish release patterns for Fabric changes (promotion strategy, environment separation, approvals, and quality gates) aligned to platform standards.
  • Manage Fabric-related platform configuration using Terraform as the IaC approach (including reusable modules/patterns).
  • Create golden path templates and guidance that product teams can adopt with minimal friction.

Capacity Planning, Cost Model, and Chargeback/Showback

  • Design Fabric capacity strategy (SKU sizing, workload isolation, scaling model) to support multiple products reliably.
  • Define guardrails and operational practices that reduce waste and improve predictability.

Reliability, Observability, and Operational Readiness

  • Define reliability patterns and operational standards for data pipelines and real-time workloads.
  • Integrate logging/monitoring with Log Analytics and security monitoring with Sentinel, including alerting and incident response considerations.
  • Define and operationalize data quality SLAs (freshness, completeness, accuracy, timeliness) and embed quality checks into delivery pipelines.
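As a purely illustrative sketch (not a deliverable of this role), the freshness dimension of a data quality SLA like the one described above could be checked in a pipeline step along these lines. All names and thresholds are hypothetical:

```python
# Illustrative sketch: a freshness SLA check of the kind that could be
# embedded in a delivery pipeline. A dataset is considered stale when
# its latest watermark is older than the agreed SLA window.

from datetime import datetime, timedelta, timezone

def freshness_ok(latest_watermark, sla, now=None):
    """Return True if the dataset's newest record is within the SLA window."""
    now = now or datetime.now(timezone.utc)
    return (now - latest_watermark) <= sla

# Hypothetical example: a 15-minute freshness SLA.
now = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
fresh = datetime(2024, 1, 1, 11, 50, tzinfo=timezone.utc)  # 10 minutes old
stale = datetime(2024, 1, 1, 9, 0, tzinfo=timezone.utc)    # 3 hours old

print(freshness_ok(fresh, timedelta(minutes=15), now))  # True
print(freshness_ok(stale, timedelta(minutes=15), now))  # False
```

Completeness, accuracy, and timeliness checks would follow the same pattern: a measurable predicate per dataset, evaluated on each run, with failures surfaced to alerting (e.g., Log Analytics).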

Consulting Engagement & Governance Forums

  • Participate in architectural governance and provide architecture review/sign-off, with authority to mandate standards when necessary to protect platform integrity.
  • Partner closely with platform engineering to align patterns across identity, network, DevOps, and security.

Working Style & Mindset

  • Hands-on architect: you can design and build the critical Fabric artifacts to prove patterns.
  • Platform-oriented: you think in reusable standards, templates, and repeatable governance.
  • Strong consultative presence: you can advise product teams while also driving decisions and outcomes.
  • Comfortable with authority: you can mandate standards when required to protect the platform and business.
  • Documentation discipline: you produce clear ADRs, standards, and operating playbooks.

Engagement & Collaboration

  • Supports product teams through office hours and project-based sprints.
  • Works primarily with: the Azure lead architect, security architect/engineer, DevOps platform engineer, and product engineering teams.
  • Data ownership remains with product teams; this role defines the how (standards/patterns/governance), not centralized ownership.


Requirements

Required Deliverables:

You will be accountable for producing the following:

  • Target-state architecture
  • Data model standards: conceptual/logical/physical templates
  • Domain-oriented data product patterns and operating guidance
  • Governance model: classification, retention, lineage, access controls
  • Fabric workspace strategy and operating model (environments, isolation, ownership, lifecycle)
  • CI/CD approach for Fabric artifacts, integrated with platform guardrails
  • IaC approach using Terraform for Fabric-related configuration
  • Cost model and capacity planning strategy (SKU sizing, isolation, showback/chargeback)
  • Architecture Decision Records (ADRs) for key platform decisions

Qualifications:

Required Experience & Skills

  • 10 years in data architecture/data engineering roles, including platform-scale design.
  • Proven Microsoft Fabric production implementations: you have delivered Fabric solutions that run in production with real operational constraints.
  • Deep hands-on expertise in Fabric areas central to our rollout:
    • Data Engineering
    • Data Factory (pipelines)
    • Real-time analytics / Event Streams
  • Strong architecture capability across mixed data types: transactional, telemetry/events, documents, geospatial, time series.
  • Demonstrated experience implementing and governing:
    • Data modeling standards (conceptual/logical/physical)
    • Data governance (classification, retention, lineage)
    • Security patterns using Entra ID, RBAC, and workspace-level controls
    • Compliance guardrails for GDPR, ISO 27001, and data residency
  • Strong DevOps fluency:
    • CI/CD patterns and operational delivery
    • Terraform as the mandatory IaC tool for repeatability and standardization
  • Ability to define standards and enforce guardrails while maintaining a delivery-first pragmatic approach.

Preferred

  • Experience supporting IoT and telemetry-heavy product ecosystems.
  • Experience designing data quality frameworks and SLAs for operational analytics and near-real-time processing.
  • Familiarity integrating observability/security signals into Log Analytics and Sentinel.


