Data Engineer (Fabric)

Jobs For Humanity


Job Location:

Beirut - Lebanon

Monthly Salary: Not Disclosed
Posted on: 11 hours ago
Vacancies: 1 Vacancy

Job Summary

Job Overview

The Microsoft Fabric Data Engineer designs, builds, and operates modern data platforms using Microsoft Fabric. This role focuses on ingesting, modeling, and serving data via OneLake, Lakehouse, Data Warehouse, Data Pipelines, and Power BI, delivering trusted, performant datasets and governed analytics at scale. The role collaborates closely with data architects, analytics engineers, BI developers, and business stakeholders.

Key Responsibilities

1. Data Platform Engineering (Fabric)

  • Build and manage Lakehouses (Delta Lake) and Fabric Data Warehouses.
  • Develop Data Pipelines and Dataflows Gen2 for batch and near-real-time ingestion.
  • Create and optimize Notebook-based transformations (PySpark/SQL) and SQL stored procedures for DW workloads.
  • Implement medallion architecture (bronze/silver/gold) for scalable curation.
  • Publish certified semantic models and Power BI datasets aligned to business domains.
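The medallion (bronze/silver/gold) curation above can be illustrated with a minimal sketch. This is plain Python standing in for what would be PySpark/Delta transformations in Fabric; the record shape, quality rule, and helper names are hypothetical, not Fabric APIs.

```python
# Illustrative medallion-style promotion: bronze (raw) -> silver (cleansed)
# -> gold (business-ready aggregate). All names here are hypothetical.

raw_orders = [  # "bronze": raw ingested records, possibly dirty
    {"order_id": "1", "amount": "19.99", "region": "EU"},
    {"order_id": "2", "amount": "bad",   "region": "EU"},
    {"order_id": "3", "amount": "5.00",  "region": "US"},
]

def to_silver(rows):
    """Cleanse/conform: cast types and drop rows failing basic quality rules."""
    silver = []
    for r in rows:
        try:
            silver.append({**r, "amount": float(r["amount"])})
        except ValueError:
            continue  # reject/quarantine records that fail the quality rule
    return silver

def to_gold(rows):
    """Aggregate the curated rows into a serving-ready dataset."""
    totals = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

gold = to_gold(to_silver(raw_orders))
print(gold)  # {'EU': 19.99, 'US': 5.0}
```

The same shape carries over to Spark: each layer is a separate Delta table, and promotion between layers is an idempotent transformation job.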

2. Performance & Reliability

  • Optimize storage/compute in OneLake (file formats, partitioning, Z-ordering).
  • Tune Spark and SQL workloads (caching strategies, concurrency, workload isolation).
  • Implement robust retry, alerting, and monitoring (Fabric Monitoring Hub, Metrics app).
  • Conduct end-to-end pipeline performance testing and scalability assessments.
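The retry-and-alert bullet above can be sketched as a small helper. This is an illustrative plain-Python sketch: in Fabric, retries and alerts are normally configured declaratively on pipeline activities and surfaced in the Monitoring Hub, and `run_with_retry`/`flaky_step` are hypothetical names.

```python
import time

def run_with_retry(task, attempts=3, base_delay=0.0, on_alert=print):
    """Retry a flaky pipeline step with exponential backoff; alert on final failure."""
    for attempt in range(1, attempts + 1):
        try:
            return task()
        except Exception as exc:
            if attempt == attempts:
                on_alert(f"step failed after {attempts} attempts: {exc}")
                raise
            # back off 1x, 2x, 4x, ... the base delay between attempts
            time.sleep(base_delay * 2 ** (attempt - 1))

# Usage: a step that fails twice with a transient error, then succeeds.
calls = {"n": 0}
def flaky_step():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient source timeout")
    return "loaded"

result = run_with_retry(flaky_step, attempts=3)
print(result)  # loaded
```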

3. Governance Security & Compliance

  • Enforce data governance with sensitivity labels, row-level/column-level security, and workspace roles.
  • Manage item-level permissions (Lakehouse tables, DW schemas, datasets) and Managed Identities for sources.
  • Apply data quality rules, lineage, and documentation (descriptions, tags, owner metadata; Purview if applicable).
  • Ensure compliance with organizational standards (PII handling, audit retention).
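Row-level security, mentioned above, conceptually reduces to evaluating a per-user filter predicate over each table. A minimal sketch follows, with hypothetical users and a region-based rule; in Fabric, RLS is defined declaratively on the warehouse or semantic model rather than in application code.

```python
# Illustrative row-level security: each user sees only rows whose region
# appears in their grant set. Users, regions, and rows are hypothetical.

ROWS = [
    {"region": "EU", "revenue": 100},
    {"region": "US", "revenue": 200},
]

USER_REGIONS = {
    "alice@contoso.com": {"EU"},
    "bob@contoso.com": {"EU", "US"},
}

def rls_filter(user, rows):
    """Apply the user's RLS predicate; unknown users see nothing."""
    allowed = USER_REGIONS.get(user, set())
    return [r for r in rows if r["region"] in allowed]

print(rls_filter("alice@contoso.com", ROWS))  # only the EU row
```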

4. DevOps & Lifecycle Management

  • Use Fabric Git integration and Deployment Pipelines for CI/CD across dev/test/prod.
  • Parameterize pipelines and environments; externalize configuration and secrets (Key Vault).
  • Implement automated testing for data transformations and schemas.
  • Drive release management, change control, and rollback strategies.

5. Collaboration & Stakeholder Engagement

  • Partner with analytics engineers and BI teams to design star schemas, semantic models, and DAX measures.
  • Work with data source owners on SLAs, schema change management, and contracts.
  • Translate business requirements into technical designs and document architecture decisions.
  • Provide knowledge transfer best practices and support to data consumers.

Qualifications:

Required Skills & Qualifications

Technical Skills

  • Microsoft Fabric (hands-on):
    • OneLake, Lakehouse (Delta), Fabric Data Warehouse, Data Pipelines, Dataflows Gen2, Notebooks, Semantic Models/Power BI, Monitoring Hub.
  • Programming & Querying:
    • PySpark, SQL (T-SQL), Delta Lake operations; DAX familiarity is a plus.
  • Modeling & Architecture:
    • Dimensional modeling, Data Vault or medallion patterns, data quality frameworks.
  • Performance & Ops:
    • Partitioning, file formats (Parquet/Delta), caching/Z-ordering, job orchestration, monitoring.
  • DevOps:
    • Git, Fabric Deployment Pipelines, YAML CI/CD (GitHub Actions/Azure DevOps), IaC exposure (Bicep/Terraform for non-Fabric infra).
  • Security & Governance:
    • RLS/CLS, sensitivity labels, access patterns, audit/logging, lineage.

Preferred Qualifications

  • Experience with Power BI modeling (star schemas, relationships, calculation groups, DAX).
  • Exposure to streaming/real-time: Eventstream, Real-Time Hub, KQL databases (if applicable).
  • Experience integrating with external sources (SQL Server, SAP, Dataverse, REST APIs).
  • Familiarity with Microsoft Purview for governance/lineage.
  • Certifications:
    • DP-600: Microsoft Fabric Analytics Engineer Associate (strongly preferred)
    • DP-203: Data Engineering on Microsoft Azure (nice to have)

Additional Information:

Soft skills

  • Strong analytical skills and the capacity to challenge the financial information received
  • Highly organised and able to manage multiple tasks with strong attention to detail
  • Excellent communication skills with the ability to interact with international stakeholders
  • Curious, proactive, keen to learn, and ready for new challenges
  • Ability to work independently while also having a team-oriented mindset.

Languages

  • Excellent knowledge of English (written and verbal communication skills)
  • Knowledge of any other language, particularly French, is a plus

Remote Work:

No


Employment Type:

Full-time



About Company


Jobs for Humanity paves the way to a fairer future for all by connecting historically underrepresented talent to welcoming employers. Through the combination of cutting-edge recruiting technology and expert D&I consultation, Jobs for Humanity makes inclusive hiring seamless, scalable, ...
