
Data Engineer (Microsoft Fabric & Lakehouse)

Job Location

Bangalore Urban - India

Monthly Salary

₹ 100000 - 150000

Vacancy

1 Vacancy

Job Description

Greetings from ALIQAN Technologies!!


Hiring a Data Engineer (Microsoft Fabric & Lakehouse) for one of our client MNCs.


Job Title: Data Engineer (Microsoft Fabric & Lakehouse)

Location: Hybrid (Bangalore, India)

Experience: 5 Years

Joining: Immediate

Hiring Process: One interview and one case study round


About the Role


We are looking for a skilled Data Engineer with 2-5 years of experience to join our dynamic team. The ideal candidate will be responsible for designing and developing scalable, reusable, and efficient data pipelines using modern data engineering platforms such as Microsoft Fabric, PySpark, and Data Lakehouse architectures.


You will play a key role in integrating data from diverse sources, transforming it into actionable insights, and ensuring high standards of data governance and quality. This role requires a strong understanding of modern data architectures, pipeline observability, and performance optimization.



Key Responsibilities


Design and build robust data pipelines using Microsoft Fabric components, including Pipelines, Notebooks (PySpark), Dataflows, and Lakehouse architecture.

Ingest and transform data from a variety of sources, such as cloud platforms (Azure, AWS), on-prem databases, SaaS platforms (e.g. Salesforce, Workday), and REST/OpenAPI-based APIs.

Develop and maintain semantic models and define standardized KPIs for reporting and analytics in Power BI or equivalent BI tools.

Implement and manage Delta Tables across bronze/silver/gold layers using the Lakehouse medallion architecture within OneLake or equivalent environments (see the first sketch after this list).

Apply metadata-driven design principles to support pipeline parameterization, reusability, and scalability.

Monitor, debug, and optimize pipeline performance; implement logging, alerting, and observability mechanisms.

Establish and enforce data governance policies, including schema versioning, data lineage tracking, role-based access control (RBAC), and audit trail mechanisms.

Perform data quality checks, including null detection, duplicate handling, schema drift management, outlier identification, and Slowly Changing Dimensions (SCD) type management (see the second sketch after this list).
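
For illustration only (not a formal requirement of the role), the following is a minimal PySpark sketch of the bronze-to-silver promotion pattern referenced above. It assumes a Spark session with Delta Lake support, such as a Microsoft Fabric or Databricks notebook; the table paths and column names are hypothetical.

# Minimal bronze -> silver promotion for a medallion Lakehouse (illustrative only).
# Assumes a Spark session with Delta Lake support (e.g. a Microsoft Fabric or
# Databricks notebook); table paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Read raw, as-ingested data from the bronze layer.
bronze_df = spark.read.format("delta").load("Tables/bronze_sales_orders")

# Light conformance: de-duplicate, drop null keys, standardize types, add an audit column.
silver_df = (
    bronze_df
    .dropDuplicates(["order_id"])
    .filter(F.col("order_id").isNotNull())
    .withColumn("order_date", F.to_date("order_date"))
    .withColumn("_processed_at", F.current_timestamp())
)

# Persist the cleaned data as a Delta table in the silver layer.
silver_df.write.format("delta").mode("overwrite").save("Tables/silver_sales_orders")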

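Similarly, here is a rough sketch of the kind of data quality checks mentioned above (null detection, duplicate handling, schema drift). The table path, column names, and expected schema are assumptions made for illustration only.

# Illustrative data-quality checks: null detection, duplicate detection, and a
# simple schema-drift comparison. Table path, columns, and the expected schema
# are assumptions; real pipelines would parameterize these and alert on failure.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

df = spark.read.format("delta").load("Tables/silver_sales_orders")

# Null detection on required columns.
required_cols = ["order_id", "customer_id", "order_date"]
null_counts = df.select(
    [F.sum(F.col(c).isNull().cast("int")).alias(c) for c in required_cols]
).first().asDict()

# Duplicate detection on the business key.
duplicate_keys = df.groupBy("order_id").count().filter(F.col("count") > 1).count()

# Basic schema-drift check against an expected column contract.
expected = {"order_id": "string", "customer_id": "string",
            "order_date": "date", "amount": "double"}
actual = {f.name: f.dataType.simpleString() for f in df.schema.fields}
drift = {name: typ for name, typ in actual.items() if expected.get(name) != typ}

# Surface results so the orchestrating pipeline can log, alert, or fail the run.
print({"null_counts": null_counts, "duplicate_keys": duplicate_keys, "schema_drift": drift})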

Required Skills & Qualifications


2-5 years of hands-on experience in Data Engineering or related fields.

Solid understanding of data lake/lakehouse architectures, preferably with Microsoft Fabric or equivalent tools (e.g. Databricks, Snowflake, Azure Synapse).

Strong experience with PySpark and SQL, and with working in dataflows and notebooks.

Exposure to BI tools like Power BI, Tableau, or equivalent for data consumption layers.

Experience with Delta Lake or similar transactional storage layers.

Familiarity with data ingestion from SaaS applications, APIs, and enterprise databases.

Understanding of data governance, lineage, and RBAC principles.

Strong analytical, problem-solving, and communication skills.



Nice to Have


Prior experience with Microsoft Fabric and the OneLake platform.

Knowledge of CI/CD practices in data engineering.

Experience implementing monitoring/alerting tools for data pipelines.



Why Join Us


Opportunity to work on cutting-edge data engineering solutions.

Fast-paced collaborative environment with a focus on innovation and learning.

Exposure to end-to-end data product development and deployment cycles.


Employment Type

Contract
