MS Fabric Architect


Job Location:

Bangalore - India

Monthly Salary: Not Disclosed
Posted on: 30+ days ago
Vacancies: 1 Vacancy

Job Summary

Key Responsibilities
  • Design, architect, and implement end-to-end data solutions using Microsoft Fabric and related Azure technologies.

  • Lead Data Lakehouse implementations leveraging Medallion Architecture and best practices.

  • Collaborate with data engineers, data scientists, and business stakeholders to deliver scalable, secure, and high-performing data platforms.

  • Modernize data ecosystems by integrating on-premises and cloud data sources (e.g., SQL DB, ADLS, Synapse, Event Hub, Salesforce APIs, and SFTP).

  • Define and implement metadata-driven frameworks for efficient data engineering workflows.

  • Oversee Fabric workspace setup, access provisioning, capacity management, and cost optimization.

  • Mentor junior engineers and promote best practices in coding, architecture, and data governance.

  • Contribute to the development of LLM/GenAI-powered applications within the data ecosystem.

  • Ensure CI/CD automation using Azure DevOps and Fabric Deployment Pipelines.

  • Stay current with evolving data and AI technologies to recommend and adopt innovative solutions.

Skills & Experience Required
  • 8 years of overall technical experience, with at least 2 years of hands-on experience in Microsoft Fabric (preferably from an Azure Databricks/Synapse background).

  • Proven experience leading 2 end-to-end Data Lakehouse projects on Microsoft Fabric.

  • Deep expertise in Fabric components: Data Factory, Notebooks, PySpark, Delta Live Tables, Dataflow Gen2, Shortcuts, Fabric Lakehouse/Warehouse, Copy Job, Mirroring, Event Stream, KQL DB, Fabric SQL DB, Semantic Model (optional), and Fabric Data Agent.

  • Strong programming and debugging skills in Python and SQL.

  • In-depth understanding of Fabric architecture, component selection, and cost optimization strategies.

  • Proficiency in data modeling (Dimensional & 3NF).

  • Exposure to Neo4j, Cosmos DB, and vector databases is desirable.

  • Experience with LLM/GenAI applications and CI/CD pipelines (Azure DevOps).

Educational Qualification
  • Bachelor's degree (B.E/) in Computer Science, Information Technology, or a related discipline from a reputed institute (preferred).

