Data Engineer / Azure Fabric Engineer Consultant
Job Summary
Interlink Cloud Advisors is looking for a Data Engineer / Azure Fabric Engineer Consultant to help clients build modern, scalable data platforms on Microsoft Azure and Microsoft Fabric. In this client-facing role, you'll lead discovery, design pragmatic architectures, and deliver end-to-end solutions that turn raw data into trusted, analytics-ready datasets for BI, reporting, and AI. You'll also bring AI development and agentic solution design experience to help clients operationalize intelligent applications on top of governed data. You'll work across Fabric (OneLake, Lakehouse, Warehouse) and Azure data services to ensure solutions are secure, governed, and operationally ready.
Key Responsibilities
What you'll do:
- You'll lead client discovery to understand business goals, data landscape, constraints, and success metrics, then translate findings into a clear delivery plan.
- You'll design and recommend data lake, Lakehouse, data warehouse, and analytical data store architectures on Azure and Microsoft Fabric aligned to client needs.
- You'll implement (and/or guide implementation of) ingestion, transformation, and orchestration pipelines using Microsoft Fabric experiences and Azure services (e.g., Data Factory, Synapse, Databricks).
- You'll integrate data from operational systems, APIs, and third-party sources, and transform raw data into curated datasets ready for consumption.
- You'll build and maintain analytics-ready data models and schemas; implement enrichment, business rules, and reusable patterns for consistency across projects.
- You'll enable multiple compute engines (SQL, Spark/PySpark, Spark SQL, KQL) to run against shared datasets, supporting diverse analytics scenarios.
- You'll design and build AI-enabled analytics and data products, including agentic workflows (e.g., retrieval-augmented generation patterns), ensuring solutions are scalable, secure, and aligned with responsible AI practices.
- You'll establish data quality practices (validation, reconciliation, testing) and monitoring, troubleshoot failures, and continuously improve pipeline resiliency.
- You'll optimize transformations, query performance, and pipeline execution to meet client SLAs and cost objectives.
- You'll implement logging, metrics, and operational runbooks, and support deployments, cutovers, and hypercare as needed.
- You'll design and implement security, governance, and compliance controls (access, least privilege, data protection, lineage) across Azure and Fabric.
- You'll facilitate workshops and communicate architecture tradeoffs, partnering with client stakeholders, architects, analysts, and BI developers to deliver trusted datasets.
- You'll produce clear consulting deliverables (design docs, diagrams, implementation guides) and enablement materials to support adoption and smooth handoff.
Required Qualifications
- Experience designing and delivering modern analytics platforms (Lakehouse and/or data warehouse patterns) on Azure and/or Microsoft Fabric, including client-facing delivery.
- Strong SQL skills with experience building transformations and dimensional/analytical models.
- Hands-on experience with Spark (PySpark/Spark SQL) and/or KQL for data engineering and analytics workloads.
- AI development experience, including designing agentic solutions (multi-step/tool-using workflows) that leverage enterprise data (e.g., retrieval-augmented generation patterns).
- Experience building reliable data pipelines, including orchestration, scheduling, error handling, and automated monitoring.
- Knowledge of data quality practices (validation, reconciliation, testing) and incident triage/root-cause analysis.
- Working knowledge of Power BI (including semantic modeling concepts) to support end-to-end analytics delivery; Power Apps experience is a plus.
- Understanding of security and governance concepts for data platforms (access controls, least privilege, compliance, lineage).
- Strong communication skills, with the ability to present recommendations, align stakeholders, and manage expectations in a client environment.
Preferred Qualifications
- Direct experience with Microsoft Fabric components (OneLake, Lakehouse, Warehouse, Data Factory in Fabric, Real-Time Analytics/KQL databases), including platform administration, capacity planning, and standards.
- Experience with Azure services commonly used in data platforms (e.g., Azure Data Factory, Synapse, Databricks, ADLS Gen2, Key Vault).
- Experience with Microsoft Foundry.
- Experience with Copilot, Copilot Studio, and Azure AI services used to build AI/agentic applications (e.g., Azure OpenAI, Azure AI Search), and related practices like prompt engineering, evaluation, and observability.
- Experience implementing CI/CD and Infrastructure as Code, including environment strategy and release management for data solutions.
- Familiarity with semantic modeling and BI enablement patterns (e.g., Power BI datasets, self-service analytics, and governance guardrails).
- Relevant Microsoft certifications (e.g., Azure Data Engineer, Fabric Analytics Engineer) and experience mentoring/enabling client teams.
Required Experience:
Contract
About Company
Interlink is the leading System Integrator for Microsoft's cloud solutions, helping customers migrate to Office 365, Azure, Intune and other services.