Lead Data Engineer
Client-facing
Tech Delivery
Offshore Bridge
TECHNICAL HANDS-ON
- Hands-on design and build of Microsoft Fabric Lakehouse pipelines (Bronze, Silver, Gold) using PySpark, Dataflows Gen2, and Fabric Data Factory - actively contributes code, not just oversight.
- Deep working experience building enterprise Data Lakehouses on Azure: OneLake, Delta Lake, Azure Data Factory, and Azure Databricks or Synapse - minimum of one end-to-end production delivery.
- Translates complex SQL Server source systems (schemas, stored procedures, views, SSIS packages) into Fabric-compatible ingestion and transformation patterns, validating the technical accuracy of all source-to-target mappings.
- Performs SQL Endpoint query tuning and Delta table optimization (OPTIMIZE, Z-ORDER, V-Order, partition pruning), and validates Gold layer performance against client-agreed load SLA targets.
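The optimization duties above can be sketched as Delta maintenance statements run from a Fabric notebook or SQL session; the table and column names below are illustrative placeholders, not part of the role description:

```sql
-- Compact small files and co-locate rows on a frequently filtered column
-- (gold.sales_fact and customer_id are hypothetical examples)
OPTIMIZE gold.sales_fact ZORDER BY (customer_id);

-- Remove data files no longer referenced by the Delta transaction log
-- (subject to the table's retention window)
VACUUM gold.sales_fact;
```

V-Order, by contrast, is a write-time optimization in Fabric (enabled via a Spark session setting rather than a maintenance command), and partition pruning comes from partitioning Gold tables on the columns that client queries commonly filter on.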
CLIENT ENGAGEMENT & REQUIREMENTS
- Leads client-facing requirement collection sessions: facilitates structured discovery workshops with business owners and SQL report consumers, and asks the right probing questions to uncover hidden logic, edge cases, and unstated KPI definitions.
- Translates business language into technical specifications: converts client-described outcomes into Data Mapping Documents (DMDs), source-to-target mappings, and acceptance criteria that the offshore team can build against directly.
- Owns client presentations and sprint demos: prepares and delivers data pipeline walkthroughs, Gold layer data previews, and KPI validation results to client stakeholders - communicates technical findings in plain business language.
- Applies strong business domain understanding to challenge ambiguous requirements, validate KPI logic against source data, and proactively flag data quality or design issues before they reach the client.
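The DMD-driven validation described above can be sketched as a lightweight schema check; this is a minimal illustration, and the mapping structure and column names are hypothetical, not a fixed DMD standard:

```python
# Minimal source-to-target mapping check: confirms every target column in the
# signed-off DMD is present in the delivered Gold table and flags extras.
# All table and column names below are illustrative placeholders.

dmd_mappings = {
    # target_column: (source_table, source_column)
    "customer_id": ("dbo.Customers", "CustomerID"),
    "order_total": ("dbo.Orders", "TotalDue"),
    "order_date": ("dbo.Orders", "OrderDate"),
}

def validate_schema(dmd: dict, delivered_columns: list[str]) -> dict:
    """Compare a delivered table's columns against the DMD's target columns."""
    expected = set(dmd)
    actual = set(delivered_columns)
    return {
        "missing": sorted(expected - actual),    # in DMD, absent from delivery
        "unexpected": sorted(actual - expected), # delivered, but not in DMD
    }

report = validate_schema(
    dmd_mappings, ["customer_id", "order_date", "order_total", "load_ts"]
)
print(report)  # {'missing': [], 'unexpected': ['load_ts']}
```

A check like this gives the offshore review loop precise, mechanical feedback ("column X missing against DMD v2") instead of subjective rework requests.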
OFFSHORE COORDINATION & DELIVERY
- Acts as the primary bridge between the client and the offshore engineering team: runs daily syncs, translates client feedback into actionable ADO stories, unblocks offshore engineers on requirement ambiguity, and ensures information flows without loss of context across time zones.
- Reviews and accepts offshore deliverables before client-facing promotion: validates PySpark notebooks, pipeline outputs, and Gold layer tables against signed-off DMDs, raises rework requests with precise technical feedback, and owns the final sign-off before each sprint demo or UAT handover.