Data Engineering Manager

Blend360


Job Location:

Hyderabad - India

Monthly Salary: Not Disclosed
Posted on: 6 hours ago
Vacancies: 1 Vacancy

Job Summary

We are looking for an experienced Data Engineer to support the delivery of a large-scale enterprise systems integration programme for a leading facilities management client. Working with Integration Engineers, you will be responsible for the data layer of the integration: connecting to source systems, profiling and transforming data, and ensuring clean, well-structured payloads flow through the event-driven Azure Integration Hub.

In addition to adapter-level data work, you will build batch ingestion pipelines into the client's Databricks-based data platform and help establish the data interfaces required for the enterprise MDM implementation. The ideal candidate combines strong hands-on data engineering skills with practical experience connecting to complex enterprise application landscapes and working within structured delivery programmes.

Responsibilities 

Source Connectivity & Data Profiling 

  • Establish and validate connections to in-scope enterprise source systems spanning HR, payroll, recruitment, ERP, CRM, procurement, CAFM, field service, fleet and QHSE platforms, covering a range of connectivity patterns including REST APIs, SOAP/XML, database connectors and file-based extracts

  • Conduct data profiling across source systems to assess data quality, volumes, formats and structures, documenting findings and working with business stakeholders to define and implement automated data quality tests

  • Identify and escalate data quality issues that could impact integration or MDM readiness, and track remediation progress against agreed thresholds prior to go-live

Adapter Data Layer & Transformation 

  • Design and implement the data transformation logic within integration adapters, including field-level mappings, canonical format conversions, data type handling and enrichment rules, as defined in approved Integration Design Documents

  • Build and maintain reusable transformation components that support consistent data handling across multiple integration events and domain waves, reducing duplication and ensuring alignment with agreed data models

  • Implement data validation rules within adapters to enforce mandatory field checks, referential integrity and format compliance before payloads are published to the Service Bus, supporting robust error handling and exception workflows
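
To illustrate the kind of adapter-level validation described above, here is a minimal sketch in Python. The field names and the format rule are illustrative assumptions, not the client's actual canonical model; a real adapter would drive these rules from the approved Integration Design Documents.

```python
# Minimal sketch of adapter-level payload validation before publishing
# to the Service Bus. Field names and the site_code format rule are
# hypothetical, for illustration only.
import re

MANDATORY_FIELDS = {"record_id", "source_system", "site_code"}
SITE_CODE_PATTERN = re.compile(r"^[A-Z]{3}-\d{4}$")  # assumed format rule

def validate_payload(payload: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the
    payload may be published."""
    errors = []
    # Mandatory field checks
    for field in sorted(MANDATORY_FIELDS - payload.keys()):
        errors.append(f"missing mandatory field: {field}")
    # Format compliance check on present fields
    site_code = payload.get("site_code")
    if site_code is not None and not SITE_CODE_PATTERN.match(site_code):
        errors.append(f"site_code fails format check: {site_code!r}")
    return errors
```

In an adapter, a non-empty error list would typically route the payload to an exception workflow rather than onto the bus.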

Batch Ingestion into the Data Platform 

  • Build and maintain batch ingestion pipelines from in-scope source systems into the client's Databricks-based data platform, covering Bronze (raw), Silver (cleansed and standardised) and Gold (business-ready) layers as required

  • Configure pipeline orchestration, scheduling, incremental load patterns and error handling to ensure reliable, repeatable data delivery into the lakehouse environment

  • Implement data quality checks within the ingestion pipeline using the client's established data quality framework, ensuring test coverage across ingested datasets and flagging exceptions for steward review
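
The incremental load pattern mentioned above can be sketched with a simple high-watermark approach. This is a hedged, framework-free illustration in plain Python; the `modified_at` column name is an assumption, and in practice this logic would run inside Azure Data Factory or a Databricks job against Delta tables.

```python
# Sketch of a high-watermark incremental load step. Assumes each source
# row carries a 'modified_at' value (name is illustrative); only rows
# changed since the previous run are selected for ingestion.
def incremental_load(source_rows, last_watermark):
    """Return (rows changed since last_watermark, new watermark)."""
    new_rows = [r for r in source_rows if r["modified_at"] > last_watermark]
    new_watermark = max(
        (r["modified_at"] for r in new_rows), default=last_watermark
    )
    return new_rows, new_watermark
```

Persisting the returned watermark between runs is what makes the pipeline repeatable: a rerun with an unchanged source yields an empty batch rather than duplicates.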

MDM Data Interfaces 

  • Design and implement data feeds between source systems and the enterprise MDM platform, supporting the ingestion of master data records for domains including Customer, Supplier, Employee, Site and Project

  • Work with the MDM workstream and data stewards to align source data structures with MDM domain models, supporting match and merge configuration, survivorship rule testing and the propagation of mastered data back to consuming systems

  • Support the reference data wave by preparing and loading initial reference datasets into the MDM platform, ensuring data is cleansed, mapped and validated prior to ingestion
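
As a rough intuition for the match-and-merge work mentioned above, the sketch below groups candidate master records by a normalised match key. The `name` and `country` fields are hypothetical, and real MDM platforms use configurable, often fuzzy, match rules rather than this exact-key approach.

```python
# Naive match-key sketch for master data deduplication. Field names are
# illustrative; production MDM tooling applies configurable match/merge
# and survivorship rules instead of a single exact key.
def match_key(record: dict) -> str:
    """Normalise name (lowercase, alphanumeric only) and country code."""
    name = "".join(ch for ch in record["name"].lower() if ch.isalnum())
    return f"{name}|{record['country'].upper()}"

def group_candidates(records):
    """Bucket records sharing a match key as merge candidates."""
    groups = {}
    for rec in records:
        groups.setdefault(match_key(rec), []).append(rec)
    return groups
```

Buckets containing more than one record become merge candidates for steward review and survivorship rule testing.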

Collaboration & Governance 

  • Work closely with Integration Engineers to ensure the data layer of each adapter is consistent with the approved integration design, and collaborate with solution architects and the MDM workstream to maintain alignment across the platform

  • Contribute to CI/CD pipelines, source control and documentation standards, ensuring all data engineering artefacts are production-grade and handed over to the client team with appropriate runbooks and operational guides


Qualifications :

  • 6 years of experience in data engineering, with hands-on delivery in cloud-based integration or analytical environments

  • Strong experience connecting to enterprise application APIs and databases, including REST, SOAP/XML, JDBC/ODBC and file-based extraction patterns

  • Proficiency in SQL and Python for data transformation, cleansing and validation

  • Experience building and maintaining data pipelines on Azure, including familiarity with Azure Data Factory or equivalent orchestration tooling

  • Hands-on experience with Databricks or a comparable cloud lakehouse platform, including working within a layered data architecture (Bronze/Silver/Gold or equivalent)

  • Experience with dbt (Core or Cloud) for SQL-based transformation and data modelling within a lakehouse environment 

  • Understanding of data mapping, canonical data modelling and transformation design for multi-system integration landscapes

  • Experience working to build-ready technical specifications and contributing to formal design and testing processes within a structured delivery programme 

  • Strong communication skills and the ability to engage with both technical and business stakeholders on data quality and mapping decisions


Additional Information :

  • Familiarity with enterprise MDM platforms and experience preparing or loading master data for Customer, Supplier or Employee domains

  • Experience with Azure Service Bus or event-driven integration patterns, and an understanding of how data engineering fits within a broader pub/sub architecture

  • Exposure to data governance tooling, including Unity Catalog or equivalent, for access control, lineage and data cataloguing

  • Familiarity with data quality testing approaches and experience implementing automated validation checks within pipelines 

  • Background in facilities management, field services or similarly complex multi-system enterprise environments

  • Experience contributing to operational handover documentation, including pipeline runbooks and data dictionary maintenance


Remote Work :

No


Employment Type :

Full-time


About Company

Blend360 is an award-winning provider of data, analytics, and talent solutions for Fortune 500 companies. The company has made the Inc. 5000 list of Fastest Growing Companies every year they have been in business and has been awarded a world-class ranking in client satisfaction for th ...
