Job Title :: Senior Data Engineer with MDM
Location :: Florham Park NJ (3-4 days onsite)
Job Type :: Full time
Role Description
We are looking for a hands-on Senior Data Engineer to build and operate data pipelines for MDM, an enterprise master data foundation that produces trusted golden records, supports stewardship workflows, and publishes curated master data to downstream consumers. This role will engineer scalable ELT patterns using Azure, Snowflake, dbt, and Python, with a strong focus on data quality, reliability, observability, and audit-ready delivery.
Key responsibilities
Design and implement ingestion patterns from source systems into Snowflake (batch and incremental/CDC where applicable).
Build scalable landing/staging/curated layers with clear lineage and reprocessing strategies.
Implement orchestration patterns on Azure (scheduling, parameterization, retries, idempotency).
Develop and maintain dbt models for:
- canonical master entities and relationships
- standardization/enrichment transformations
- exception/validation views and stewardship-ready outputs
Implement dbt best practices: modular models, tests, exposures, documentation, and CI checks.
Implement data quality rules and automated tests (dbt tests, plus custom checks in Python where needed).
Create exception datasets/metrics to support stewardship queues and remediation.
Build reconciliation routines (source vs. curated counts, duplicate metrics, completeness/consistency trends).
Support match/merge workflows by producing standardized inputs (keys, standardized attributes, dedupe candidates).
Implement survivorship logic where defined (source priority, recency, completeness) or enable it through curated datasets.
Produce publish-ready datasets and change-detection outputs for downstream consumption.
Optimize Snowflake workloads (warehouse sizing, clustering strategies where relevant, query tuning, cost governance).
Build robust operational patterns: backfills, re-runs, error handling, SLAs, data freshness checks.
Implement monitoring for pipeline health, data freshness, DQ failures, and exception volumes.
Create runbooks and operational playbooks for production support and hypercare.
Participate in go-live cutover, hypercare triage, and the transition to BAU support.
Work closely with the Data Architect, BSA/Data Analyst, Backend/UI teams, QA, and DevOps.
Participate in sprint planning code reviews and architecture/design reviews.
Maintain high-quality documentation and version control (Git-based workflow).
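To make the survivorship responsibility above concrete, here is a minimal Python sketch of attribute survivorship by source priority, then recency, then completeness. Every name in it (`SOURCE_PRIORITY`, `survive`, the sample records) is hypothetical and for illustration only, not part of this role's actual implementation:

```python
from datetime import date

# Hypothetical trust ranking of contributing systems (lower = more trusted).
SOURCE_PRIORITY = {"CRM": 1, "ERP": 2, "WEB": 3}

def survive(records):
    """Pick the surviving record for one master key.

    Each record is a dict with 'source', 'updated_at' (date), and an
    'attributes' dict. Ranking: source priority, then recency, then
    completeness (count of non-null attributes).
    """
    def rank(rec):
        completeness = sum(v is not None for v in rec["attributes"].values())
        return (
            SOURCE_PRIORITY.get(rec["source"], 99),  # source priority first
            -rec["updated_at"].toordinal(),          # then most recent wins
            -completeness,                           # then most complete wins
        )
    return min(records, key=rank)

# Two duplicate candidates for the same master entity.
candidates = [
    {"source": "WEB", "updated_at": date(2024, 6, 1),
     "attributes": {"email": "a@x.com", "phone": None}},
    {"source": "CRM", "updated_at": date(2024, 1, 5),
     "attributes": {"email": "a@x.com", "phone": "555-0100"}},
]
golden = survive(candidates)  # CRM wins on source priority despite being older
```

In practice this ordering would typically be expressed in dbt/SQL with window functions over curated datasets; the Python form just makes the tie-break order explicit.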
Skills requirements
4-7 years of data engineering experience with strong hands-on delivery ownership
Strong expertise in Snowflake (modeling, performance tuning, cost control)
Strong expertise in dbt (models, tests, macros, documentation, CI)
Proficient in Python for pipeline utilities, validations, automation, and troubleshooting
Experience implementing data quality and production monitoring practices
Strong SQL (advanced joins, window functions, profiling, reconciliation)
Experience with GenAI is required.
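As a small illustration of the reconciliation-style checks the role calls for (source vs. curated counts), here is a hedged Python sketch; the function name, tolerance, and counts are invented for illustration and are not prescribed by this posting:

```python
def reconcile(source_count: int, curated_count: int, tolerance: float = 0.01):
    """Compare source vs. curated row counts.

    Returns (ok, drift): ok is True when the relative difference is within
    the tolerance; drift is the relative difference itself.
    """
    if source_count == 0:
        # Empty source: curated must also be empty.
        return curated_count == 0, 0.0
    drift = abs(source_count - curated_count) / source_count
    return drift <= tolerance, drift

ok, drift = reconcile(10_000, 9_950)  # 0.5% drift, within a 1% tolerance
```

A production version would read the counts from Snowflake and emit the result as a DQ metric or dbt test failure rather than a bare tuple.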
Preferred skills
Experience with MDM / master data concepts (golden record, dedupe, survivorship, stewardship workflows)
Experience with Azure data ecosystem tools (ADF/Synapse/Functions/Key Vault/Monitoring)
Experience with event-driven publishing or change-data outputs for downstream systems
Exposure to regulated/audit-heavy delivery environments (traceability, approvals, evidence)
Qualifications
Bachelor's Degree in Computer Science or a related science field, or equivalent.
About Us:
Incedo is a US-based consulting, data science, and technology services firm with over 2,500 people helping clients from our six offices across the US and India. We help our clients achieve competitive advantage through end-to-end digital transformation. Our uniqueness lies in bringing together strong engineering, data science, and design capabilities, coupled with deep domain understanding. We combine services and products to maximize business impact for our clients in the telecom, financial services, product engineering, and life science & healthcare industries.
Working at Incedo will provide you with an opportunity to work with industry-leading client organizations, deep technology and domain experts, and global teams. Incedo University, our learning platform, provides ample learning opportunities, starting with a structured onboarding program and continuing throughout the stages of your career. A variety of fun activities are also an integral part of our friendly work environment. Our flexible career paths allow you to grow into a program manager, a technical architect, or a domain expert based on your skills and interests.
We are an Equal Opportunity Employer
We value diversity at Incedo. We do not discriminate based on race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.
Shubhi Srivastava
Talent Acquisition
100 Campus Dr Florham Park New Jersey 07932 US
Required Experience:
Senior IC