Vice President, Data Quality Lead Engineer

BlackRock


Job Location:

Mumbai - India

Monthly Salary: Not Disclosed
Posted on: 7 hours ago
Vacancies: 1 Vacancy

Job Summary

About this role

BlackRock is seeking a Data Quality Framework Lead to lead the strategy, architecture, and delivery of a core capability within the Enterprise Data Platform in Aladdin Data. This role combines platform engineering, data governance, and stakeholder leadership to build a scalable, trusted, and transparent framework for data quality across the firm.

The platform ensures that the data BlackRock relies on is fit for purpose across key dimensions including completeness, accuracy, timeliness, consistency, validity, and integrity. It provides clear, actionable quality signals to upstream producers, downstream systems, and end users so data can be used confidently for decisions at scale.

The framework uses custom Python operators, Great Expectations, and Airflow-orchestrated pipelines to perform quality checks as data moves through the ecosystem. The ideal candidate brings strong technical depth, sound architectural judgment, hands-on execution, and the ability to align stakeholders around a common platform vision.
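As a rough illustration of the kind of check such a framework executes inside a pipeline task, here is a minimal sketch in plain Python. The function name, result shape, and threshold are hypothetical, not BlackRock's actual platform API:

```python
# Minimal sketch of a completeness check of the kind a data-quality
# framework might run inside an Airflow task. All names, thresholds,
# and the result dictionary shape are illustrative assumptions.

def check_completeness(rows, column, threshold=0.99):
    """Flag a dataset whose non-null ratio for `column` falls below threshold."""
    total = len(rows)
    non_null = sum(1 for r in rows if r.get(column) is not None)
    ratio = non_null / total if total else 0.0
    return {
        "check": "completeness",
        "column": column,
        "observed_ratio": ratio,
        "success": ratio >= threshold,
    }

# Example: two of three records carry a price, so a 99% threshold fails.
records = [
    {"isin": "A", "price": 10.0},
    {"isin": "B", "price": None},
    {"isin": "C", "price": 12.5},
]
result = check_completeness(records, "price")
```

In practice a framework like Great Expectations packages the same idea as declarative, versioned expectations rather than ad-hoc functions; the sketch above only shows the shape of the signal a check emits.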

What You Will Do

  • Lead the evolution of BlackRock's data quality framework as a strategic platform capability for validating and monitoring data across the Aladdin Data ecosystem.
  • Define the technical direction for a metadata-driven framework that supports reusable quality rules, policy enforcement, exception handling, quality scoring, and domain-level service standards.
  • Design and deliver controls that run within Airflow-orchestrated pipelines, enabling early detection of issues before they affect downstream systems or clients.
  • Build a strong operating model for observability, transparency, and remediation so producers and consumers can identify and resolve issues quickly.
  • Partner with engineering, product, governance, and business stakeholders to drive adoption, prioritization, and long-term roadmap execution.

Key Responsibilities

  • Own the target-state architecture for the Data Quality Framework, including rule execution patterns, validation layers, quality gates, exception workflows, and extensibility standards.
  • Build and scale platform services, libraries, and APIs for rule authoring, execution, scoring, auditability, and quality SLA/SLO reporting across datasets and domains.
  • Develop controls across core quality dimensions, including completeness, accuracy, timeliness, consistency, validity, uniqueness, and referential integrity.
  • Design and implement profiling, anomaly detection, and drift detection capabilities covering schema changes, null patterns, distribution shifts, outliers, volume trends, and freshness checks.
  • Implement reconciliation and financial control patterns such as source-to-target checks, row-count balancing, aggregate validation hashes, and critical total checks.
  • Drive adoption of Great Expectations and custom Python operators to standardize how assertions are defined, executed, versioned, and reused across pipelines.
  • Integrate the framework into Airflow-based data pipelines so checks run at the right control points, with meaningful alerting and triage.
  • Establish metadata-driven rule management, including ownership, lineage, versioning, parameterization, execution history, and audit-ready evidence.
  • Optimize framework performance across high-volume environments, particularly Snowflake and MSSQL, balancing control rigor with runtime efficiency.
  • Create clear visibility for downstream platforms, internal users, and clients through dashboards, scorecards, status indicators, and actionable exception reporting.
  • Mentor engineers and act as a senior technical leader who can make pragmatic architecture decisions while staying hands-on when needed.
  • Influence enterprise standards for trusted data consumption in partnership with data governance, platform engineering, and product teams.
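The reconciliation patterns named above (source-to-target checks, row-count balancing, aggregate validation hashes) can be sketched in a few lines of plain Python. The function, field names, and record shapes below are hypothetical, assumed only for illustration:

```python
import hashlib

def reconcile(source_rows, target_rows, amount_field="notional"):
    """Compare row counts, summed totals, and an order-insensitive content
    hash between a source extract and its target load. Illustrative only;
    a production control would also track tolerances and evidence."""

    def total(rows):
        return sum(r[amount_field] for r in rows)

    def content_hash(rows):
        # Hash each row's sorted key/value pairs, then hash the sorted
        # digests so the result is independent of row order.
        digests = sorted(
            hashlib.sha256(repr(sorted(r.items())).encode()).hexdigest()
            for r in rows
        )
        return hashlib.sha256("".join(digests).encode()).hexdigest()

    return {
        "row_count_match": len(source_rows) == len(target_rows),
        "total_match": abs(total(source_rows) - total(target_rows)) < 1e-6,
        "hash_match": content_hash(source_rows) == content_hash(target_rows),
    }

# Same two trades loaded in a different order still reconcile cleanly.
src = [{"trade_id": 1, "notional": 100.0}, {"trade_id": 2, "notional": 250.0}]
tgt = [{"trade_id": 2, "notional": 250.0}, {"trade_id": 1, "notional": 100.0}]
report = reconcile(src, tgt)
```

Checks like these typically run as quality gates at pipeline control points, with a failed reconciliation blocking downstream publication and opening an exception for triage.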

Required Qualifications

  • At least 10 years of experience in backend, data platform, or data engineering roles, with a strong record of hands-on technical delivery.
  • Deep expertise in Python and experience building reusable engineering frameworks, services, or platform capabilities.
  • Strong experience with workflow orchestration and pipeline integration, ideally with Airflow in complex enterprise environments.
  • Proven experience designing and implementing data quality controls across batch and/or near-real-time data pipelines.
  • Strong understanding of enterprise data quality operating models, including SLAs and SLOs, exception handling, issue triage, and remediation workflows.
  • Hands-on experience with Great Expectations or similar data quality frameworks, with the ability to extend them through custom engineering patterns.
  • Strong proficiency with Snowflake and/or MSSQL, including query tuning, scalable control execution, and performance optimization.
  • Experience with metadata-driven platform design, data modeling, lineage traceability, and audit-ready control frameworks.
  • Demonstrated ability to make architecture decisions, influence platform direction, and communicate effectively with senior technical and business stakeholders.
  • Bachelor's or Master's degree in Computer Science, Engineering, or a related discipline, or equivalent practical experience.

Preferred Qualifications

  • Experience building enterprise data quality, observability, or control frameworks in financial services or similarly regulated environments.
  • Exposure to FastAPI or similar API frameworks for building platform services and developer-facing capabilities.
  • Experience with event-driven or streaming architectures such as Kafka for near-real-time quality detection patterns.
  • Familiarity with Docker, Kubernetes, CI/CD pipelines, and modern software delivery practices.
  • Understanding of domain-driven design, data product ownership, and platform adoption across large organizations.
  • Experience leading cross-functional initiatives that require both deep technical execution and strong stakeholder management.

Technical Skills

Languages & Frameworks

Python, custom Python operators, FastAPI, Great Expectations, DBT

Orchestration & Pipelines

Airflow, ETL/ELT pipelines, batch and near-real-time validation patterns

Data Platforms

Snowflake, MSSQL, large-scale relational and analytical data stores

Quality Capabilities

Profiling, validation, reconciliation, anomaly detection, quality gates, scorecards, SLAs/SLOs

Platform Design

Metadata-driven frameworks, rule catalogs, versioning, audit trails, observability, lineage integration

Engineering Practices

API design, distributed systems, performance tuning, Docker, Kubernetes, CI/CD

Why This Role Matters

  • This role is central to strengthening trust in the data that powers Aladdin and BlackRock's broader data ecosystem.
  • You will shape a platform that improves decision quality, reduces operational risk, and increases transparency for internal consumers and clients.
  • You will define standards, influence architecture, and build a durable capability that scales with the firm's growing data needs.

Our benefits

To help you stay energized, engaged, and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents, and Flexible Time Off (FTO) so you can relax, recharge, and be there for the people you care about.

Our hybrid work model

BlackRock's hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person, aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock.

About BlackRock

At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients and the people they serve are saving for retirement, paying for their children's educations, buying homes, and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress.

This mission would not be possible without our smartest investment: the one we make in our employees. It's why we're dedicated to creating an environment where our colleagues feel welcomed, valued, and supported, with networks, benefits, and development opportunities to help them thrive.

For additional information on BlackRock, please visit Twitter (@blackrock) or LinkedIn. BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation, and other attributes protected at law.


Required Experience:

Exec


About Company


BlackRock is one of the world’s preeminent asset management firms and a premier provider of investment management. Find out more information here.
