Job Description: Staff Data Engineer NPS Prism
Location: India (Remote/Hybrid)
Experience: 6–8 Years
Employment Type: Full-time
Role Overview
We are seeking an experienced Staff Data Engineer to join the NPS Prism engineering team. NPS Prism is a Bain platform that provides advanced analytics, benchmarking, and insights into customer experience metrics across industries.
As a senior technical leader, you will be responsible for designing, building, and optimizing large-scale, high-performance data pipelines and architectures that power NPS Prism's analytics and client-facing applications. This role requires deep Databricks expertise, proficiency in Python, SQL, and PySpark, and the ability to work across cloud-native environments (Azure, AWS, or GCP).
You'll collaborate closely with data scientists, product managers, and business stakeholders to shape and execute the platform's data strategy, ensuring data quality, scalability, and reliability at enterprise scale.
Key Responsibilities
Data Architecture & Engineering Leadership
Design and own scalable data architectures for ingestion, transformation, and analytics on Databricks.
Build robust ETL/ELT pipelines using PySpark, SQL, and Databricks Workflows.
Lead performance tuning, partitioning, and data optimization across large distributed systems.
Mentor junior data engineers and enforce best practices for code quality, testing, and version control.
Cloud & Platform Engineering
Develop and maintain data lakes and data warehouses on cloud platforms (Azure Data Lake, AWS S3, GCP BigQuery, etc.).
Utilize Azure Data Factory, AWS Glue, or similar orchestration tools to manage large-scale data workflows.
Integrate multiple data sources (structured, semi-structured, and unstructured) into unified models for NPS Prism analytics.
Databricks & Advanced Analytics Enablement
Leverage Databricks for large-scale data processing, Delta Lake management, and ML/AI enablement.
Drive the adoption of Unity Catalog and other Databricks governance and performance features.
Partner with analytics teams to enable seamless model training and inference pipelines on Databricks.
Data Quality, Observability & Governance
Define and implement frameworks for data validation, monitoring, and error handling.
Collaborate with platform teams to establish data lineage and governance using tools like Great Expectations, Monte Carlo, or Databricks-native observability.
Ensure compliance with Bain's data security and privacy standards.
DevOps & CI/CD for Data
Implement CI/CD pipelines for data code deployments using Git, Azure DevOps, or Jenkins.
Automate testing, deployment, and monitoring for data workflows to ensure reliability and repeatability.
Cross-Functional Collaboration
Work with product and business teams to translate analytical requirements into scalable technical designs.
Collaborate with Data Science and BI teams to deliver analytics-ready datasets for dashboards and models.
Serve as a technical advisor in architectural reviews and strategic data initiatives within NPS Prism.
Required Skills and Qualifications
Core Technical Expertise:
Advanced proficiency in Databricks (mandatory).
Strong command of Python, SQL, and PySpark for big data processing.
Experience with Delta Lake, Spark optimization, and cluster management.
Hands-on experience with ETL/ELT design and data lake and warehouse architecture.
Cloud expertise in Azure AWS or GCP (Azure preferred).
Leadership & Architecture:
6–8 years of data engineering experience, with at least 3 years in a lead or staff-level role.
Proven ability to design end-to-end data solutions and influence engineering best practices.
Strong mentorship and stakeholder management skills.
Additional Desirable Skills:
Familiarity with streaming frameworks (Kafka, Event Hubs).
Understanding of data modeling and BI integration (Power BI, Tableau).
Exposure to DevOps, CI/CD pipelines, and Infrastructure as Code (IaC).
Strong problem-solving and analytical skills.
Educational Qualifications
Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
Required Experience:
Staff IC
NPS Prism provides actionable voice-of-the-customer insights from your own and your competitors’ customers. Start making the improvements they care about most.