Senior Director, Data Engineering

Dentsu

Job Location: Detroit, MI - USA

Monthly Salary: Not Disclosed
Posted on: Yesterday
Vacancies: 1 Vacancy

Job Summary

Job Description:

The Senior Director, Data Engineering role encompasses the responsibilities and qualifications outlined below.

Responsibilities

  • Build, scale, and maintain robust data pipelines/models using DBT, Python, PySpark, Databricks, and SQL, integrating AI-first foundations and semantic layers for consistent data interpretation.

  • Design and manage semantic models, star schemas, ontologies, taxonomies, knowledge graphs, and glossaries using DBT YAML, GitHub, Unity Catalog, Fabric/OneLake, and Power BI for unified understanding and AI reasoning.

  • Utilize low-code/no-code tools (Trifacta, DBT, Power BI, Tableau, Fabric/OneLake, Copilot Studio) to build governed semantic layers supporting natural language querying, vector search, and hybrid AI indexing.

  • Own AI deployment pipelines with containerized agents and automation using Kubernetes, n8n, LangChain, Azure AI Foundry, and Copilot Platform (MCP) for multi-step retrieval, summarization, and notifications.

  • Strengthen AI accuracy and governance via metadata, access controls, and grounding (vector databases, search indexes, knowledge graphs) to deliver reliable responses, source citation, and "why" reasoning.

  • Design modular, reusable data models for analytics, reporting, AI enablement, and agentic apps, including LLM integration for intent parsing, routing, retrieval, and synthesis.

  • Develop and monitor mapping tables, validation rules, lineage, error logging, and observability for ETL/ELT health, data integrity, schema control, and real-time quality monitoring.

  • Collaborate with analysts, engineers, and stakeholders to transform raw data into governed datasets, leveraging Adverity for multi-source integration and normalization.

  • Implement agentic AI and Copilot integrations to enhance data accessibility, autonomous resolution, and dynamic insights across processes.

  • Drive innovation in the Data Quality Suite roadmap, including real-time monitoring, dynamic interfaces, self-serve tools, and AI-enhanced features for scalability.

  • Contribute to medallion architecture (bronze/silver/gold) best practices for reusable components, semantic layer extension (e.g., RAG indexing), and AI infrastructure.

  • Manage Databricks Unity Catalog, Workflows, SQL Analytics, Notebooks, and Jobs for governed analytics and ML workflows.

  • Develop pipelines/tools with Microsoft Fabric, Power BI, Power Apps, Azure Data Lake/Blob, and Copilot Studio, tied to GitHub, n8n, and Kubernetes orchestration.

  • Leverage GitHub and GitHub Copilot for version control, CI/CD automation, code suggestions, and collaboration on SQL, Python, YAML, and agent development.

  • Utilize Java or Scala for custom processing scripts, scalable ingestion, and advanced AI actions like code execution and vector search.

Qualifications

  • 8 years of experience as a Data Engineer or in a similar role building scalable data infrastructure, with at least 2 years focused on AI-integrated systems, semantic layers, or agentic AI deployments.

  • Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field required.

  • Advanced expertise in SQL, Python, and DBT; strong experience with PySpark, Databricks, and semantic layer tools like DBT YAML, Unity Catalog, and knowledge graphs required.

  • Hands-on experience with ETL/ELT design tools like Trifacta (Alteryx), Adverity, Azure Data Factory, Fabric/Power BI DAX, or similar, including data normalization and workflow automation.

  • Proven experience building and extending semantic layers for AI applications, including ontologies, taxonomies, vector databases, and integration with LLMs for enhanced reasoning, accuracy, and "why" question resolution.

  • Deep experience in the Microsoft Tech Data Stack, including Power BI, Power Apps, Fabric/OneLake, Azure Data Lake Storage (ADLS Gen2), Azure Blob Storage, Copilot Studio, and Azure AI Foundry for ModelOps and intelligent actions.

  • Experience with AI deployment and orchestration tools such as Kubernetes, n8n, LangChain, and the Model Context Protocol (MCP) for containerized agents, multi-step workflows, and governance.

  • Strong experience in developing and managing API endpoints, integrating with external systems, and supporting LLM access for conversational AI and automation.

  • Proficiency in Java or Scala for large-scale data processing, ingestion workflows, and custom AI integrations.

  • Experience supporting data observability and quality frameworks (e.g., unit tests, reconciliation logic, job monitoring) and AI governance (e.g., metadata, embedding, compliance rules).

  • Strong familiarity with Git-based development, GitHub Copilot for AI-assisted coding, and structured code collaboration in environments like DBT Cloud and GitHub Actions.

  • Act quickly and independently, demonstrating a self-starter mindset and a proven ability to learn new tools and technologies on the fly while delivering scalable solutions using any combination of tools in our tech stack to drive continuous improvement and impact.

  • Bonus: Exposure to building tools in Microsoft Power Apps or other low-code platforms, including Copilot integrations for monitoring and workflows.

  • Bonus: Experience in advertising, marketing, or digital media environments, particularly with use cases like performance reporting, reconciliation automation, or brand visibility optimization.

The annual salary range for this position is $113,000-$182,850. Placement within the salary range is based on a variety of factors, including relevant experience, knowledge, skills, and other factors permitted by law.

Benefits available with this position include:

  • Medical, vision, and dental insurance
  • Life insurance
  • Short-term and long-term disability insurance
  • 401(k)
  • Flexible paid time off
  • At least 15 paid holidays per year
  • Paid sick and safe leave
  • Paid parental leave

Dentsu also complies with applicable state and local laws regarding employee leave benefits, including but not limited to providing time off pursuant to the Colorado Healthy Families and Workplaces Act, in accordance with its plans and policies. For further details regarding Dentsu benefits, please visit .

To begin the application process, please click on the Apply button at the top of this job posting. Applications will be reviewed on an ongoing basis, and qualified candidates will be contacted for next steps.

At dentsu, we believe great work happens when we're connected. Our way of working combines flexibility with in-person collaboration to spark ideas and strengthen our teams. Employees who live within a commutable distance of one of our hub offices, currently located in Chicago, metro Detroit, Los Angeles, and New York City, are required and expected to work from the office three days per week (two days per week for employees based in Los Angeles). Dentsu may designate other Hub offices at any time. Those who live outside a commutable range may be designated as remote depending on the role and business needs. Regardless of your work location, we expect our employees to be flexible to meet the needs of our Company and clients, which may include attendance in an office.

#LI-AB2

#LI-Remote

Location:

New York

Brand:

Dentsu Media

Time Type:

Full time

Contract Type:

Permanent

Dentsu is committed to providing equal employment opportunities to all applicants and employees. We do this without regard to race, color, national origin, sex, sexual orientation, gender identity, age, pregnancy, childbirth or related medical conditions, ancestry, physical or mental disability, marital status, political affiliation, religious practices and observances, citizenship status, genetic information, veteran status, or any other basis protected under applicable federal, state, or local law.

Dentsu is committed to providing reasonable accommodation to, among others, individuals with disabilities and disabled veterans. If you need an accommodation because of a disability to search and apply for a career opportunity with us, please send an e-mail by clicking on the link to let us know the nature of your accommodation request and your contact information. We are here to support you.


Required Experience:

Exec


Key Skills

  • Go
  • Lean
  • Management Experience
  • React
  • Node.js
  • Operations Management
  • Project Management
  • Research & Development
  • Software Development
  • Team Management
  • GraphQL
  • Leadership Experience