1 Vacancy
This position requires the selected candidate to be on-site 2-3 days a week.
The Director, Data Integration and Interoperability, will lead enterprise-wide data ingestion, transformation, and delivery initiatives in a Databricks-centered environment. This role is critical in designing and operating robust data integration pipelines across internal and external sources, enabling access via APIs, lake federation, and other mechanisms, and ensuring that data is FAIR (Findable, Accessible, Interoperable, Reusable).
This position will design and oversee ELT and lake federation processes, as well as interoperability with MDM, data governance, DevOps, MLOps, and AIOps. In addition, it will be responsible for enabling dimensional data modeling to support scalable analytics and business intelligence solutions. The role requires both technical leadership and strong collaboration with engineering, data governance, analytics, and business teams.
Duties and Responsibilities:
Data Ingestion and Integration
Design and oversee scalable ingestion pipelines from diverse internal and external data sources using Databricks Lakeflow Declarative Pipelines and Delta Lake.
Utilize tools such as Databricks Auto Loader, Python, Kafka, REST APIs, SQL, and cloud-native connectors for real-time and batch data flows.
Establish standardized ingestion and orchestration patterns based on data-provider contracts, ensuring SLAs, quality, observability, and operational reliability.
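For illustration, a standardized Auto Loader ingestion pattern of the kind described above might be sketched as follows. This is a minimal sketch, assuming a Databricks workspace: the `cloudFiles.*` option names are real Auto Loader settings, but the helper function and all paths are hypothetical.

```python
# Minimal sketch of a standardized ingestion pattern (hypothetical helper):
# centralize Databricks Auto Loader (cloudFiles) options so every pipeline
# derived from a data-provider contract uses the same conventions.

def autoloader_options(source_format: str, schema_location: str) -> dict:
    """Return standard cloudFiles options for an Auto Loader ingest."""
    return {
        "cloudFiles.format": source_format,            # e.g. "json", "csv"
        "cloudFiles.schemaLocation": schema_location,  # schema tracking/evolution
        "cloudFiles.inferColumnTypes": "true",
    }

# On a Databricks cluster this would be used roughly as (paths illustrative):
#   (spark.readStream.format("cloudFiles")
#        .options(**autoloader_options("json", "/mnt/schemas/orders"))
#        .load("/mnt/landing/orders")
#        .writeStream.option("checkpointLocation", "/mnt/chk/orders")
#        .toTable("bronze.orders"))
```

Keeping the contract-driven options in one place is what makes SLAs and observability enforceable across pipelines, since every ingest can then be audited against the same configuration surface.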
Data Delivery and Federation
Ensure delivery of data to consumer applications and systems, maintaining the SLAs specified in data contracts.
Implement lake federation strategies to unify access across cloud platforms and on-premises systems.
Lead implementation and operation of the API Store, managing delivery of curated and governed data sets via REST APIs, SQL endpoints, and federated access layers with fine-grained access control.
Ensure scalable, secure, and performant data access for downstream analytics, reporting, and machine learning.
Interoperability with Enterprise Data Platform components
Implement data interoperability between the data Lakehouse and the Master Data Management (MDM) system (i.e., Profisee) in support of federated data stewardship and quality processes.
Collaborate with data governance and security teams to ensure proper metadata management, lineage tracking, and compliance with data access, archival, and regulatory policies and regulations (e.g., FERPA, GDPR, PCA) managed in a Purview environment.
Enable interoperability with MLOps and AIOps platforms to streamline model deployment, monitoring, and lifecycle management.
Dimensional Modeling and Analytics Support
Guide data engineering teams in designing and implementing dimensional models to support all enterprise reporting and analytics needs, including AI/BI, BI, reverse ETL, and lake federation.
Ensure data models align with business definitions, support high-performance queries, and integrate cleanly with semantic layers, data quality tooling, and data consumer tools and processes.
Partner with analytics teams to identify modeling needs and implement scalable, reusable data structures.
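The dimensional models referred to above typically separate descriptive attributes (dimensions) from measurable events (facts) joined by surrogate keys. A toy illustration in plain Python, with entirely hypothetical table names and data:

```python
# Toy star schema: a fact table referencing a dimension by surrogate key.
# All names and values are illustrative, not from any real system.
dim_course = {
    1: {"course": "STAT-200", "school": "UMGC"},
    2: {"course": "CMSC-330", "school": "UMGC"},
}
fact_enrollment = [
    {"course_key": 1, "enrolled": 40},
    {"course_key": 1, "enrolled": 25},
    {"course_key": 2, "enrolled": 30},
]

def enrollments_by_course(facts, dim):
    """Aggregate a fact measure, grouped via a dimension lookup."""
    totals = {}
    for row in facts:
        name = dim[row["course_key"]]["course"]
        totals[name] = totals.get(name, 0) + row["enrolled"]
    return totals
```

In a Lakehouse the same shape would be expressed as Delta tables with a SQL join and GROUP BY; the point is that facts stay narrow and additive while dimensions carry the business definitions that semantic layers consume.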
Metadata Management and Governance
Lead the implementation and automation of metadata management practices across all data pipelines and assets.
Collaborate on the development of metadata stores in Unity Catalog to ensure consistent documentation, lineage tracking, and impact analysis.
Ensure technical, business, and operational metadata are captured and maintained for all datasets and records.
Collaborate with data governance teams and stakeholders to enforce metadata standards, naming conventions, and classification policies within the enterprise data platform and across enterprise systems.
Support discovery, reuse, and transparency of data assets through well-governed metadata practices.
Lead implementation of observability tools and reports.
Leadership and Strategy
Build and lead a team of data integration engineers and architects; provide mentoring, technical guidance, and career development.
Define roadmaps and execution plans for data interoperability and integration capabilities.
Manage project delivery timelines budgets and cross-functional dependencies.
Engage with business and technical stakeholders across the institution as needed.
Skills:
Technical Skills
Expertise in Databricks, Apache Spark, Delta Lake, SQL, Python, and Erwin Data Modeler.
Strong understanding of cloud data platforms, preferably Azure and AWS.
Experience with dimensional modeling (Kimball, Inmon); Linstedt data vault modeling a plus.
Proficiency in API development, real-time streaming, and batch data processing.
Expertise in Databricks with emphasis on Unity Catalog, Lakeflow, Asset Bundles, Auto Loader and Pipelines, REST APIs, and Lake Federation.
Working knowledge of MDM platforms, data cataloging (e.g., Alation, Collibra), and data lineage and governance tools.
Integration experience with MLflow and Neo4j.
Strong leadership and team management capabilities.
Excellent verbal and written communication skills.
Strategic thinking with a bias toward execution.
Ability to manage stakeholder expectations and align technical execution with business objectives.
Education & Experience Requirements:
Education
Bachelor's degree in Information Science, Computer Science, Data Engineering, Information Systems, or a related technical discipline.
Experience:
10 years of experience in data engineering, data integration and interoperability, or related roles.
3 years in leadership positions leading data engineering and integration teams.
Demonstrated experience leading implementation and operations in a Databricks environment with high hands-on involvement.
Proven track record of developing and implementing dimensional modeling in data Lakehouse environments.
Experience integrating data platforms with enterprise MDM, governance, MLOps, and AIOps tools.
Experience working in regulated, high-volume, multi-tenant environments.
All submissions should include a cover letter and resume.
The University of Maryland Global Campus (UMGC) is an equal opportunity employer and complies with all applicable federal and state laws regarding nondiscrimination. UMGC is committed to a policy of equal opportunity for all persons and does not discriminate on the basis of race, color, national origin, age, marital status, sex, sexual orientation, gender identity, gender expression, disability, religion, ancestry, political affiliation, or veteran status in employment, educational programs and activities, and admissions.
Workplace Accommodations:
The University of Maryland Global Campus (UMGC) is committed to creating and maintaining a welcoming and inclusive working environment for people of all abilities. UMGC is dedicated to the principle that no qualified individual with a disability shall, based on disability, be excluded from participation in or be denied the benefits of the services, programs, or activities of the University, or be subjected to discrimination. For information about UMGC's Reasonable Workplace Accommodation Policy, or to request an accommodation, applicants/candidates can contact Employee Accommodations via email at.
Benefits Package Highlights:
Hiring Range:
$186000.00 - $211000.00
Required Experience:
Director
Full-Time