About Infinitive
Infinitive is a data & AI consultancy that enables global brands to deliver results through insights, innovation, and efficiency. We possess deep industry and technology expertise to drive and sustain adoption of new capabilities. We match our people and personalities to our clients' culture while bringing the right mix of talent and skills to enable a high return on investment.
Infinitive has been named one of Consulting Magazine's Best Small Firms to Work For eight times, and has also been named a Washington Post Top Workplace, a Washington Business Journal Best Place to Work, and a Virginia Business Best Place to Work.
Role Overview
This architect will define and shape a unified platform service that enables scalable, governed, and cost-efficient data access across the bank. The ideal candidate will influence enterprise design standards and technical adoption by making Databricks Unity Catalog the effortless, observable, and default foundation for data integration, governance, and analytics across all business domains.
Key Responsibilities
Platform Vision & Architecture
- Define and champion the end-to-end architecture for the bank's Databricks-based data platform, ensuring scalability, security, cost efficiency, and ease of adoption.
- Design a self-service platform layer that leverages Databricks Unity Catalog to deliver seamless data discovery, access, and observability across all environments.
- Establish architectural patterns and reference implementations that encourage enterprise-wide reuse and standardization.
Unity Catalog Strategy & Enablement
- Lead the design and implementation of Databricks Unity Catalog as the central governance plane, defining catalog hierarchies, fine-grained access controls, and cross-environment lineage.
- Evaluate and implement metadata, RBAC/ABAC, and data masking capabilities to meet regulatory and compliance requirements (e.g., GLBA, GDPR, HIPAA).
- Define the template architecture that allows Unity Catalog to operate as a scalable and cost-effective shared service across lines of business.
Scalability, Cost, and Observability
- Engineer platform capabilities that provide deep visibility into compute, storage, and catalog operations through integrated observability, monitoring, and FinOps practices.
- Develop resource optimization strategies to balance performance and cost while maintaining compliance and SLAs.
- Establish metrics, dashboards, and alerts to ensure the platform scales predictably under enterprise workloads.
API and Integration Design
- Architect streamlined RESTful/GraphQL APIs for secure, governed data access and metadata integration.
- Ensure interoperability with enterprise systems, APIs, and external data consumers using modern, consistent, and documented integration patterns.
Data Modeling & Pipeline Strategy
- Guide teams in building Lakehouse-aligned data models that maximize reuse and governance.
- Oversee the design of ETL/ELT architectures (Spark, PySpark, SQL) that integrate seamlessly with Unity Catalog for lineage and access tracking.
Collaboration & Influence
- Partner with engineering, data science, and risk teams to align platform design with business outcomes and regulatory expectations.
- Influence architecture steering committees and platform engineering groups to adopt the Databricks foundation as a managed enterprise-wide service.
- Promote a culture of easy adoption through clear design patterns, documentation, and working sessions.
Technical Leadership & Mentorship
- Mentor engineers and architects on Databricks Unity Catalog and best practices for cost, scale, and observability.
- Contribute to internal architecture communities and upskill teams across multiple domains.
Required Skills & Qualifications
- Education: Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Experience: 8 years in data architecture or platform engineering, including experience designing enterprise-scale distributed data environments.
- Databricks Expertise: Deep hands-on knowledge of Databricks, Delta Lake, Apache Spark, and Lakehouse principles.
- Unity Catalog Mastery: Demonstrated success architecting and operationalizing Databricks Unity Catalog for enterprise governance, metadata management, and access control.
- Programming & Data: Advanced proficiency in Python (PySpark) and SQL; experience with cloud data platforms (AWS, Azure, or GCP).
- API Engineering: Strong background in API architecture (REST, GraphQL, OpenAPI) and in applying best-in-class security and observability.
- Governance Knowledge: Expert-level understanding of data governance frameworks, data quality management, and regulatory compliance.
- Soft Skills: Outstanding communication and influence skills, with the ability to advocate for design principles across executive, technical, and risk audiences.
Preferred Qualifications
- Experience deploying Databricks and cloud infrastructure using Terraform or other IaC frameworks.
- Familiarity with MLflow and model governance integration.
- Relevant certifications (Databricks Certified Data Engineer, AWS/Azure/GCP Architect).
- Experience with real-time data streaming technologies (Kafka, Structured Streaming).
Required Experience:
Senior IC