Databricks Specialist / Azure Databricks Consultant
Job Summary
Position Overview:
Arctiq is a leader in professional IT services and managed services across three core Centers of Excellence: Enterprise Security, Modern Infrastructure, and Platform Engineering. Renowned for our ability to architect intelligence, we connect, protect, and transform organizations, empowering them to thrive in today's digital landscape. Arctiq builds on decades of industry expertise and a customer-centric ethos to deliver exceptional value to clients across diverse industries.
We are seeking a highly skilled Databricks Specialist with deep expertise in building and deploying enterprise-grade data engineering solutions within Azure Databricks environments. This role will focus on modern data pipeline design, CI/CD enablement, declarative deployment frameworks, and scalable lakehouse architecture.
The ideal candidate will have hands-on experience with Databricks Asset Bundles (Declarative Automation Bundles), declarative data pipelines, Azure cloud services, Terraform, and modern data platform engineering best practices.
This resource will partner closely with platform, analytics, AI, and application teams to build reliable, scalable, and production-ready data solutions.
Core Technical Skills:
Strong hands-on experience with Azure Databricks
Proven expertise with Databricks Asset Bundles / Declarative Automation Bundles
Experience implementing Declarative Data Pipelines
Strong knowledge of Delta Lake, lakehouse architecture, and medallion design patterns
Experience building batch and streaming ETL/ELT pipelines
Proficiency in Python, PySpark, and SQL
Experience with Lakeflow / Delta Live Tables
Strong Azure cloud experience including:
  - Azure Data Lake Storage
  - Azure Key Vault
  - Azure networking and security fundamentals
Experience with Terraform (Infrastructure as Code)
Exposure to cloud architecture and modern data platform design
Experience with CI/CD pipeline implementation for Databricks deployments
Working knowledge of Git-based version control workflows
Experience with Unity Catalog, data governance, and access controls
Strong understanding of job orchestration, workflows, and scheduling
Key Responsibilities:
Design, develop, and optimize scalable Azure Databricks data pipelines
Implement Declarative Automation Bundles / Asset Bundles for repeatable multi-environment deployments
Build production-grade Lakeflow / Delta Live Tables pipelines
Develop reusable PySpark notebooks, workflows, and modular code packages
Create CI/CD deployment patterns using GitHub Actions
Manage infrastructure deployment patterns using Terraform and Databricks Asset Bundles
Implement best practices for:
  - monitoring
  - alerting
  - testing
  - observability
  - rollback strategies
Optimize data workloads for:
  - cost
  - cluster efficiency
  - job runtime
  - performance
Support governance using Unity Catalog, schema management, and permissions
Partner with architecture teams on enterprise lakehouse modernization initiatives
Support AI/ML enablement teams with curated feature and training data pipelines
Nice to Have:
Experience with data quality frameworks
Knowledge of FinOps and Databricks cost optimization
Exposure to cross-cloud Databricks (AWS/GCP)
Certifications:
Databricks Certified Data Engineer (Associate or better)
Ideal Background:
4 years in data engineering or cloud data platform roles
2 years focused on Databricks implementations
Experience supporting enterprise data modernization initiatives
Strong understanding of DevOps and software engineering best practices for data platforms
Comfortable working in client-facing consulting or augmentation environments
Required Experience:
Senior IC
About Company
As a systems integrator and managed service provider, Arctiq provides Hybrid Cloud Infrastructure, Networking, Cybersecurity, Data and AI, Autonomous Operations, and ESM to deliver measurable outcomes.