Databricks & GCP Data Platform Architect
Job Summary
We are looking for a hands-on Databricks & GCP Data Platform Architect who will design and personally implement scalable Lakehouse solutions on Google Cloud Platform (GCP).
This role requires deep technical involvement, including building pipelines, configuring Databricks, and troubleshooting production issues, in addition to architecture ownership.
Key Responsibilities
1. Architecture & Hands-on Implementation
- Design end-to-end Databricks Lakehouse architecture on GCP
- Hands-on implementation of:
- Databricks workspaces, clusters, jobs, and workflows
- Delta Lake-based Bronze / Silver / Gold data layers
- Batch and streaming pipelines using Spark and Databricks
- Create reference implementations and reusable frameworks for teams
- Actively participate in code reviews and production deployments
2. Data Engineering (Hands-on)
- Build and optimize Spark jobs and Databricks notebooks
- Implement ingestion pipelines from:
- Databases and enterprise applications
- Streaming sources (Pub/Sub, Kafka)
- External and SaaS systems
- Perform performance tuning and cost optimization
- Troubleshoot pipeline failures and production issues directly
3. Security, Governance & Compliance
- Implement (not just define) governance using Unity Catalog
- Configure access control integrated with GCP IAM
- Set up secure networking (VPC, private endpoints)
- Enable audit logging, lineage, and data classification
- Work closely with security teams to operationalize standards
4. DevOps, Automation & Operations (Hands-on)
- Build CI/CD pipelines for Databricks notebooks, jobs, and configs
- Implement Infrastructure as Code using Terraform
- Set up monitoring, alerting, and operational dashboards
- Participate in production support, root-cause analysis, and fixes
- Drive hands-on cost optimization initiatives
5. Stakeholder Collaboration
- Translate business requirements into implemented solutions
- Guide and mentor data engineers through code-level support
- Conduct architecture and code reviews
- Act as a technical owner from design through production
Required Skills & Experience
Must Have
- Strong hands-on experience with Databricks (Apache Spark)
- Proven experience building and deploying Lakehouse architectures
- Hands-on experience with GCP including:
- Google Cloud Storage (GCS)
- BigQuery
- Pub/Sub
- IAM & VPC basics
- Experience implementing batch and streaming pipelines
- Strong troubleshooting and production support skills
Good to Have
- Unity Catalog, Delta Live Tables
- CI/CD, Git, Terraform
- Exposure to MLflow and Vertex AI
- Multi-cloud Databricks experience (Azure / AWS)
Qualifications:
- 8–12 years of experience in data engineering / data platforms
- 3 years in a hands-on architect or senior technical lead role
Additional Information:
All your information will be kept confidential according to EEO guidelines.
Remote Work:
No
Employment Type:
Full-time
About Company
Sutherland is seeking an organized and reliable person to join us as a Databricks & GCP Data Platform Architect. We are a group of driven and supportive individuals. If you are looking to build a fulfilling career and are confident you have the skills and experience to help us succeed, we want to work with you.