Azure Databricks Architect
Experience: 6-8 Years
Role Summary
We are seeking an experienced Azure Databricks Architect to design, implement, and optimize a scalable data analytics platform. The role requires deep expertise in data engineering, big data processing, and cloud architecture to deliver secure, high-performance, and cost-efficient data solutions.
Key Responsibilities
- Design and implement scalable data pipelines and infrastructure using Databricks.
- Architect end-to-end data solutions integrating Databricks with Microsoft Azure, Amazon Web Services, or Google Cloud Platform.
- Collaborate with data scientists, engineers, and business stakeholders to translate requirements into technical solutions.
- Optimize and tune Databricks clusters for performance, scalability, and cost efficiency.
- Implement data security, governance, and compliance controls.
- Define architecture standards, best practices, and documentation.
- Troubleshoot and resolve platform and integration issues.
Required Qualifications
- Proven experience architecting large-scale solutions using Databricks.
- Strong expertise in Apache Spark and its ecosystem.
- Proficiency in Python, Scala, or SQL.
- Hands-on experience with cloud platforms such as Microsoft Azure, Amazon Web Services, or Google Cloud Platform.
- Experience in data warehousing, ETL pipelines, and data modeling.
- Strong understanding of data governance, security, and compliance.
Core Competencies
Data Architecture, Big Data Processing, Cloud Integration, Performance Optimization, Data Governance, Cost Optimization, Stakeholder Collaboration