Overview
We are seeking a skilled Databricks Architect to lead the design, implementation, and optimization of our data analytics platform on Databricks. The ideal candidate will have deep expertise in data engineering, cloud infrastructure, and big data technologies, with a strong focus on delivering scalable, secure, and high-performance data solutions.
Key Responsibilities
- Design and implement scalable, efficient data pipelines and infrastructure on Databricks.
- Architect end-to-end data solutions that integrate Databricks with cloud platforms (AWS, Azure, GCP).
- Collaborate with data scientists, data engineers, and business stakeholders to understand data requirements and translate them into technical solutions.
- Optimize and tune Databricks clusters to improve performance and reduce costs.
- Ensure data security, governance, and compliance within the Databricks environment.
- Develop best practices, standards, and documentation for Databricks architecture and usage.
- Troubleshoot and resolve technical issues related to the Databricks platform and its integrations.
Qualifications
- Proven experience architecting large-scale data solutions on Databricks.
- Strong knowledge of Apache Spark and its ecosystem.
- Proficiency in programming languages such as Python, Scala, or SQL.
- Experience with cloud platforms such as AWS, Azure, or Google Cloud Platform.
- Familiarity with data warehousing, ETL processes, and data modeling.
- Strong understanding of data security, compliance, and governance practices.