Job Title
Senior Data Architect (Databricks)
Location
- Work Location: Offshore
- Industry: IT Services
Mandatory Skills
- Databricks
- Python
- PySpark
Job Summary
We are looking for a highly experienced Senior Data Architect with strong hands-on expertise in Databricks, Python, and PySpark. The ideal candidate will have a solid background in designing and implementing scalable data platforms, strong problem-solving abilities, and extensive experience working on complex data engineering solutions.
Roles and Responsibilities
- Design, architect, and implement end-to-end data solutions using Databricks.
- Lead the development of scalable, high-performance data pipelines using PySpark and Python (see the illustrative sketch after this list).
- Architect and optimize data ingestion, transformation, and processing workflows.
- Provide technical leadership and architectural guidance to data engineering teams.
- Ensure best practices in data modeling, performance tuning, and cost optimization on Databricks.
- Work closely with stakeholders to understand business requirements and translate them into technical solutions.
- Troubleshoot and resolve complex data and performance issues across the platform.
- Ensure data quality, reliability, security, and compliance standards are met.
- Review code, mentor junior engineers, and drive engineering excellence across the team.
- Collaborate with cross-functional teams, including product, analytics, and DevOps.
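As context for the pipeline and ingestion responsibilities above, the following is a minimal illustrative PySpark sketch of the kind of Databricks workload this role covers; the landing path, column names, and target table are hypothetical examples, not details from this posting.

# Minimal illustrative PySpark sketch for Databricks; the landing path,
# column names, and target table below are hypothetical examples.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_ingestion").getOrCreate()

# Ingest raw CSV files, apply simple cleansing, and persist the result as a Delta table.
raw = spark.read.option("header", "true").csv("/mnt/raw/orders")

clean = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .filter(F.col("amount").isNotNull())
)

clean.write.format("delta").mode("overwrite").saveAsTable("analytics.orders")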
Required Qualifications
- 10 years of overall IT experience, with 8 years of relevant experience in data engineering and architecture.
- Strong hands-on experience with Databricks (mandatory).
- 4-5 years of strong programming experience in Python and PySpark (mandatory).
- Proven experience in designing large-scale distributed data processing systems.
- Excellent problem-solving and analytical skills.
- Strong communication skills and the ability to work in offshore delivery models.
Desired Skills (Good to Have)
- Experience with cloud platforms (AWS / Azure / GCP).
- Knowledge of data lakes, data warehousing, and big data ecosystems.
- Exposure to CI/CD pipelines and DevOps practices for data platforms.