Position Overview
We are seeking experienced Databricks Developers and Data Modelers with a minimum of 5 years of hands-on experience working with Databricks on cloud platforms such as AWS and Azure. In this role, you will design, develop, and optimize scalable data solutions that leverage Databricks to enable advanced analytics and data-driven decision making.
Key Responsibilities
- Design, develop, and maintain scalable ETL pipelines and data models using Databricks on AWS and Azure.
- Collaborate with data architects, analysts, and business stakeholders to understand requirements and deliver high-quality solutions.
- Optimize data workflows for performance, reliability, and cost-efficiency on cloud platforms.
- Implement best practices for data governance, security, and compliance.
- Troubleshoot and resolve issues related to data ingestion, transformation, and storage.
- Document technical designs, processes, and procedures.
Must-Have Skills
- Databricks Expertise: Extensive experience developing in Databricks, including notebooks, jobs, clusters, and workspace management.
- Cloud Platforms: Proven experience deploying and managing Databricks workloads on AWS and Azure.
- Data Modeling: Strong skills in designing and implementing dimensional and relational data models.
- ETL Development: Advanced knowledge of building scalable ETL pipelines using Spark (PySpark, Scala, or SQL).
- Programming: Proficiency in Python and/or Scala for data engineering tasks.
- SQL: Expert-level SQL for querying, transforming, and analyzing large datasets.
- Performance Optimization: Experience tuning Spark jobs and optimizing data workflows for cloud environments.
- Data Security: Knowledge of data governance, access control, and compliance standards.
- Collaboration: Strong communication skills and proven ability to work in cross-functional teams.
Good-to-Have Skills
- ML & Advanced Analytics: Experience integrating Databricks with machine learning frameworks (MLlib, TensorFlow, etc.).
- DevOps: Familiarity with CI/CD pipelines and automation tools for data deployments.
- Visualization: Experience with BI tools such as Power BI, Tableau, or Databricks dashboards.
- Streaming Data: Knowledge of real-time data processing (Structured Streaming, Kafka, etc.).
- Delta Lake: Hands-on experience with Delta Lake for ACID transactions and scalable data lakes.
- Infrastructure as Code: Experience with Terraform, ARM templates, or AWS CloudFormation.
- Cloud Cost Management: Understanding of cloud cost optimization and monitoring tools.
- Certifications: Relevant certifications in Databricks, AWS, or Azure are a plus.
Qualifications
- Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.
- Minimum 5 years of professional experience in Databricks development and data modeling on AWS and/or Azure.
- Excellent problem-solving and analytical skills.