Databricks Developer
Job Summary
Key Responsibilities
- Design, develop, and optimize data pipelines using Databricks
- Build scalable ETL/ELT solutions for large datasets
- Work closely with data engineers, architects, and stakeholders
- Ensure data quality, performance, and reliability
- Take ownership of deliverables end-to-end and drive solutions independently
- Participate in design discussions and technical decision-making
Required Skills & Experience
- Total Experience: Minimum 8 years
- Databricks: At least 5 years of hands-on experience (mandatory)
- Strong experience as a Data Engineer or Data Warehouse professional
- Proficiency in Apache Spark, SQL, and Python/Scala
- Experience working with large-scale data systems
- Strong understanding of data modeling, performance tuning, and optimization
- Exposure to cloud platforms (AWS / Azure preferred)
Behavioral & Professional Expectations
Good Communication Skills:
Ability to clearly communicate with both technical and non-technical stakeholders
Career Stability:
Demonstrates long-term commitment and consistent career progression
Ownership:
Takes full responsibility for assigned modules and delivers with accountability
***
Concord is an execution partner helping organizations drive digital transformation, modernization, and scalable technology solutions. We deliver results that solve real business challenges. We operate globally and are growing fast, shaping the future of technology. Join a team trusted by top companies to drive strategic growth and operational excellence!
Required Experience:
Senior IC
About Company
Concord is a technology consultancy blending style with substance to create flawless customer experiences backed by powerful analytics and underwritten by strong data foundations. With the refinement of an agency, the grit of a startup, and the experience of an institution, we create ...