Location: Toronto, ON (Hybrid / 4 days onsite)
Job Type: Full-Time Permanent
Primary skillset: Azure Databricks, Apache Spark
Secondary skillset: SQL, Python, Data Warehousing
Job Description:
Design scalable, secure, and high-performance cloud solutions using Microsoft Azure services.
Design large-scale data processing and analytics solutions using Azure Databricks.
Develop comprehensive architecture blueprints and technical documentation for data solutions.
Lead the implementation of Databricks-based solutions, ensuring alignment with best practices and business requirements.
Lead the design and build of Data Mesh/data product architectures within Databricks-based solutions.
Provide technical leadership and guidance throughout the project lifecycle.
Lead the implementation of Azure-based solutions, ensuring alignment with best practices and business requirements.
Collaborate with stakeholders to gather requirements and translate them into technical solutions.
Design and implement CI/CD pipelines using Azure DevOps.
Ensure robust security measures are in place, including identity management, encryption, and network security.
Optimize cloud resources for performance, cost, and reliability.
Stay up to date with the latest Azure services, features, and industry trends.
Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 10-15 years of experience in data engineering or a related field.
- Extensive experience with Databricks and Apache Spark.
- Proficiency in programming languages such as Python.
- Strong knowledge of SQL and experience with relational databases.
- Experience with cloud platforms, especially Azure.
- Strong experience with data warehousing solutions (e.g., Delta Lake, Lakehouse).
- Understanding of data governance and security best practices.
- Excellent problem-solving skills and attention to detail.
- Strong communication and teamwork skills.