Azure Data Solutions Architect
Job Summary
Job Title: Technical Architect / Azure Data Solutions Architect
Location: Petaling Jaya, Malaysia (Hybrid)
Role Type: Permanent
Responsibilities:
Lead the architecture design and implementation of advanced analytics solutions using Azure Databricks and Microsoft Fabric.
Develop scalable data solutions with a strong focus on big data technologies, data engineering, cloud computing, and SQL.
Collaborate closely with business stakeholders and IT teams to gather requirements and deliver effective solutions.
Oversee end-to-end implementation of data solutions, ensuring alignment with business needs and best practices.
Lead the development of data pipelines and ETL processes using Azure Databricks, PySpark, and related tools.
Integrate Azure Databricks with Azure services (Azure Data Lake, Azure Synapse, Azure Data Factory) and on-premises systems.
Provide technical leadership and mentor data engineering teams, fostering continuous learning and improvement.
Ensure proper documentation of architecture, processes, and data flows, adhering to security and governance standards.
Enforce best practices in code quality, data security, scalability, and performance.
Stay up to date with the latest developments in Databricks and related technologies to drive innovation.
Essential Skills:
Strong experience with Azure Databricks, including cluster management, notebook development, and Delta Lake.
Proficiency in big data technologies (e.g. Hadoop, Spark) and data processing frameworks (e.g. PySpark).
Deep understanding of Azure services such as Azure Data Lake, Azure Synapse, and Azure Data Factory.
Experience with ETL/ELT processes, data warehousing, and building data lakes.
Strong SQL skills and familiarity with NoSQL databases.
Experience with CI/CD pipelines and version control systems (e.g. Git).
Knowledge of cloud security best practices.
Soft Skills:
Excellent communication skills with the ability to explain complex technical concepts to non-technical stakeholders.
Strong problem-solving abilities with a proactive approach to issue resolution.
Proven leadership skills with the ability to manage and mentor data engineering teams.
Nice-to-Have Skills:
Experience with Power BI for dashboarding and reporting.
Familiarity with Microsoft Fabric for analytics and integration.
Experience with Spark Streaming for real-time data processing.
Knowledge of Azure Resource Manager (ARM) templates and Infrastructure as Code (IaC) practices.
Experience:
12 years of experience in developing data ingestion and transformation pipelines using Databricks, Synapse notebooks, and Azure Data Factory.
Hands-on experience with Delta tables, Delta Lake, and Azure Data Lake Storage Gen2.
Experience using Auto Loader and Delta Live Tables for efficient data ingestion and transformation.
Proficiency in building and optimizing query layers using Databricks SQL.
Experience integrating Databricks with Azure Synapse, ADLS Gen2, and Power BI for end-to-end analytics solutions.
Proven experience in developing, optimizing, and deploying Power BI reports.
Familiarity with modern CI/CD practices in Databricks and cloud-native environments.
Recruitment Partner:
This position is exclusively managed by Sperton, a global talent partner connecting high-performing professionals with leading organizations worldwide.