Salary: Not Disclosed
Vacancies: 1
Data Engineer - Snowflake
Job ID: V6W84V8R
Location: Greater Kuala Lumpur, Malaysia
Work Mode: Onsite
Job Type: Permanent
Our client is a global leader in sustainable building materials, generating multi-billion annual revenue with operations in over 40 countries and a workforce exceeding 12,000 employees. They are recognized for driving innovation in energy efficiency, fire safety, and circularity within the construction industry. With a strong international presence and a focus on digital transformation, they continue to lead in sustainable solutions that make cities healthier and safer.
We are seeking an experienced Senior Data Engineer to contribute to the development, operation, and optimization of robust data platforms and pipelines supporting AI/ML and analytics initiatives.
Responsibilities
Maintain and enhance scalable data pipelines (batch and streaming) for ingestion, transformation, and storage.
Write efficient Python and SQL code for ETL/ELT workflows (see the sketch after this list).
Monitor and troubleshoot data workflows, ensuring high availability and SLA adherence.
Build and maintain data models, data marts, and data warehouses.
Implement data validation and quality checks; monitor lineage and metadata.
Work with data stewards and scientists to define and enforce governance standards.
Optimize SQL queries, data transformations, and infrastructure performance.
Automate infrastructure provisioning using IaC tools and implement CI/CD pipelines.
Recommend architecture improvements to increase throughput and reduce cost.
Collaborate across technical and business teams and handle operational support tickets.
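As a rough illustration of the ETL/ELT work described above, here is a minimal sketch of an incremental load against Snowflake. It assumes the snowflake-connector-python package, credentials supplied via environment variables, and hypothetical table, warehouse, and database names (raw.raw_orders, fct_orders, TRANSFORM_WH, ANALYTICS); none of these come from the posting itself.

```python
import os

import snowflake.connector

# Connect using credentials from the environment (never hard-coded).
# The warehouse/database/schema names below are placeholders.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)

# Incremental upsert: fold the last day of raw orders into the fact table.
MERGE_SQL = """
MERGE INTO fct_orders AS tgt
USING (
    SELECT order_id, customer_id, order_ts, amount
    FROM raw.raw_orders
    WHERE order_ts >= DATEADD(day, -1, CURRENT_TIMESTAMP())
) AS src
ON tgt.order_id = src.order_id
WHEN MATCHED THEN UPDATE SET
    tgt.customer_id = src.customer_id,
    tgt.order_ts    = src.order_ts,
    tgt.amount      = src.amount
WHEN NOT MATCHED THEN INSERT (order_id, customer_id, order_ts, amount)
    VALUES (src.order_id, src.customer_id, src.order_ts, src.amount)
"""

try:
    cur = conn.cursor()
    cur.execute(MERGE_SQL)
    print(f"rows affected: {cur.rowcount}")  # cheap hook for SLA monitoring
finally:
    conn.close()
```

Using MERGE keeps the load idempotent: a retried run updates existing rows rather than duplicating them.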
Requirements
Must-have:
3 to 5 years of hands-on experience in data engineering.
Strong proficiency in SQL and at least one of Python, Scala, Java, or Snowpark.
Proven experience with Snowflake, dbt, and Azure data tools.
Familiarity with data lake, lakehouse, and data warehouse architectures.
Experience with Infrastructure as Code (Terraform, CloudFormation) and version control (Git/GitHub).
Solid understanding of CI/CD practices for data workloads (a test sketch follows this list).
Exposure to data quality, cataloging, and metadata tools.
Comfortable with operational tasks, incident handling, and platform reliability.
Strong problem-solving skills and attention to detail.
Clear communication with both technical teams and business stakeholders.
Degree in Computer Science, Information Systems, or a related field.
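To make "CI/CD practices for data workloads" concrete, below is a hedged sketch of data quality checks written as pytest tests, so a pipeline build fails fast when a data contract is broken. The table and column names (analytics.fct_orders, order_id, order_ts) and the fetch_scalar helper are hypothetical assumptions, not details from the posting.

```python
import os

import snowflake.connector


def fetch_scalar(sql: str):
    """Run a query that returns a single value (hypothetical helper)."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
    )
    try:
        return conn.cursor().execute(sql).fetchone()[0]
    finally:
        conn.close()


def test_fct_orders_has_no_null_keys():
    # A NULL primary key would corrupt downstream joins; fail the build.
    nulls = fetch_scalar(
        "SELECT COUNT(*) FROM analytics.fct_orders WHERE order_id IS NULL"
    )
    assert nulls == 0


def test_fct_orders_is_fresh():
    # Guard the SLA: the newest order must be under 24 hours old.
    lag = fetch_scalar(
        "SELECT DATEDIFF('hour', MAX(order_ts), CURRENT_TIMESTAMP())"
        " FROM analytics.fct_orders"
    )
    assert lag is not None and lag < 24
```

Run with pytest in the CI job; a red build then blocks deployment of the pipeline change.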
Nice-to-have:
Familiarity with Azure Functions, Azure Data Lake, and open table formats like Iceberg.
Experience with Jinja, Snowpipe, or LLM tools such as GitHub Copilot or ChatGPT.
Hands-on experience with tools such as Kafka, Spark, Airflow, Databricks, or Informatica.
Exposure to observability tools (Prometheus, Grafana, ELK stack).
Experience working in Scrum/Kanban environments using Jira.
International work exposure or experience in global teams.
Employment Type: Full Time