Data Engineer


Job Location: Jakarta, Indonesia
Monthly Salary: Not Disclosed
Posted on: 3 hours ago
Vacancies: 1 Vacancy

Job Summary

About the company

Geekhunter is hiring on behalf of our client, a recognised leader in cloud and digital innovation.

Job Responsibilities:

  • Define and implement Medallion Architecture (Bronze, Silver, and Gold zones), storage patterns, and schema evolution standards.
  • Design, build, and maintain scalable ETL/ELT pipelines using orchestration tools such as Airflow, Step Functions, or dbt.
  • Develop connectors for multi-source ingestion from ERP systems, APIs, flat files, and IoT streams.
  • Audit and optimize queries and transformation logic for performance and cost efficiency.
  • Implement governance measures, including automated validation, metadata cataloging, lineage tracking, and access controls.
  • Act as a technical consultant to business units, translating requirements into high-performance data deliverables.
  • Provide structured datasets for BI dashboards (Power BI/Tableau) and enable downstream AI/ML pipelines.

Job Requirements:

  • A Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related discipline.
  • Relevant cloud or platform certifications, such as Databricks Certified Data Engineer, Snowflake SnowPro Core, or AWS/Azure/GCP Data Engineer, are highly valued.
  • 3–7 years of experience in data engineering, with a focus on building and scaling cloud-native data platforms.
  • Strong expertise in at least one major cloud data stack (AWS, Azure, or GCP) and the ability to operate effectively in multi-cloud environments.
  • Proficiency in SQL and Python/PySpark.
  • Demonstrated experience integrating enterprise ERP data sources.
  • Solid knowledge of API development and integration (REST, GraphQL, Webhooks).
  • Strong understanding of data modelling techniques, including 3NF, Star Schema, Dimensional modelling, and Data Vault basics.
  • Ability to design platform-wide standards and lightweight governance frameworks (data quality checks, cataloguing).

Optional Capabilities

  • Expertise in Databricks (Unity Catalog, Delta Live Tables) or Snowflake (Snowpark, Data Sharing, Performance Tuning).
  • Familiarity with ecosystem tools such as dbt or BigQuery.
  • Experience with real-time data processing using Kafka, Kinesis, or Pub/Sub.

Soft Skills

  • A consultative mindset with a willingness to engage directly with customers to understand challenges and present solutions.
  • Commitment to in-office collaboration for effective teamwork.
  • Strong communication skills, with the ability to explain complex AI and data concepts to both technical and non-technical audiences.
  • A proactive “builder” mindset with ownership of platform standards.
  • Ability to bridge technical logic with business needs.
  • Resilience in navigating ambiguity and proposing best-practice solutions.
  • Willingness to work onsite in the Jakarta area.

Benefits:

  • Competitive Salary
  • BPJSK & TK
  • Private Health Insurance
  • THR
  • Transportation Lumpsum
  • Quarterly Incentives

Required Skills:

Databricks Certified Data Engineer, Snowflake SnowPro Core, AWS/Azure/GCP Data Engineer, data engineering
