Salary Not Disclosed
1 Vacancy
As a Data Engineer you will report directly to the Head of Data and Analytics and will be responsible for the design and development of data models and data pipelines supporting the company's platform and apps, as well as providing data and reporting support for business end users.
What you'll be working on:
Design and develop the reporting data model and data transformation jobs, including the modelling of very large data sets.
Identify and implement the most efficient ways of performing data transformation tasks, using best-practice methods and tooling.
Prepare and maintain documentation such as business requirements documents, design specifications, and test cases.
Work with stakeholders (including the data team, software engineers, and the product team) to understand business requirements and translate these into technical specifications.
Lead the data migration and modelling process from GCP to the data warehouse.
Be responsible for data warehouse administration, user access, and security.
Contribute to the design and implementation of our data model and ETL framework.
What we're looking for:
Minimum 2 years of experience in a data engineering environment, with hands-on experience building and maintaining complex data environments in the cloud (preferably GCP BigQuery and/or Snowflake).
Extensive experience with SQL (Postgres preferred), with a core focus on analyzing and validating complex and disparate data sets to find gaps between datasets, requirements, and source systems.
Demonstrated understanding of and experience with the following data engineering competencies:
Data warehousing principles, including data architecture, modelling, database design, and performance optimization best practices.
Building group data assets and pipelines from scratch by integrating large quantities of data from disparate internal and external sources.
Supporting analytics solutions through to production, including deployment automation, orchestration, monitoring, and logging, preferably with an ETL tool such as Matillion, DBT, or equivalent.
Experience in deploying cloud infrastructure as code (IaC) using Terraform or similar.
Experience using Python to develop scripts and small programs for job orchestration and/or data manipulation.
Ability to interact with business end users to draw out and distil business requirements into data pipeline designs and reporting solutions.
Ability to prioritize on the fly and work in a high-performing, outcomes-focused environment with multiple competing and ambiguous deliverables.
Experience working in an Agile development environment.
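To give candidates a concrete sense of the SQL-validation and Python-scripting skills listed above, here is a minimal, purely illustrative Python sketch of the kind of dataset gap check described (comparing keys between a source extract and a warehouse table). All names and data here are hypothetical, not part of the company's actual stack.

```python
def find_key_gaps(source_rows, warehouse_rows, key="id"):
    """Return keys present in the source but missing from the warehouse,
    and keys in the warehouse not present in the source."""
    source_keys = {row[key] for row in source_rows}
    warehouse_keys = {row[key] for row in warehouse_rows}
    return {
        "missing_in_warehouse": sorted(source_keys - warehouse_keys),
        "unexpected_in_warehouse": sorted(warehouse_keys - source_keys),
    }


# Hypothetical sample data standing in for a source extract
# and the corresponding warehouse table.
source = [{"id": 1}, {"id": 2}, {"id": 3}]
warehouse = [{"id": 2}, {"id": 3}, {"id": 4}]

print(find_key_gaps(source, warehouse))
# {'missing_in_warehouse': [1], 'unexpected_in_warehouse': [4]}
```

In practice a check like this would typically be pushed down into SQL (e.g. an anti-join in BigQuery or Postgres) for large tables; the Python version is just a sketch of the reconciliation logic.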
Location: Makati or Iloilo
Work Arrangement: Hybrid/Remote, Dayshift
Full Time