About the Role
We are looking for a skilled Data Engineer with strong expertise in Google Cloud Platform (GCP). In this role, you will play a key part in building, optimizing, and maintaining data solutions that support advanced analytics, machine learning, and data-driven decision-making. Fluency in Cantonese is a strong advantage and will help you collaborate more effectively with local stakeholders.
Key Responsibilities
Collaborate with Data Analysts and Data Scientists to understand requirements and design efficient data flows, pipelines, and interactive reports.
Work closely with stakeholders to understand how data is used across different teams and propose improvements.
Design and implement cloud-based architecture and deployment processes on GCP.
Build and maintain data pipelines, transformations, and metadata to support business needs.
Create solutions for relational and dimensional data models aligned with platform capabilities.
Develop, test, and optimize big data solutions to ensure scalability and performance.
Monitor and maintain the production environment, ensuring data quality, reliability, and integrity.
Lead initiatives to improve data quality, governance, security, and compliance.
Requirements
Proven experience as a Data Engineer with expertise in GCP services (e.g. BigQuery, Dataflow, Pub/Sub, Dataproc, Cloud Storage).
Strong knowledge of data architecture, modeling, and ETL/ELT processes.
Hands-on experience with big data frameworks and modern data tools.
Fluency in Cantonese (spoken and written) is a strong advantage and will help you collaborate effectively with local stakeholders.
Strong communication and collaboration skills.
Familiarity with machine learning, AI, and advanced analytics is a plus.
Qualifications:
5 years of experience in Data Engineering.
Strong proficiency in Python and Apache Spark.
Hands-on experience designing and implementing ETL/ELT processes and data pipelines.
Solid expertise in SQL scripting and query optimization.
Experience with Snowflake or other modern cloud data platforms.
Background in cloud data technologies and tools with exposure to:
Data processing frameworks (Spark, Hadoop, Apache Beam, Dataproc, or similar)
Data warehouses (BigQuery, Redshift, or equivalent)
Real-time streaming pipelines (Kinesis, Kafka, or similar)
Batch and serverless data processing
Strong analytical skills with the ability to work with both structured and unstructured data.
Experience in leading IT projects and managing stakeholder expectations.
Additional Information:
Discover some of the global benefits that empower our people to become the best version of themselves:
- Finance: Competitive salary package, share plan, company performance bonuses, value-based recognition awards, referral bonus;
- Career Development: Career coaching, global career opportunities, non-linear career paths, internal development programmes for management and technical leadership;
- Learning Opportunities: Complex projects, rotations, internal tech communities, training, certifications, coaching, online learning platform subscriptions, pass-it-on sessions, workshops, conferences;
- Work-Life Balance: Hybrid work and flexible working hours, employee assistance programme;
- Health: Global internal wellbeing programme, access to wellbeing apps;
- Community: Global internal tech communities, hobby clubs and interest groups, inclusion and diversity programmes, events and celebrations.
At Endava, we're committed to creating an open, inclusive, and respectful environment where everyone feels safe, valued, and empowered to be their best. We welcome applications from people of all backgrounds, experiences, and perspectives, because we know that inclusive teams help us deliver smarter, more innovative solutions for our customers. Hiring decisions are based on merit, skills, qualifications, and potential. If you need adjustments or support during the recruitment process, please let us know.
Remote Work:
No
Employment Type:
Full-time