About the Role
We're looking for experienced data engineers, at both senior and lead level, with strong expertise in cloud-based data solutions, ideally within Google Cloud Platform (GCP).
In this role you'll be responsible for building, optimizing, and maintaining scalable data pipelines and architectures that support analytics, machine learning, and data-driven decision-making across the organization.
While GCP experience is preferred, we also welcome candidates with AWS or Azure backgrounds who are keen to expand their hands-on experience with GCP.
Fluency in Cantonese is a nice-to-have that will help in collaborating effectively with local stakeholders, but it is not a requirement.
Key Responsibilities
- Collaborate with data analysts and data scientists to design efficient data flows, pipelines, and reporting solutions.
- Work closely with business stakeholders to understand data usage and identify areas for improvement.
- Design and implement cloud-based architecture and deployment processes on GCP (or other cloud platforms).
- Build and maintain data pipelines, transformations, and metadata to support business needs.
- Develop, test, and optimize big data solutions for scalability, performance, and reliability.
- Create and manage relational and dimensional data models aligned with platform capabilities.
- Monitor and maintain the production environment, ensuring data quality, integrity, and governance.
- Lead or contribute to initiatives improving data quality, governance, security, and compliance.
Requirements
- Proven experience as a Data Engineer, ideally with GCP services (e.g. BigQuery, Dataflow, Pub/Sub, Dataproc, Cloud Storage).
- Experience with AWS or Azure is highly valued, especially for those eager to become more hands-on with GCP.
- Strong understanding of data architecture, modeling, and ETL/ELT processes.
- Hands-on experience with big data frameworks and modern data tools.
- Excellent communication and collaboration skills across technical and business teams.
- Familiarity with machine learning, AI, or advanced analytics is a plus.
- Cantonese proficiency is a nice-to-have, but not mandatory.
Qualifications:
- 5 years of experience in data engineering.
- Strong proficiency in Python and Apache Spark.
- Hands-on experience designing and implementing ETL/ELT processes and data pipelines.
- Solid expertise in SQL scripting and query optimization.
- Experience with Snowflake or other modern cloud data platforms.
- Background in cloud data technologies and tools, with exposure to:
  - Data processing frameworks (Spark, Hadoop, Apache Beam, Dataproc, or similar)
  - Data warehouses (BigQuery, Redshift, or equivalent)
  - Real-time streaming pipelines (Kinesis, Kafka, or similar)
  - Batch and serverless data processing
- Strong analytical skills, with the ability to work with both structured and unstructured data.
- Experience in leading IT projects and managing stakeholder expectations.
Additional Information:
Discover some of the global benefits that empower our people to become the best version of themselves:
- Finance: Competitive salary package, share plan, company performance bonuses, value-based recognition awards, referral bonus;
- Career Development: Career coaching, global career opportunities, non-linear career paths, internal development programmes for management and technical leadership;
- Learning Opportunities: Complex projects, rotations, internal tech communities, training, certifications, coaching, online learning platform subscriptions, pass-it-on sessions, workshops, conferences;
- Work-Life Balance: Hybrid work and flexible working hours, employee assistance programme;
- Health: Global internal wellbeing programme, access to wellbeing apps;
- Community: Global internal tech communities, hobby clubs and interest groups, inclusion and diversity programmes, events and celebrations.
At Endava, we're committed to creating an open, inclusive, and respectful environment where everyone feels safe, valued, and empowered to be their best. We welcome applications from people of all backgrounds, experiences, and perspectives, because we know that inclusive teams help us deliver smarter, more innovative solutions for our customers. Hiring decisions are based on merit, skills, qualifications, and potential. If you need adjustments or support during the recruitment process, please let us know.
Remote Work:
No
Employment Type:
Full-time