Salary Not Disclosed
1 Vacancy
About the Role
We are looking for a Data Engineer to help build and manage robust data pipelines and platforms that power advanced analytics and AI solutions.
Key Responsibilities
Develop and maintain batch and streaming data pipelines using GCP services such as BigQuery, Dataflow, Dataproc, Composer, Dataform, and Cloud Functions.
Load, transform, and optimize data in BigQuery for analytics and reporting; a brief illustrative sketch follows this list.
Integrate data from multiple sources, including APIs, databases, and files.
Assist in data migration from legacy systems such as Oracle and MicroStrategy.
Ensure data quality, governance, and security compliance.
Collaborate with analysts and business teams to support reporting needs.
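For candidates less familiar with the stack, the day-to-day load-and-transform work described above often resembles the minimal Python sketch below, using the google-cloud-bigquery client. The project, dataset, table, and bucket names are hypothetical placeholders, not part of this role's actual environment.

from google.cloud import bigquery

# Hypothetical project, dataset, and table names for illustration only.
client = bigquery.Client(project="example-project")
table_id = "example-project.analytics.sales_raw"

# Batch-load a CSV export from Cloud Storage into BigQuery.
load_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
)
load_job = client.load_table_from_uri(
    "gs://example-bucket/exports/sales.csv", table_id, job_config=load_config
)
load_job.result()  # Wait for the load job to complete.

# Transform the raw data into a reporting table with SQL.
transform_sql = """
    CREATE OR REPLACE TABLE `example-project.analytics.sales_daily` AS
    SELECT DATE(order_ts) AS order_date, SUM(amount) AS total_amount
    FROM `example-project.analytics.sales_raw`
    GROUP BY order_date
"""
client.query(transform_sql).result()  # Wait for the query to finish.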
Requirements
3-5 years' experience in data engineering or ETL development.
Hands-on experience with the GCP Data Stack (BigQuery, Dataflow, Composer, Dataproc).
Solid SQL and Python skills.
Familiarity with Azure Data Stack is a plus.
Understanding of data modelling concepts and performance optimization.
Willingness to learn and work on large-scale migration projects.
Full Time