Responsibilities:
- Design, build, and maintain data pipelines on Google Cloud Platform (GCP) using services such as BigQuery, Dataplex, Dataflow/Dataproc, Pub/Sub, Cloud Functions, Cloud Run, Cloud Scheduler, and Cloud Workflows.
- Develop and implement data models using dbt (data build tool) to support analytical reporting and data science needs.
- Collaborate with data analysts and business stakeholders to understand data requirements and translate them into data models.
- Implement data quality checks and monitoring to ensure high quality of data in our data warehouse.
- Optimize data pipelines and data models for performance and cost efficiency.
- Document data models, communicate them to the wider team, and train colleagues to use them effectively.
- Good knowledge of Terraform.
- Experience in building CI/CD pipelines using GitHub Actions.
Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field, or equivalent experience.
- Proven experience as a Data Engineer or in a similar role.
- Strong experience with Google Cloud Platform (GCP) and its data services, such as BigQuery, Cloud Run, and Pub/Sub.
- Experience in data modelling and transformation using dbt.
- Knowledge of SQL and good experience with Python.
- Strong problem-solving skills and attention to detail.
- Excellent communication skills and the ability to explain complex topics in simple terms.
Additional Information:
Start: Immediate
Location: Stockholm, Sweden
Form of employment: Full-time until further notice; a 6-month probationary period applies.
We interview candidates on an ongoing basis, so do not wait to submit your application.
Note: Please apply only if you are currently residing in Sweden.
Remote Work: No
Employment Type: Full-time