*Location: Sweden or Amsterdam. Onsite 2-3 days/week*
*40 hours per week*
*English speaking only*
As a data and analytics team within the customer domain, we need data engineering support to transform existing data into data products that enable the data mesh. The work will include ingestion and transformation pipelines, data quality checks, modelling, and the setup of proper access management. This assignment will also involve stakeholder management with product and software engineering, documentation to ensure discoverability, process improvement, and a review of our current setup regarding tech stack, best practices, and security.
The scope of the consultant services is to assist in:
Building and improving data products, and organising our team's work together with other data engineers and data stewards based in both Amsterdam and Sweden. For this role, specific technical skills are crucial, but interpersonal skills are also very important.
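To give a concrete picture of the pipeline work in scope, here is a minimal sketch of a data-quality check of the kind that could run as one step in an ingestion/transformation pipeline (for example inside a Cloud Run job). It assumes the google-cloud-bigquery client library; all project, dataset, table, and column names are hypothetical.

```python
# Minimal sketch of a data-quality check on a data product table.
# Project, dataset, table, and column names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="customer-domain-prod")  # hypothetical project

# Count rows in today's partition that are missing a customer_id.
query = """
    SELECT COUNT(*) AS bad_rows
    FROM `customer-domain-prod.orders_product.orders`
    WHERE DATE(_PARTITIONTIME) = CURRENT_DATE()
      AND customer_id IS NULL
"""

bad_rows = next(iter(client.query(query).result())).bad_rows
if bad_rows:
    # Fail the pipeline step so downstream consumers of the data product
    # never see records that violate the contract.
    raise ValueError(f"Quality check failed: {bad_rows} rows missing customer_id")
```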
Requirements:
- Data products and data mesh knowledge: experience with practical implementation, from levels and schemas to data contracts and data access management
- BigQuery: advanced experience, ideally with in-depth Analytics Hub experience for data access management (see the sketch after this list)
- Communication and interpersonal skills to help advocate for the data mesh, but also to guide colleagues technically
- GCP in general: especially tools used for data pipelines, such as Cloud Run (jobs) and Workflows; Dataflow is a big plus
- CI/CD and DataOps: GitHub Actions, Observability/Monitoring, IaC with Terraform
- dbt: for data modelling and transformation pipelines
- SQL: advanced skills for pipeline optimisation
- Stakeholder management (software engineers, POs)
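As a small illustration of the data access management side mentioned above, the sketch below grants a consumer group read access on a BigQuery dataset backing a data product; in practice, Analytics Hub listings or data contracts would formalise such grants. Dataset id, project id, and group email are hypothetical.

```python
# Minimal sketch of dataset-level access management for a BigQuery-backed
# data product; project, dataset, and group names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="customer-domain-prod")  # hypothetical project
dataset = client.get_dataset("customer-domain-prod.orders_product")

# Append a read grant for a consumer group to the existing access entries.
entries = list(dataset.access_entries)
entries.append(
    bigquery.AccessEntry(
        role="READER",
        entity_type="groupByEmail",
        entity_id="orders-consumers@example.com",  # hypothetical consumer group
    )
)
dataset.access_entries = entries

# Update only the access_entries field; other dataset properties are untouched.
client.update_dataset(dataset, ["access_entries"])
```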
Most important:
- Data products and data mesh knowledge
- BigQuery
- Communication and interpersonal skills