Job Title: DataOps / DevOps Specialist
Location: Hybrid (Stockholm, Sweden; mix of office and remote work)
Start Date: ASAP
Utilization: 100%
Language: English (Swedish is a plus)
About the Client:
We are a progressive tech company powered by innovation and collaboration. From COBOL to AI, we offer one of the most diverse tech ecosystems in the Nordics. Together, we're building sustainable technology for the future in an inclusive, forward-thinking environment where every idea matters.
Are you ready to take on a new challenge and help shape our next-generation data platform?
About the Assignment:
Our client is modernizing its data platform to ensure a future-proof, compliant, and scalable foundation for its data-driven operations. The current on-premise data lake (based on Cloudera-supported Apache technologies such as Spark, NiFi, and Airflow) will be migrated to a modern, cloud-native architecture on Google Cloud Platform (GCP).
We are looking for a DataOps / DevOps Specialist who will play a key role in supporting this migration. You will be part of the DataOps DCA Tribe, contributing hands-on to architecture design, migration activities, automation, and continuous delivery pipelines, helping build a secure, efficient, and sustainable platform for the future.
Key Responsibilities:
- Support the migration of the on-premise data lake to Google Cloud Platform (GCP).
- Design, build, and maintain data pipelines and DevOps automation workflows.
- Implement containerized environments using Docker and Kubernetes.
- Apply Infrastructure as Code (IaC) principles using Terraform.
- Optimize data processing using Apache Spark, NiFi, and other Apache ecosystem tools.
- Collaborate with teams on CI/CD pipelines, monitoring, and observability improvements.
- Ensure compliance with regulatory and data security standards.
- Contribute to platform performance optimization and documentation.
Required Qualifications:
- 4-8 years of experience in DataOps or DevOps roles.
- Strong hands-on experience with:
- Apache Spark
- Docker and Kubernetes
- Infrastructure as Code (Terraform)
- Apache NiFi
- Python and SQL
- ETL processes and database management
- Cloudera ecosystem
- Scala
Meritorious Skills:
Cloud & Modern Stack:
- Google Cloud Platform (GCP), especially Dataproc
- Observability and monitoring tools (e.g., Prometheus, Kibana, Elasticsearch)
- CI/CD pipelines
- YAML, Linux, Windows
- Cloud event-streaming services (e.g., GCP Pub/Sub or Event Hubs)
- Command-line interface (CLI) proficiency
On-Premise Experience:
- Apache Airflow
- Data lake and legacy system management
Recruitment Partner: Sperton
This position is exclusively managed by Sperton, a global talent partner connecting high-performing professionals with leading organizations worldwide.