ARHS Group - Part of Accenture is looking for a Kafka DevOps Engineer (m/f) to join its team at the client.
As a Kafka DevOps Engineer (m/f), you will be responsible for implementing and maintaining Kafka infrastructure to ensure high availability, scalability, and performance.
Key Responsibilities:
- Deploy, configure, monitor, and maintain Kafka clusters in a high-availability production environment
- Tune Kafka configurations, partitions, replication, and producers/consumers to ensure efficient message streaming
- Automate Kafka infrastructure deployment and management using Terraform, Ansible, or similar tools
- Implement robust monitoring solutions (e.g. Dynatrace) and troubleshoot performance bottlenecks, latency issues, and failures
- Ensure secure data transmission, access control, and compliance with security best practices (SSL/TLS, RBAC, Kerberos)
- Integrate Kafka with CI/CD pipelines and automate deployment processes to improve efficiency and reliability
- Analyze workloads and plan for horizontal scaling, resource optimization, and failover strategies
- Work closely with development teams to support Kafka-based applications and ensure seamless data flow
- Provide training and technical support to end users and other stakeholders
Your profile:
- 5 years of experience in DevOps, Site Reliability Engineering (SRE), or Kafka administration
- Strong hands-on experience with Apache Kafka (setup, tuning, and troubleshooting)
- Proficiency in scripting (Python, Bash) and automation tools (Terraform, Ansible)
- Experience with cloud environments (AWS, Azure, or GCP) and Kubernetes-based Kafka deployments
- Familiarity with Kafka Connect, KSQL, Schema Registry, and ZooKeeper
- Knowledge of logging and monitoring tools (Dynatrace, ELK, Splunk)
- Understanding of networking, security, and access control for Kafka clusters
- Experience with CI/CD tools (Jenkins, GitLab, ArgoCD)
- Ability to analyze logs, debug issues, and propose proactive improvements
- An ITIL qualification is an asset
- Experience with Confluent Kafka or other managed Kafka solutions
- Knowledge of event-driven architectures and stream processing (Flink, Spark, Kafka Streams)
- Experience with service mesh technologies (Istio, Linkerd) for Kafka networking is a plus
- Certifications in Kafka, Kubernetes, or cloud platforms are a plus
- Fluency in English (written and spoken) is required.
Remote Work:
No
Employment Type:
Full-time