Kafka Streaming Platform Engineer (ID: 3089)

Experience

8-10 years

Job Location

Amsterdam - Netherlands

Monthly Salary

Not Disclosed

Vacancy

1 Vacancy

Job Description

As a Kafka Streaming Platform Engineer you will:

  • Design, develop, and maintain real-time data pipelines using Apache Kafka, including Kafka Streams, Kafka Connect, and other related tools.
  • Manage and optimize Kafka clusters: install, configure, monitor, and ensure high availability and performance.
  • Implement event-driven architectures using Kafka, KSQL, and pub/sub models.
  • Integrate Kafka with diverse systems such as relational databases, MongoDB, and modern data platforms.
  • Ensure scalability, reliability, and performance in data ingestion and transformation pipelines.
  • Develop and maintain Change Data Capture (CDC) pipelines using Kafka Connect, Debezium, or custom connectors.
  • Apply best software engineering practices (OOP, TDD, design patterns) to ensure high-quality, maintainable code.
  • Collaborate closely with data architects, DBAs, developers, and DevOps teams across the delivery lifecycle.
  • Utilize DevOps tools (CI/CD, Docker, Kubernetes) for infrastructure automation and deployment.
  • Document technical designs, data flow diagrams, and operational procedures clearly and effectively.
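To give a concrete flavour of the CDC work described above: a Debezium-based pipeline is usually configured declaratively and registered with Kafka Connect. The sketch below is illustrative only — the connector name, hostnames, credentials, database, and table are placeholders, not details from this role:

```json
{
  "name": "inventory-cdc",
  "config": {
    "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
    "database.hostname": "db.example.internal",
    "database.port": "5432",
    "database.user": "cdc_user",
    "database.password": "<secret>",
    "database.dbname": "inventory",
    "topic.prefix": "inventory",
    "table.include.list": "public.orders",
    "plugin.name": "pgoutput"
  }
}
```

Registered via the Kafka Connect REST API, a connector like this streams row-level changes from `public.orders` into a Kafka topic (here `inventory.public.orders`), where downstream Kafka Streams or KSQL applications can consume them.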
What You Bring to the Table:
  • 8-10 years of professional experience in backend engineering and data streaming, with at least 5 years specifically in Apache Kafka.
  • Deep expertise in Kafka Connect, Kafka Streams, and managing Kafka brokers.
  • Strong programming skills in Java and familiarity with object-oriented programming.
  • Scripting experience in Shell or Python is a strong plus.
  • Solid understanding of event-driven architecture, CDC, and pub/sub messaging patterns.
  • Experience with schema registries, relational databases, and NoSQL databases such as MongoDB.
  • Exposure to cloud platforms (AWS, Azure, or GCP) and containerized deployments.
  • A track record of working in Agile environments and cross-functional teams.
You should possess the ability to:
  • Architect and implement efficient and scalable data streaming solutions.
  • Troubleshoot complex Kafka issues and proactively ensure system stability.
  • Work collaboratively with development, data, and operations teams.
  • Maintain a strong focus on code quality, performance, and documentation.
  • Communicate effectively and adapt to changing priorities and environments.
What We Bring to the Table:
  • The opportunity to contribute to critical, high-performance data streaming solutions in a modern tech stack.
  • A collaborative team culture where innovation, reliability, and continuous learning are valued.
  • A supportive team and access to continuous learning and development programs.


Let's Connect

Want to discuss this opportunity in more detail? Feel free to reach out.

Recruiter: Mary Jesima
Phone: ; Extn: 132
Email:
LinkedIn:

Employment Type

Full Time
