Kafka Streaming Platform Engineer

Employer Active

Job Location

Amsterdam - Netherlands

Monthly Salary

Not Disclosed

Vacancy

1 Vacancy

Job Description

Job Overview:

We are seeking an experienced Kafka Streaming Platform Engineer to design, develop, and manage real-time data streaming solutions using Apache Kafka. This role is essential in building and optimizing data pipelines, ensuring system scalability, and integrating Kafka with various systems. The engineer will play a key role in maintaining and optimizing Kafka clusters and in working with Kafka Connect, Kafka Streams, and other associated technologies.

You will collaborate with cross-functional teams, including data architects, DBAs, and application developers, and leverage your experience with DevOps practices (CI/CD, Docker, Kubernetes) for automated deployments and infrastructure management. Strong problem-solving skills and a solid understanding of event-driven architectures are essential for this role.


Key Responsibilities:

1. Data Pipeline Development

  • Design, develop, and maintain data pipelines using Apache Kafka, including real-time streaming applications.

  • Build highly scalable data solutions that can handle large volumes of data in real time.
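
To give a concrete flavour of this responsibility, here is a minimal sketch (not a prescribed implementation) of the producing side of a real-time pipeline in Java, assuming a local broker at localhost:9092 and a hypothetical topic named orders-events:

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class OrderEventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");            // assumption: local dev broker
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());
        props.put("acks", "all");                                     // favour durability
        props.put("enable.idempotence", "true");                      // avoid duplicates on retry

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Key by order id so all events for one order land on the same partition.
            producer.send(new ProducerRecord<>("orders-events", "order-42", "{\"status\":\"CREATED\"}"));
            producer.flush();
        }
    }
}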

2. Kafka Cluster Management

  • Install, configure, monitor, and optimize Kafka clusters, ensuring high availability and efficient resource utilization.

  • Manage Kafka Connect and ensure smooth integration with external systems (databases, messaging systems, etc.).
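
As an illustration of routine cluster administration, a hedged sketch using the Kafka AdminClient API; the topic name and partition/replication sizing are assumptions for the example:

import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.NewTopic;

public class ClusterAdmin {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // assumption: local dev broker

        try (AdminClient admin = AdminClient.create(props)) {
            // Basic availability signal: how many brokers are currently registered.
            int brokerCount = admin.describeCluster().nodes().get().size();
            System.out.println("Brokers online: " + brokerCount);

            // Create a topic with explicit sizing (6 partitions, replication factor 3).
            NewTopic topic = new NewTopic("payments-events", 6, (short) 3);
            admin.createTopics(List.of(topic)).all().get();
        }
    }
}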

3. Event-Driven Architecture Implementation

  • Design and implement event-driven architectures using Kafka Streams, Kafka Connect, KSQL, and other Kafka-related technologies.

  • Leverage pub/sub patterns to enable real-time event processing and improve system responsiveness.
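
By way of example, a minimal Kafka Streams topology sketch for an event-driven flow; the application id and the topic names payments-raw and payments-failed are illustrative assumptions:

import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class PaymentEventsTopology {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "payment-events-app");  // assumption
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");   // assumption
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> raw = builder.stream("payments-raw");
        raw.filter((key, value) -> value != null && value.contains("\"status\":\"FAILED\""))
           .to("payments-failed");  // downstream services subscribe and react to these events

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}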

4. Integration with Other Systems

  • Integrate Kafka with various databases (relational, NoSQL), applications, and other data processing platforms.

  • Utilize Change Data Capture (CDC) techniques with Kafka Connect, Debezium, or custom connectors to enable real-time data sync across platforms.
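
For instance, a CDC source can be enabled by registering a Debezium connector through the Kafka Connect REST API; the Connect URL, database details, and connector name below are illustrative assumptions, and exact property names depend on the Debezium version in use:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RegisterCdcConnector {
    public static void main(String[] args) throws Exception {
        // Hypothetical Debezium MySQL source capturing the shop.orders table.
        String connector = """
            {
              "name": "orders-cdc",
              "config": {
                "connector.class": "io.debezium.connector.mysql.MySqlConnector",
                "database.hostname": "mysql.internal",
                "database.port": "3306",
                "database.user": "cdc_user",
                "database.password": "change-me",
                "database.server.id": "5400",
                "topic.prefix": "orders-db",
                "table.include.list": "shop.orders",
                "schema.history.internal.kafka.bootstrap.servers": "localhost:9092",
                "schema.history.internal.kafka.topic": "schema-history.orders"
              }
            }""";

        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create("http://localhost:8083/connectors"))   // assumption: local Connect worker
            .header("Content-Type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(connector))
            .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}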

5. Performance Optimization

  • Ensure data ingestion and transformation pipelines are optimized for performance, reliability, and scalability.

  • Continuously monitor and improve Kafka cluster performance, minimizing latency and maximizing throughput.
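
As a sketch of the kind of tuning involved, a few common producer-side settings that trade a small amount of latency for higher throughput; the values are starting-point assumptions to be validated against cluster metrics, not recommendations:

import java.util.Properties;

public class ThroughputTunedProducerConfig {
    static Properties props() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");     // assumption: local dev broker
        props.put("compression.type", "lz4");                  // fewer bytes per record on the wire
        props.put("linger.ms", "20");                          // wait briefly so batches fill up
        props.put("batch.size", String.valueOf(128 * 1024));   // larger per-partition batches
        props.put("acks", "all");                               // keep durability while tuning
        return props;
    }
}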

6. Troubleshooting and Monitoring

  • Proactively monitor Kafka clusters to identify and resolve any performance or availability issues.

  • Troubleshoot complex data streaming issues and ensure high uptime and system stability.
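
One illustrative monitoring check, computing consumer lag for a single group by comparing committed offsets with the latest end offsets via the AdminClient API; the group id payments-app is a hypothetical example:

import java.util.Map;
import java.util.Properties;
import java.util.stream.Collectors;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.ListOffsetsResult;
import org.apache.kafka.clients.admin.OffsetSpec;
import org.apache.kafka.clients.consumer.OffsetAndMetadata;
import org.apache.kafka.common.TopicPartition;

public class ConsumerLagCheck {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // assumption: local dev broker

        try (AdminClient admin = AdminClient.create(props)) {
            // Offsets the group has committed, per partition.
            Map<TopicPartition, OffsetAndMetadata> committed =
                admin.listConsumerGroupOffsets("payments-app")
                     .partitionsToOffsetAndMetadata().get();

            // Latest end offsets for the same partitions.
            Map<TopicPartition, OffsetSpec> latestSpec = committed.keySet().stream()
                .collect(Collectors.toMap(tp -> tp, tp -> OffsetSpec.latest()));
            Map<TopicPartition, ListOffsetsResult.ListOffsetsResultInfo> latest =
                admin.listOffsets(latestSpec).all().get();

            // Lag = end offset minus committed offset; sustained growth is an alert signal.
            committed.forEach((tp, meta) ->
                System.out.println(tp + " lag=" + (latest.get(tp).offset() - meta.offset())));
        }
    }
}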

7. Code Quality and Best Practices

  • Apply software engineering best practices such as object-oriented programming (OOP), TDD, and design patterns to ensure code maintainability and quality.

  • Write efficient, clean, and well-documented code to meet project requirements.

8. DevOps Practices

  • Utilize DevOps methodologies (CI/CD, Docker, Kubernetes) for automated deployments and infrastructure management.

  • Automate infrastructure provisioning, Kafka cluster scaling, and deployment workflows.

9. Documentation

  • Create and maintain technical documentation, including data flow diagrams, design documents, and operational procedures.

  • Ensure that all processes and architecture are well-documented and easily understood by the team.

10. Collaboration and Stakeholder Communication

  • Collaborate with cross-functional teams, including data architects, DBAs, and application developers, to align on project goals and technical solutions.

  • Provide technical support and guidance to team members, ensuring high-quality implementations.



Requirements

Required Skills and Experience:

  • 8+ years of experience with Apache Kafka, including expertise in Kafka Connect, Kafka Streams, and Kafka brokers.

  • Strong proficiency in Java, with experience in object-oriented programming (OOP).

  • Shell scripting and Python experience (desirable).

  • Solid understanding of event-driven architectures, pub/sub patterns, and real-time data processing.

  • Database knowledge: Experience with MongoDB, relational databases, or other data storage solutions.

  • Experience with cloud platforms such as AWS, Azure, or GCP.

  • Solid experience with DevOps practices, including CI/CD, Docker, and Kubernetes.

  • Strong troubleshooting and problem-solving skills, with an ability to identify and resolve complex issues in Kafka clusters.

  • Schema registry experience and familiarity with Kafka-related tools (e.g., Confluent Schema Registry, Control Center) are a plus.

  • Experience with Change Data Capture (CDC), Debezium, or similar tools is highly desirable.



Employment Type

Full Time

