Experience: 8-10 years
Salary: Not Disclosed
Vacancies: 1
As a Kafka Streaming Platform Engineer, you will:
- Design, develop, and maintain real-time data pipelines using Apache Kafka, including Kafka Streams, Kafka Connect, and other related tools.
- Manage and optimize Kafka clusters: install, configure, monitor, and ensure high availability and performance.
- Implement event-driven architectures using Kafka, KSQL, and pub/sub models.
- Integrate Kafka with diverse systems such as relational databases, MongoDB, and modern data platforms.
- Ensure scalability, reliability, and performance in data ingestion and transformation pipelines.
- Develop and maintain Change Data Capture (CDC) pipelines using Kafka Connect, Debezium, or custom connectors.
- Apply best software engineering practices (OOP, TDD, design patterns) to ensure high-quality, maintainable code.
- Collaborate closely with data architects, DBAs, developers, and DevOps teams across the delivery lifecycle.
- Utilize DevOps tools (CI/CD, Docker, Kubernetes) for infrastructure automation and deployment.
- Document technical designs, data flow diagrams, and operational procedures clearly and effectively.

What You Bring to the Table:
- 8-10 years of professional experience in backend engineering and data streaming, with at least 5 years specifically in Apache Kafka.
- Deep expertise in Kafka Connect, Kafka Streams, and managing Kafka brokers.
- Strong programming skills in Java and familiarity with object-oriented programming.
- Scripting experience in Shell or Python is a strong plus.
- Solid understanding of event-driven architecture, CDC, and pub/sub messaging patterns.
- Experience with schema registries, relational databases, and NoSQL databases such as MongoDB.
- Exposure to cloud platforms (AWS, Azure, or GCP) and containerized deployments.
- A track record of working in Agile environments and cross-functional teams.

You should possess the ability to:
- Architect and implement efficient and scalable data streaming solutions.
- Troubleshoot complex Kafka issues and proactively ensure system stability.
- Work collaboratively with development, data, and operations teams.
- Maintain a strong focus on code quality, performance, and documentation.
- Communicate effectively and adapt to changing priorities and environments.

What We Bring to the Table:
- The opportunity to contribute to critical, high-performance data streaming solutions in a modern tech stack.
- A collaborative team culture where innovation, reliability, and continuous learning are valued.
- A supportive team and access to continuous learning and development programs.

Let's Connect
Want to discuss this opportunity in more detail? Feel free to reach out.
Recruiter: Mary Jesima
Phone: ; Extn: 132
Email:
LinkedIn:
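For context on the CDC tooling named in this role: a Debezium source connector is typically registered with Kafka Connect as a small JSON configuration. The sketch below assumes a PostgreSQL source; the connector name, hostname, credentials, and table list are illustrative placeholders, not details from this posting.

```json
{
  "name": "inventory-cdc-connector",
  "config": {
    "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
    "plugin.name": "pgoutput",
    "database.hostname": "postgres.example.internal",
    "database.port": "5432",
    "database.user": "cdc_user",
    "database.password": "********",
    "database.dbname": "inventory",
    "topic.prefix": "inventory",
    "table.include.list": "public.orders"
  }
}
```

Submitting a document like this to the Kafka Connect REST API (POST /connectors) starts a connector that streams row-level changes from the included tables into Kafka topics named under the topic prefix.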
Employment Type: Full Time