We are looking for an experienced Backend Engineer to join a well-established development team working on large-scale event-driven systems.
Our client is an integrated shipping services company that has established itself as an independent carrier with a fresh and innovative approach to global logistics. The company operates a smart and efficient worldwide network, delivering reliable shipping services and ensuring stable operations for customers across the globe.
In this role you will design and build event-driven solutions using Kafka and related technologies, focusing on stream processing applications, data pipelines, and monitoring solutions built with the ELK stack.
Responsibilities:
- Design and develop event-driven architectures using Apache Kafka and Kafka Streams;
- Build and maintain stream processing applications and data pipelines;
- Design and implement ELK pipelines, indexes, dashboards, and alerts;
- Lead architecture discussions and contribute to the design of Kafka-based event streaming systems;
- Configure Kafka connectors and manage Kafka topics;
- Develop scalable event-based processing applications according to architecture and design specifications;
- Support development and QA teams throughout the software development lifecycle;
- Contribute to the monitoring, observability, and reliability of streaming platforms.
Requirements:
- 4 years of experience building Kafka-based applications;
- Strong experience with Kafka ecosystem components:
- Kafka Streams (including Processor API)
- Kafka Connect
- Schema Registry
- 4 years of experience with Java and Spring implementing Kafka Streams applications;
- Hands-on experience with the ELK stack (Elasticsearch, Logstash, Kibana) for monitoring and observability;
- Experience with the Logstash configuration language for defining pipelines (inputs, filters, outputs);
- Solid knowledge of SQL and Oracle PL/SQL, including query optimization;
- Experience working with REST APIs;
- Experience using Kafka CLI tools and REST APIs for managing Kafka resources;
- Experience working in Linux/Unix environments;
- Experience with IntelliJ IDEA;
- Experience contributing to DevOps processes for Kafka and ELK.
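For illustration only, a minimal Logstash pipeline of the kind this role involves — consuming events from a Kafka topic, normalizing timestamps, and indexing into Elasticsearch — might look like the sketch below. The broker address, topic name, field names, and index naming pattern are all hypothetical placeholders, not details from this posting.

```conf
input {
  kafka {
    bootstrap_servers => "localhost:9092"   # hypothetical broker address
    topics => ["app-logs"]                  # hypothetical topic name
    codec => "json"
  }
}

filter {
  # Parse an assumed ISO8601 "timestamp" field into @timestamp
  date {
    match => ["timestamp", "ISO8601"]
  }
  mutate {
    remove_field => ["timestamp"]
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "app-logs-%{+YYYY.MM.dd}"      # daily index, hypothetical naming
  }
}
```

In practice, pipelines like this are typically paired with Kibana dashboards and alerting rules over the resulting indexes, which is the monitoring work the responsibilities above describe.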
Nice to Have:
- Experience managing and maintaining ELK servers;
- Experience with Confluent Cloud (managed Kafka);
- Experience with Python;
- Exposure to AI/ML-related projects.
We offer*:
- Flexible working format: remote, office-based, or flexible
- A competitive salary and good compensation package
- Personalized career growth
- Professional development tools (mentorship program, tech talks and trainings, centers of excellence, and more)
- Active tech communities with regular knowledge sharing
- Education reimbursement
- Memorable anniversary presents
- Corporate events and team-building activities
- Other location-specific benefits
*not applicable for freelancers
Required Experience:
Senior IC