Data Engineer-Kafka
Important Information
Location: Kuala Lumpur
Experience: 5 years
Job Mode: Contract
Work Mode: On-site
Job Summary
We are seeking a highly skilled Data Engineer with expertise in Apache Kafka to design, build, and maintain real-time data streaming platforms and pipelines. The ideal candidate will have strong experience in distributed systems and data pipeline development, and a deep understanding of Kafka's architecture and ecosystem. They will work closely with cross-functional teams to ensure scalable, high-performance, and reliable data flow across our organization.
Responsibilities & Duties
- Design, develop, and maintain real-time data pipelines using Apache Kafka and related technologies.
- Build robust, scalable, and fault-tolerant streaming data platforms.
- Configure and manage Kafka clusters, topics, partitions, producers, and consumers.
- Integrate Kafka with various data sources and sinks (databases, APIs, cloud storage, etc.).
- Ensure data quality, availability, and consistency across streaming platforms.
- Monitor, tune, and optimize the performance and throughput of Kafka pipelines.
- Work with DevOps and infrastructure teams to automate deployment and scaling of Kafka environments.
- Implement data governance, security, and compliance policies for streaming systems.
- Collaborate with data scientists, analysts, and engineers to deliver high-impact data solutions.
- Troubleshoot production issues related to Kafka pipelines and implement preventative measures.
Qualifications & Skills
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 5 years of experience in data engineering or backend development.
- Strong hands-on experience with Apache Kafka, Kafka Streams, Kafka Connect, and Schema Registry.
- Proficient in programming languages such as Java, Scala, or Python.
- Deep understanding of distributed systems and stream processing architectures.
- Experience with messaging systems and event-driven architectures.
- Strong understanding of SQL and NoSQL databases.
- Experience with CI/CD, version control (Git), and cloud infrastructure (AWS, Azure, or GCP).
About Encora
Encora is the preferred digital engineering and modernization partner of some of the world's leading enterprises and digital-native companies. With over 9,000 experts in 47 offices and innovation labs worldwide, Encora's technology practices include Product Engineering & Development, Cloud Services, Quality Engineering, DevSecOps, Data & Analytics, Digital Experience, Cybersecurity, and AI & LLM Engineering.
At Encora, we hire professionals based solely on their skills and qualifications, and we do not discriminate based on age, disability, religion, gender, sexual orientation, socioeconomic status, or nationality.