Role Overview
The Confluent Kafka Lead / Python Developer is responsible for designing, building, and operating enterprise-grade event streaming solutions using Confluent Kafka while developing Python-based producers, consumers, and streaming applications. The role blends deep hands-on development with technical leadership, ensuring scalable, reliable, and secure real-time data flows across distributed systems.
The position plays a key role in event-driven architecture (EDA), data platform modernization, and real-time analytics initiatives.
Job location: Columbus, OH or Nashville, TN (onsite at either location)
Key Responsibilities
Kafka & Streaming Platform Leadership
- Lead the design and implementation of enterprise Kafka and Confluent Platform solutions (Kafka, Schema Registry, Connect, ksqlDB).
- Define and enforce standards for topic design, partitioning, retention, and schema evolution.
- Act as technical owner for Kafka clusters across dev, test, and production environments.
- Drive best practices for high availability, fault tolerance, and scalability.
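The partitioning standards above hinge on key-based routing: records with the same key always land on the same partition, which is what preserves per-key ordering. A minimal sketch of the idea — note that Kafka's default partitioner actually uses murmur2 hashing; md5 here is only a deterministic stand-in for illustration:

```python
import hashlib

def partition_for(key: bytes, num_partitions: int) -> int:
    """Illustrative key-to-partition mapping: same key -> same partition.

    Kafka's default partitioner uses murmur2 hashing; md5 is just a
    deterministic stand-in for this sketch.
    """
    digest = hashlib.md5(key).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

# Records sharing a key route to one partition, preserving their order.
p1 = partition_for(b"customer-42", 12)
p2 = partition_for(b"customer-42", 12)
assert p1 == p2
```

This is also why the partition count chosen in topic design matters: changing it remaps keys, so per-key ordering guarantees only hold within a fixed partition count.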
Python Development (Producers & Consumers)
- Design and develop Python-based Kafka producers and consumers using the Confluent Kafka Python APIs.
- Build event-driven microservices and streaming applications in Python.
- Implement message serialization and schema validation (Avro, JSON, Protobuf).
- Handle idempotency, retries, back-pressure, and error-handling patterns.
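One common way the idempotency and retry responsibilities above play out in a Python consumer: remember processed message IDs so redelivered messages become no-ops, retry transient handler failures with backoff, and dead-letter messages that keep failing. A library-free sketch under assumed semantics — the message shape, handler, and in-memory seen-set are all hypothetical (production code would persist them):

```python
import time

class IdempotentProcessor:
    """Sketch: make at-least-once consumption effectively idempotent.

    Processed message IDs are remembered so a redelivered message is
    skipped; transient handler errors are retried with linear backoff,
    and messages that exhaust retries go to a dead-letter list.
    (In production the seen-set and dead letters live in durable stores.)
    """

    def __init__(self, handler, max_retries=3, backoff_s=0.0):
        self.handler = handler
        self.max_retries = max_retries
        self.backoff_s = backoff_s
        self.seen = set()        # IDs of successfully processed messages
        self.dead_letter = []    # (msg_id, payload) pairs that kept failing

    def process(self, msg_id, payload):
        if msg_id in self.seen:  # duplicate delivery: no-op
            return "skipped"
        for attempt in range(1, self.max_retries + 1):
            try:
                self.handler(payload)
            except Exception:
                if attempt == self.max_retries:
                    self.dead_letter.append((msg_id, payload))
                    return "dead-lettered"
                time.sleep(self.backoff_s * attempt)  # linear backoff
            else:
                self.seen.add(msg_id)
                return "processed"
```

The same shape applies regardless of client library: commit the offset only after the handler succeeds or the message is safely dead-lettered.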
Event-Driven Architecture & Integration
- Design event-driven integration patterns bridging microservices, data stores, APIs, and third-party systems.
- Integrate Kafka with downstream consumers such as databases, data lakes, analytics platforms, and search systems.
- Support real-time pipelines for transactions, telemetry, customer events, and analytics.
- Collaborate with API, data, and application teams to align event contracts.
Security & Governance
- Implement Kafka security controls:
- TLS encryption
- SASL / OAuth authentication
- ACL-based authorization
- Enforce data governance, schema compatibility rules, and event ownership models.
- Ensure compliance with enterprise security and regulatory standards.
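The controls listed above typically surface as client configuration. A sketch of what a confluent-kafka (librdkafka) client config might look like for TLS plus SASL/OAUTHBEARER — the broker address, CA path, and identity-provider endpoint are placeholders, and exact settings depend on the cluster:

```python
# Hypothetical client security settings in librdkafka key/value form.
# All endpoint and file-path values below are placeholders.
secure_client_config = {
    "bootstrap.servers": "broker.example.com:9093",   # placeholder broker
    "security.protocol": "SASL_SSL",                  # TLS transport + SASL auth
    "ssl.ca.location": "/etc/ssl/certs/ca.pem",       # CA bundle for broker verification
    "sasl.mechanism": "OAUTHBEARER",
    "sasl.oauthbearer.method": "oidc",                # token fetched via OIDC
    "sasl.oauthbearer.token.endpoint.url": "https://idp.example.com/token",
}
```

The same dict would be passed to a `confluent_kafka.Producer` or `Consumer` constructor; ACL-based authorization is then enforced broker-side against the authenticated principal.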
DevOps, Automation & Observability
- Build and maintain CI/CD pipelines for Kafka-related applications and configurations.
- Use Infrastructure as Code to provision and manage Kafka infrastructure.
- Implement monitoring and alerting using tools such as Confluent Control Center, Prometheus, Grafana, or cloud-native equivalents.
- Troubleshoot production streaming issues related to latency, consumer lag, throughput, or data loss.
Technical Leadership
- Serve as Kafka subject matter expert and technical lead.
- Mentor developers on event-driven design and streaming best practices.
- Review designs and code for Kafka and Python-based streaming solutions.
- Partner with architects, SREs, and platform teams on roadmap and capacity planning.
Required Qualifications
Kafka & Streaming Experience
- 6-10 years of experience in software or data engineering.
- 4 years of hands-on experience with Apache Kafka and/or Confluent Platform.
- Strong knowledge of:
- Kafka internals (brokers, partitions, offsets, consumer groups)
- Schema Registry and schema evolution
- Kafka Connect architectures and connectors
Python Development
- Strong proficiency in Python for backend and streaming development.
- Experience building production-grade services using Python frameworks and libraries.
- Familiarity with async processing, multithreading, or stream processing patterns is a plus.
Cloud & DevOps
- Experience deploying Kafka and applications in cloud or hybrid environments (AWS, Azure, GCP).
- CI/CD pipeline experience (GitHub Actions, Jenkins, GitLab, Azure DevOps).
- Infrastructure-as-Code experience (Terraform, CloudFormation, ARM/Bicep).
- Containerization experience (Docker Kubernetes) preferred.
Preferred Qualifications
- Experience with ksqlDB, Kafka Streams, or stream processing frameworks (Flink, Spark Streaming).
- Exposure to event sourcing or CQRS patterns.
- Integration of Kafka with data lakes, warehouses, and analytics platforms.
- Confluent or cloud platform certifications.
- Experience supporting high-throughput, low-latency systems.
Soft Skills & Leadership
- Strong communication skills across engineering and stakeholder teams.
- Ability to translate business use cases into event-driven technical solutions.
- Comfortable acting as both hands-on developer and technical lead.
- Experience influencing architecture and standards across teams.