Job Role: Kafka Admin
Job Location: Philadelphia, PA (100% Onsite)
Job Type: Contract
Roles and Responsibilities
- 8 years of hands-on experience with Kafka/Confluent in production.
- Strong expertise with:
  - SSL/TLS end-to-end configuration in Kafka ecosystems
  - RBAC authorization configuration and operational administration
  - Designing for HA/redundancy and scaling for growth
  - Monitoring/alerting with Prometheus & Grafana, plus operational tooling such as New Relic
  - Performance testing and tuning (producers/consumers, brokers, Connect infrastructure)
- Demonstrated experience implementing:
  - Confluent Oracle Premium CDC Connector
  - Confluent sink connector to ADLS Gen2 (Azure Data Lake Storage Gen2)
- Proficiency with Azure DevOps, Git, and building CI/CD pipelines.
- Working knowledge of Apache Flink and hands-on experience writing Kafka Streams applications.
Key Responsibilities:
Security & Access Control
- Configure end-to-end SSL/TLS across Kafka/Confluent components and client integrations (see the illustrative client sketch after this group).
- Implement and manage RBAC for authorizations, service accounts, and least-privilege access.
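
As a rough illustration of the client-facing side of this work, a minimal Java sketch of SSL/TLS client configuration; the broker address, keystore/truststore paths, and passwords are hypothetical placeholders, and real values would come from the environment's PKI and secrets management:

    import java.util.Properties;

    import org.apache.kafka.clients.CommonClientConfigs;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.config.SslConfigs;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class TlsClientSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            // Hypothetical bootstrap address; 9093 is a common convention for a TLS listener.
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka-broker-1.example.com:9093");
            props.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "SSL");
            // Truststore holds the cluster CA; keystore holds the client certificate for mutual TLS.
            props.put(SslConfigs.SSL_TRUSTSTORE_LOCATION_CONFIG, "/etc/kafka/secrets/client.truststore.jks");
            props.put(SslConfigs.SSL_TRUSTSTORE_PASSWORD_CONFIG, "truststore-password");
            props.put(SslConfigs.SSL_KEYSTORE_LOCATION_CONFIG, "/etc/kafka/secrets/client.keystore.jks");
            props.put(SslConfigs.SSL_KEYSTORE_PASSWORD_CONFIG, "keystore-password");
            props.put(SslConfigs.SSL_KEY_PASSWORD_CONFIG, "key-password");
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

            // Send one record over the encrypted connection to verify the TLS handshake end to end.
            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                producer.send(new ProducerRecord<>("tls-smoke-test", "key", "value"));
                producer.flush();
            }
        }
    }
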
High Availability, Redundancy & Failover
- Configure core components for redundancy and failover resilience (brokers/controllers, Connect, Schema Registry, etc.); a topic-provisioning sketch follows this group.
- Design and implement a Kafka disaster recovery (DR) cluster, including replication strategy, failover testing, and runbooks aligned to RPO/RTO targets.
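
A minimal sketch of redundancy-aware topic provisioning with the Kafka AdminClient; the bootstrap address and topic name are hypothetical, and replication factor 3 with min.insync.replicas=2 is shown as a common starting point rather than a mandated standard:

    import java.util.List;
    import java.util.Map;
    import java.util.Properties;

    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.clients.admin.NewTopic;

    public class HaTopicSketch {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            // Hypothetical bootstrap address.
            props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka-broker-1.example.com:9093");

            try (AdminClient admin = AdminClient.create(props)) {
                // 6 partitions, replication factor 3: tolerates the loss of one broker (or one rack,
                // with rack awareness) while min.insync.replicas=2 keeps acks=all writes durable.
                NewTopic topic = new NewTopic("orders.events", 6, (short) 3)
                        .configs(Map.of("min.insync.replicas", "2"));
                admin.createTopics(List.of(topic)).all().get();
            }
        }
    }
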
Scale & Future Growth
- Plan and implement platform scalability for future growth (topic/partition strategy, retention, throughput, capacity planning); a retention-tuning sketch follows this group.
- Establish sustainable operational practices for multi-team usage and governance.
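
One small example of the retention/capacity levers involved, shown as an AdminClient call that raises a topic's retention; the topic name and the 7-day value are hypothetical, and the right figure depends on consumer SLAs and disk capacity:

    import java.util.Collection;
    import java.util.List;
    import java.util.Map;
    import java.util.Properties;

    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.clients.admin.AlterConfigOp;
    import org.apache.kafka.clients.admin.ConfigEntry;
    import org.apache.kafka.common.config.ConfigResource;

    public class RetentionTuningSketch {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            // Hypothetical bootstrap address.
            props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka-broker-1.example.com:9093");

            try (AdminClient admin = AdminClient.create(props)) {
                ConfigResource topic = new ConfigResource(ConfigResource.Type.TOPIC, "orders.events");
                // Set retention.ms to 7 days for this topic.
                AlterConfigOp setRetention = new AlterConfigOp(
                        new ConfigEntry("retention.ms", String.valueOf(7L * 24 * 60 * 60 * 1000)),
                        AlterConfigOp.OpType.SET);
                Map<ConfigResource, Collection<AlterConfigOp>> updates =
                        Map.of(topic, List.of(setRetention));
                admin.incrementalAlterConfigs(updates).all().get();
            }
        }
    }
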
Monitoring, Alerting & Operations
- Set up monitoring and alerting for streaming message flow and platform health using Prometheus & Grafana.
- Integrate New Relic dashboards/alerts to support operational visibility, incident response, and service health metrics.
Performance Engineering
- Perform performance testing and tune Kafka/Confluent components for optimal throughput, latency, and stability.
- Troubleshoot complex production issues across brokers, networking, storage, Connect, and client workloads.
Connectors & Data Integration
- Implement and support the Confluent Oracle Premium CDC Connector (configuration, offsets, schema evolution, error handling, operations).
- Implement and support the Confluent sink connector to ADLS Gen2 (Azure Data Lake Storage Gen2) with reliable delivery and partitioning strategies; a connector-submission sketch follows this group.
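
For illustration, a sketch of submitting a sink connector configuration through the Kafka Connect REST API (PUT /connectors/{name}/config) from Java; the Connect host, connector name, and every property value shown are placeholders, and the actual connector.class and settings for the Confluent ADLS Gen2 sink (or Oracle CDC source) come from the Confluent documentation:

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class ConnectorSubmitSketch {
        public static void main(String[] args) throws Exception {
            // Placeholder config body: substitute the real connector class and connection
            // properties documented for the specific Confluent connector in use.
            String config = """
                    {
                      "connector.class": "<ADLS Gen2 sink connector class from Confluent Hub>",
                      "tasks.max": "2",
                      "topics": "orders.events",
                      "flush.size": "1000"
                    }
                    """;

            // PUT /connectors/{name}/config creates the connector if absent or updates it in place.
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("http://connect.example.com:8083/connectors/adls2-orders-sink/config"))
                    .header("Content-Type", "application/json")
                    .PUT(HttpRequest.BodyPublishers.ofString(config))
                    .build();

            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.statusCode() + " " + response.body());
        }
    }
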
Streaming Development
- Build and support stream processing with Apache Flink (job configuration, deployment patterns, operationalization).
- Develop Kafka Streams applications (topology design, state stores, exactly-once processing guarantees as needed); a minimal topology sketch follows this group.
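
A minimal Kafka Streams sketch, assuming hypothetical topic names and application id, showing a simple filter topology configured with the exactly-once-v2 processing guarantee:

    import java.util.Properties;

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;

    public class StreamsSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "orders-enrichment"); // hypothetical app id
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka-broker-1.example.com:9093"); // hypothetical
            props.put(StreamsConfig.PROCESSING_GUARANTEE_CONFIG, StreamsConfig.EXACTLY_ONCE_V2);
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();
            // Read raw events, drop empty payloads, and write the cleaned stream to an output topic.
            KStream<String, String> raw = builder.stream("orders.raw");
            raw.filter((key, value) -> value != null && !value.isBlank())
               .to("orders.clean");

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
            streams.start();
        }
    }
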
DevOps & Automation
- Use Azure DevOps with Git integration for version control, reviews, and change management.
- Deploy and manage cloud resources using Terraform and Ansible.
- Build and maintain CI/CD pipelines for platform configuration, connectors, and streaming jobs across environments.
Cost Allocation
- Support chargeback/showback calculations for Kafka usage (e.g., throughput, storage, partitions, connector/resource utilization) and related reporting; a simple allocation sketch follows this group.
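
A simple sketch of the kind of chargeback/showback arithmetic implied here, assuming hypothetical tenant usage figures and a purely throughput-proportional model; real metering would aggregate several drivers (throughput, storage, partitions, connector tasks) from cluster metrics:

    import java.util.Map;

    public class ChargebackSketch {
        public static void main(String[] args) {
            // Hypothetical monthly platform cost and per-tenant bytes-in, e.g. aggregated from broker metrics.
            double monthlyPlatformCost = 10_000.0;
            Map<String, Double> bytesInPerTenant = Map.of(
                    "payments", 6.0e12,
                    "orders", 3.0e12,
                    "analytics", 1.0e12);

            double totalBytes = bytesInPerTenant.values().stream().mapToDouble(Double::doubleValue).sum();

            // Allocate cost in proportion to each tenant's share of throughput.
            bytesInPerTenant.forEach((tenant, bytes) -> {
                double share = bytes / totalBytes;
                System.out.printf("%s: %.1f%% of usage -> $%.2f%n", tenant, share * 100, share * monthlyPlatformCost);
            });
        }
    }
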
Preferred Qualifications
- Experience implementing and testing Kafka DR cluster architectures and operational runbooks.
- Familiarity with enterprise governance patterns (multi-tenancy, naming standards, quotas, schema governance).
- Experience defining usage metering to enable reliable chargeback/showback.
Additional Expectations
- Proficiency with the Linux CLI (preferably RHEL/SLES).
- Ability to participate in an on-call rotation and provide timely incident support (if required).
- Strong documentation and stakeholder communication skills across engineering, operations, and product teams.
- Ability to create design patterns/templates, provide development support, and conduct knowledge transfer sessions for onboarding new use cases and modernizing existing ones.
Thanks & Regards
akhil