Please Note:
- It's a 100% onsite position.
- The selected candidate must be willing to work on-site in Woodlawn, MD, 5 days a week.
Key Required Skills:
- Confluent Kafka, Apache Flink, Kafka Connect, Python, Java, and Spring Boot.
Position Description:
- Lead and organize a team of Kafka administrators and developers, assign tasks, and facilitate weekly Kafka Technical Review meetings with the team.
- Work alongside customers to determine expanded use of Kafka within the Agency.
- Strategize within the organization to identify opportunities to explore new technologies to use with Kafka.
- Architect, design, code, and implement a next-generation data streaming and event-based architecture/platform on Confluent Kafka.
- Define the strategy for streaming data to the data warehouse and integrating event-based architecture with microservice-based applications.
- Establish Kafka best practices and standards for implementing the Kafka platform based on identified use cases and required integration patterns.
- Mentor existing team members by imparting expert knowledge to build a high-performing team in our event-driven architecture. Assist developers in choosing correct patterns, modeling events, and ensuring data integrity.
- Provide software expertise in one or more of these areas: application integration, enterprise services, service-oriented architectures (SOA), security, business process management/business rules processing, and data ingestion/data modeling.
- Triage, investigate, and advise in a hands-on capacity to resolve platform issues, regardless of component.
- Brief management, the customer, the team, or vendors in writing or orally at the appropriate technical level for the audience. Share up-to-date insights on the latest Kafka-based solutions, formulate creative approaches to address business challenges, present and host workshops with senior leaders, and translate technical jargon into layman's terms and vice versa.
- All other duties as assigned or directed.
Requirements
Skills Requirements:
Basic Qualifications
- Bachelor's degree in Computer Science, Mathematics, Engineering, or a related field with 12 years of relevant experience, OR a Master's degree with 10 years of relevant experience. Additional years of experience may be accepted in lieu of a degree.
- 12 years of experience with modern software development, including systems/application analysis and design.
- 7 years of combined experience with Kafka (Confluent Kafka and/or Apache Kafka).
- 2 years of combined experience designing, architecting, and deploying to the AWS cloud platform.
- 1 year of experience leading a technical team.
- Must be able to obtain and maintain a Public Trust security clearance.
Required Skills
- Expert experience with Confluent Kafka, with hands-on production experience in capacity planning, installation, and administration/platform management, and a deep understanding of Kafka architecture and internals.
- Expert in Kafka cluster and application security.
- Strong knowledge of Event-Driven Architecture (EDA).
- Expert experience in data pipelines, data replication, and/or performance optimization.
- Kafka installation and partitioning on OpenShift or Kubernetes, topic management, and HA & SLA architecture.
- Strong knowledge and application of microservice design principles and best practices: distributed systems, bounded contexts, service-to-service integration patterns, resiliency, security, networking, and/or load balancing in large mission-critical infrastructure.
- Expert experience with Kafka Connect, KStreams, and KSQL, with the ability to use each effectively for different use cases.
- Hands-on experience scaling Kafka infrastructure, including Broker, Connect, ZooKeeper, Schema Registry, and/or Control Center.
- Hands-on experience designing, writing, and operationalizing new Kafka Connectors.
- Solid experience with data serialization using Avro and JSON, and with data compression techniques.
- Experience with AWS services such as ECS, EKS, Flink, Amazon RDS for PostgreSQL, and/or S3.
- Basic knowledge of relational databases (PostgreSQL, DB2, or Oracle), SQL, and ORM technologies (JPA2, Hibernate, and/or Spring JPA).
Desired Skills
- Disaster recovery strategy.
- Domain-Driven Design (DDD).
- AWS cloud certifications.
- Continuous Delivery (CI/CD) best practices and use of DevOps to accelerate quality releases to production.
- PaaS using Red Hat OpenShift/Kubernetes and Docker containers.
- Experience with configuration management tools (Ansible, CloudFormation, and/or Terraform).
- Solid experience with the Spring Framework (Boot, Batch, Cloud, Security, and Data).
- Solid knowledge of Java EE, Java generics, and concurrent programming.
- Solid experience with automated unit testing, TDD, BDD, and associated technologies (JUnit, Mockito, Cucumber, Selenium, and Karma/Jasmine).
- Working knowledge of the open-source visualization platform Grafana and the open-source monitoring system Prometheus, and their use with Kafka.
Keywords: Confluent Kafka, DB2, JSON, Hibernate
Education
Bachelor's degree in Computer Science, Information Technology, or a related field.