$126,100 - $227,950
1 Vacancy
The Digital Modernization Sector has an opening for a Kafka Cloud Architect to work in Woodlawn, MD.
This position requires onsite work in Woodlawn, MD, five days a week.
Day to Day Responsibilities:
Lead and organize a team of Kafka administrators and developers, assign tasks, and facilitate weekly Kafka Technical Review meetings with the team.
Work alongside the customer to determine expanded use of Kafka within the Agency.
Strategize within Leidos to set up opportunities to explore new technologies for use with Kafka.
Architect, design, code, and implement a next-generation data streaming and event-based architecture/platform on Confluent Kafka.
Define the strategy for streaming data to the data warehouse and for integrating the event-based architecture with microservice-based applications.
Establish Kafka best practices and standards for implementing the Kafka platform based on identified use cases and required integration patterns.
Mentor existing team members by imparting expert knowledge to build a high-performing team around our event-driven architecture. Assist developers in choosing correct patterns, modeling events, and ensuring data integrity.
Provide software expertise in one or more of these areas: application integration, enterprise services, service-oriented architectures (SOA), security, business process management/business rules processing, and data ingestion/data modeling.
Triage, investigate, and advise in a hands-on capacity to resolve platform issues, regardless of component.
Brief management, the customer, the team, or vendors in writing or orally at the appropriate technical level for the audience. Share up-to-date insights on the latest Kafka-based solutions, formulate creative approaches to address business challenges, present and host workshops with senior leaders, and translate technical jargon into layman's terms and vice versa.
All other duties as assigned or directed.
Foundation for Success (Required Qualifications):
This experience is the foundation a candidate needs to be successful in this position:
Bachelor's degree in Computer Science, Mathematics, Engineering, or a related field with 12 years of relevant experience, OR a Master's degree with 10 years of relevant experience. Additional years of experience may be substituted/accepted in lieu of a degree.
12 years of experience with modern software development, including systems/application analysis and design.
7 years of combined experience with Kafka (one or more of the following: Confluent Kafka, Apache Kafka, and/or Amazon MSK).
2 years of combined experience designing, architecting, and deploying to the AWS cloud platform.
1 year of experience leading a technical team.
Must be able to obtain and maintain a Public Trust security clearance.
Factors to Help You Shine (Required Qualifications):
These skills will help you succeed in this position:
Expert experience with Confluent Kafka, including hands-on production experience, capacity planning, installation, administration/platform management, and a deep understanding of the Kafka architecture and internals.
Expert experience in Kafka cluster security, disaster recovery, data pipelines, data replication, and/or performance optimization.
Kafka installation and partitioning on OpenShift or Kubernetes, topic management, and HA and SLA architecture.
Strong knowledge and application of microservice design principles and best practices: distributed systems, bounded contexts, service-to-service integration patterns, resiliency, security, networking, and/or load balancing in large mission-critical infrastructure.
Expert experience with Kafka Connect, KStreams, and KSQL, with the ability to use each effectively for different use cases.
Hands-on experience with scaling Kafka infrastructure, including Broker, Connect, ZooKeeper, Schema Registry, and/or Control Center.
Hands-on experience designing, writing, and operationalizing new Kafka connectors.
Solid experience with data serialization using Avro and JSON, and with data compression techniques.
Experience with AWS services such as ECS, EKS, Flink, Amazon RDS for PostgreSQL, and/or S3.
Basic knowledge of relational databases (PostgreSQL, DB2, or Oracle), SQL, and ORM technologies (JPA2, Hibernate, and/or Spring JPA).
How to Stand Out from the Crowd (Desired Qualifications):
Showcase your knowledge of modern development using data streaming and event-based architecture through the following experience or skills:
AWS cloud certifications.
Continuous integration/continuous delivery (CI/CD) best practices and use of DevOps to accelerate quality releases to production.
PaaS using Red Hat OpenShift/Kubernetes and Docker containers.
Experience with configuration management tools (Ansible, CloudFormation, and/or Terraform).
Solid experience with the Spring Framework (Boot, Batch, Cloud, Security, and Data).
Solid knowledge of Java EE, Java generics, and concurrent programming.
Solid experience with automated unit testing, TDD, BDD, and associated technologies (JUnit, Mockito, Cucumber, Selenium, and Karma/Jasmine).
Working knowledge of the open-source visualization platform Grafana and the open-source monitoring system Prometheus, and their use with Kafka.
For U.S. Positions: While subject to change based on business needs, Leidos reasonably anticipates that this job requisition will remain open for at least 3 days, with an anticipated close date of no earlier than 3 days after the original posting date as listed above.
The Leidos pay range for this job level is a general guideline only and not a guarantee of compensation or salary. Additional factors considered in extending an offer include (but are not limited to) responsibilities of the job, education, experience, knowledge, skills, and abilities, as well as internal equity, alignment with market data, applicable bargaining agreement (if any), or other law.
Full-Time