Leidos is seeking a Software Architect with Kafka expertise to be part of the mission solution and help lead SSA's Digital Modernization Strategy. Join one of our high-performing teams responsible for building the next-generation enterprise APIs and modern, responsive user interfaces supporting the Social Security Administration (SSA) and its mission to meet the changing needs of the public, positively impacting at least 65 million American lives per month. We are a team of forward-looking professionals in need of a strong candidate with these key required skills: Kafka architecture, Event-Driven Architecture (EDA), data streaming (Apache Flink, Kafka Streams, ksqlDB, Kafka Connect, real-time data processing), and modern development (Java with Spring Boot, or Python).
If this sounds like a mission you want to be a part of, keep reading!
TEAM CULTURE
Your passion and values might be a good fit for our teams if you answer yes to the following questions:
- Are you looking for a company that puts employees first, with a focus on career flexibility and well-being?
- Do you enjoy collaborating with colleagues and teammates, and believe that the best ideas are fostered in an inclusive environment?
- Are you searching for a team with a strong sense of ownership, urgency, and drive for daily mission success?
- Are you comfortable with proactive outward communication and technical leadership?
- Do you enjoy being a catalyst, solving complex problems, and providing innovative solutions?
- Do you have the flexibility, creativity, and resilience to pivot the mission for success?
- Do you have the courage to make tough ethical decisions with pride, transparency, and respect?
MENTORSHIP & CAREER GROWTH
Our teams are dedicated to supporting new team members in an environment that celebrates knowledge sharing and mentorship. Experienced team members will be assigned to new hires for one-on-one mentoring, collaborative reviews, and coaching on customer engagement to help each new hire successfully onboard and demonstrate their skills. Projects and tasks are assigned in a way that leverages your strengths and will help you further develop your skillset.
DAY TO DAY RESPONSIBILITIES
Every position is more rewarding when you know the "why" behind your work and how it makes a difference to those who need it most. If your passion is enabling life-changing service to those around you, this is the place for you. Find your passion in a team environment where all members are valued regardless of contractor or employee status. Find your "why" with us and take your place in our Leidos family!
- Architect, design, code, and implement a next-generation data streaming and event-based architecture/platform using system/software engineering best practices and the latest technologies: Confluent Kafka, Apache Flink, Kafka Connect, and modern software development.
- Work alongside customers to determine expanded use of Kafka within the Agency.
- Strategize within Leidos to set up opportunities to explore new technologies to use with Kafka.
- Define the strategy for streaming data to the data warehouse and for integrating event-based architecture with microservice-based applications.
- Establish Kafka best practices and standards for implementing the Kafka platform based on identified use cases and required integration patterns.
- Mentor existing team members by imparting expert knowledge to build a high-performing team around our event-driven architecture. Assist developers in choosing correct patterns and event modelling, and in ensuring data integrity.
- Provide software expertise in one or more of these areas: application integration, enterprise services, service-oriented architecture (SOA), security, business process management/business rules processing, data streaming/event-driven design, or data ingestion/data modeling.
- Triage, investigate, and advise in a hands-on capacity to resolve platform issues, regardless of component.
- Influence the development of data stream solutions that impact strategic project/program goals and business results. Recommend and develop new technical solutions, products, and/or standards in support of the function's strategy and operations. Lead and manage the work of other technical staff that has significant impact on project results/outputs.
- Brief management, the customer, the team, or vendors, in writing or orally, at the appropriate technical level for the audience.
- All other duties as assigned or directed.
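The event-based platform work above leans on one core Kafka idea: records with the same key always land on the same partition, which is what preserves per-key ordering in an event-driven design. The sketch below illustrates that routing in plain Python; it is not Kafka's actual partitioner (which hashes keys with murmur2), and the topic, keys, and partition count are hypothetical.

```python
# Illustrative sketch only: Kafka's default partitioner hashes the record key
# and takes it modulo the partition count, so every event for one key lands on
# one partition and keeps its order. This stand-in uses hashlib instead of
# Kafka's murmur2; the partition count and event keys below are made up.
import hashlib

NUM_PARTITIONS = 6  # assumed partition count for a hypothetical topic

def partition_for(key: str, num_partitions: int = NUM_PARTITIONS) -> int:
    """Map a record key to a partition deterministically."""
    digest = hashlib.sha256(key.encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

# Events for the same key always route to the same partition.
events = [("case-123", "claim-opened"),
          ("case-456", "claim-opened"),
          ("case-123", "claim-approved")]
routed = [(key, partition_for(key), payload) for key, payload in events]
```

Because "claim-opened" and "claim-approved" for `case-123` share a partition, any single consumer of that partition sees them in order, with no cross-partition coordination needed.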
FOUNDATION FOR SUCCESS
- Must be able to obtain and maintain a Public Trust clearance (contract requirement).
- Bachelor's degree in computer science, mathematics, engineering, or a related field with 12 years of relevant experience, or a master's degree with 10 years of relevant experience. Additional years of experience may be substituted/accepted in lieu of a degree.
- 12 years of experience with modern software development, including systems/application analysis, design, and implementation.
- 3 years of hands-on production experience with Kafka (Confluent Kafka and/or Apache Kafka), data streaming, and Event-Driven Architecture (EDA), including practical expertise in designing and implementing solutions.
- 2 years of combined experience designing, architecting, and deploying scalable solutions on the AWS cloud platform, including infrastructure automation and optimization.
- 1 year of experience leading a technical team.
*** Selected candidate must be willing to work on-site in Woodlawn, MD, 5 days a week.
FACTORS TO HELP YOU SHINE (Required Skills)
These skills will help you succeed in this position:
- Expert experience with Confluent Kafka, with hands-on production experience in capacity planning, installation, and administration/platform management, and a deep understanding of Kafka architecture and internals (e.g., brokers, producers, consumers, and partitions).
- Expert in Kafka cluster and application security, including authentication, authorization, and encryption.
- Strong knowledge and experience with Event-Driven Architecture (EDA), including designing and implementing EDA solutions for large-scale systems.
- Expert experience with data pipelines, data replication, and/or performance optimization, including real-time analytics and data pipeline tuning.
- Kafka installation and partitioning on OpenShift or Kubernetes, topic management, and oversight of Kafka infrastructure to ensure High Availability (HA) solutions are architected and implemented correctly.
- Strong knowledge and application of microservice design principles and best practices, including distributed systems, bounded contexts, service-to-service integration patterns, resiliency, security, networking, API design and management, and/or load balancing in large mission-critical infrastructure.
- Expert experience with Kafka Connect, Kafka Streams (KStreams), and ksqlDB (KSQL), with the ability to use them effectively for different use cases, including designing, writing, and operationalizing new Kafka connectors.
- Solid experience with data serialization using Avro and JSON, and with data compression techniques.
- Experience with AWS services such as ECS, EKS, Flink, Amazon RDS for PostgreSQL, and S3, and with serverless computing (e.g., AWS Lambda) for event-driven workflows.
- Basic knowledge of relational databases (PostgreSQL, DB2, or Oracle), SQL, and ORM technologies (JPA 2, Hibernate, and/or Spring Data JPA).
- Experience monitoring and troubleshooting Kafka performance issues, including setting up dashboards and alerts for Kafka metrics using open-source visualization platforms like Grafana and monitoring systems like Prometheus.
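The Kafka Streams and ksqlDB skills above center on stateful stream processing: a topology such as `stream.groupByKey().count()` keeps a key-to-count state store and emits an updated count on every incoming record. The sketch below shows that aggregation idea in plain Python; it is a stand-in for illustration, not the Kafka Streams API, and the event stream is hypothetical.

```python
# Illustrative sketch, not Kafka Streams itself: emulate the behavior of a
# groupByKey().count() topology, which maintains per-key state and emits an
# updated count for each record. The record stream below is made up.
from collections import defaultdict

def count_by_key(stream):
    """Consume (key, value) records and yield (key, running_count) updates."""
    state = defaultdict(int)  # stands in for a Kafka Streams state store
    for key, _value in stream:
        state[key] += 1
        yield key, state[key]

records = [("benefit-update", "e1"),
           ("address-change", "e2"),
           ("benefit-update", "e3")]
updates = list(count_by_key(records))
# updates == [("benefit-update", 1), ("address-change", 1), ("benefit-update", 2)]
```

Note that the aggregation emits a changelog, one update per input record, rather than a single final table; downstream consumers see every intermediate count, which is the same continuous-update model Kafka Streams uses.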
HOW TO STAND OUT FROM THE CROWD (Desired Skills)
Showcase your knowledge of modern development through the following experience or skills:
- Creating a disaster recovery strategy, including designing and implementing disaster recovery solutions for Kafka and related systems.
- Experience with Domain-Driven Design (DDD), including implementing DDD in large-scale distributed systems.
- AWS cloud certifications demonstrating expertise in architecting and deploying solutions on AWS.
- Continuous Integration/Continuous Delivery (CI/CD) best practices, including experience with GitOps workflows and using DevOps practices to accelerate quality releases to production.
- PaaS using Red Hat OpenShift/Kubernetes and Docker containers with hands-on experience in containerization and orchestration.
- Experience with configuration management tools such as Ansible, CloudFormation, and/or Terraform for infrastructure automation.
- Solid experience with the Spring Framework, including Spring Boot, Batch, Cloud, Security, and Data.
- Solid knowledge of Java EE, Java generics, and concurrent programming.
- Solid experience with automated unit testing, TDD, BDD, and associated technologies (e.g., JUnit, Mockito, Cucumber, Selenium, and Karma/Jasmine).
- Working knowledge of open-source visualization platform Grafana and open-source monitoring system Prometheus including their use with Kafka for monitoring and alerting.
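The Grafana/Prometheus monitoring mentioned above typically alerts on consumer lag, the core Kafka health metric: per partition, the log-end offset minus the consumer group's committed offset. The sketch below computes it in plain Python; the offset numbers are made up for illustration, where in production they would come from the broker's admin API and be exported to Prometheus for Grafana dashboards.

```python
# Illustrative sketch: consumer lag per partition is
#   (log-end offset) - (committed offset).
# A partition missing from the committed map is treated as offset 0
# (a consumer group that has not committed yet). Offsets are hypothetical.

def consumer_lag(end_offsets, committed_offsets):
    """Return (per-partition lag dict, total lag across partitions)."""
    lag = {p: end_offsets[p] - committed_offsets.get(p, 0)
           for p in end_offsets}
    return lag, sum(lag.values())

end = {0: 1_000, 1: 1_500, 2: 900}        # latest offset written per partition
committed = {0: 1_000, 1: 1_480, 2: 850}  # last offsets the group committed
per_partition, total = consumer_lag(end, committed)
# per_partition == {0: 0, 1: 20, 2: 50}; total == 70
```

A sustained, growing total is the usual alert condition: it means consumers are falling behind producers, whether from slow processing, a stalled consumer, or partition skew like partition 2 above.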
Are you a U.S. citizen or a permanent U.S. resident who thinks you might fit? We recommend you apply and start the conversation today! Join us in supporting our SSA contracts in Woodlawn, Maryland.
ITSSCII
If you're looking for comfort, keep scrolling. At Leidos we outthink, outbuild, and outpace the status quo because the mission demands it. We're not hiring followers. We're recruiting the ones who disrupt, provoke, and refuse to fail. Step 10 is ancient history. We're already at step 30 and moving faster than anyone else dares.
Original Posting:
February 13, 2026
For U.S. Positions: While subject to change based on business needs Leidos reasonably anticipates that this job requisition will remain open for at least 3 days with an anticipated close date of no earlier than 3 days after the original posting date as listed above.
Pay Range:
Pay Range: $131,300.00 - $237,350.00
The Leidos pay range for this job level is a general guideline only and not a guarantee of compensation or salary. Additional factors considered in extending an offer include (but are not limited to) responsibilities of the job, education, experience, knowledge, skills, and abilities, as well as internal equity, alignment with market data, applicable bargaining agreement (if any), or other law.