Enterprise Architect (Kafka)
McLean, VA (Onsite) | Contract | $70/hour
Architect, implement, and optimize enterprise-grade data streaming solutions using Apache Kafka and Confluent Platform for large-scale, real-time environments.
A leading technology services organization is seeking an experienced Enterprise Architect (Kafka) to design, deploy, and manage high-performance, event-driven data architectures. The ideal candidate will bring deep expertise in the Kafka ecosystem, Confluent Cloud, and cloud-based integrations, with a proven ability to lead migration and optimization initiatives across complex enterprise systems.
This is a 100% onsite position based in McLean, VA.
Position Overview
The Enterprise Architect will lead the design and delivery of scalable, Kafka-based data solutions that support real-time processing, analytics, and system integration across multiple business platforms. This role requires extensive hands-on experience with Kafka infrastructure and advanced Confluent components, along with close collaboration with cross-functional engineering and DevOps teams to ensure security, resilience, and performance.
Key Responsibilities
Architect and deploy Kafka-based data streaming solutions using Confluent Platform and Confluent Cloud.
Design Kafka clusters, brokers, topics, and partitions optimized for performance, scalability, and reliability.
Lead migration efforts from legacy messaging systems (e.g., IBM MQ, TIBCO) to Kafka-based frameworks.
Develop and fine-tune Kafka Streams, Kafka Connect, ksqlDB, and Flink pipelines.
Implement Kafka security, including RBAC, ACLs, encryption, and multi-cluster access control.
Integrate Kafka with AWS, GCP, and Azure services, as well as CI/CD and data management tools.
Collaborate with engineering, data, and application teams on enterprise integration initiatives.
Monitor and optimize Kafka environments using Grafana, Prometheus, and Confluent Control Center.
Provide architectural governance and best practices for event-driven application design.
Requirements
Required Qualifications
Bachelor's degree in Computer Science, Information Systems, or a related field.
10 years of experience in software or data architecture, with a deep focus on Kafka and Confluent Platform.
Proficiency in Apache Kafka, Confluent Cloud, Kafka Connect, Kafka Streams, ksqlDB, ZooKeeper, and KRaft.
Experience integrating Kafka with cloud platforms (AWS, Azure, GCP) and CI/CD pipelines.
Familiarity with scripting (Shell, Python) and Infrastructure-as-Code tools (Terraform).
Understanding of data platforms and integration frameworks such as Snowflake, Databricks, and Hadoop.
Excellent analytical and problem-solving skills, with strong communication abilities.
Preferred Experience & Skills
Background in financial services or credit union technology environments.
Experience with Salesforce Community Cloud or other enterprise integration tools.
Strong grasp of application integration architecture and distributed data systems.
Required Skills:
Proven expertise in HPC platform architecture and administration.
Strong proficiency in Linux system administration and Azure fundamentals.
Hands-on experience with Slurm workload management and Kubernetes orchestration.
Experience managing or supporting Posit Workbench and related tools.
Excellent analytical and problem-solving skills for diagnosing complex technical issues.
Strong communication and collaboration skills to work across cross-functional teams.
Required Education:
Bachelor's degree in Computer Science, Engineering, or a related field.