We are seeking a seasoned Solutions Architect with 15-20 years of experience in designing and delivering enterprise-grade data and application solutions. The ideal candidate will have strong expertise across data engineering, cloud platforms (AWS), and full-stack development using Java (Spring Boot, microservices) and Python (including PySpark). This role demands a strategic mindset combined with hands-on capabilities to lead architectural efforts across cloud-native data and application ecosystems.
Key Responsibilities:
- Design and implement end-to-end, cloud-native solutions leveraging AWS, Java, and Python technologies.
- Lead architecture and development of data pipelines using PySpark and AWS Glue, and of backend systems using Java (Spring Boot) and microservices.
- Collaborate with stakeholders to understand business requirements and translate them into scalable, secure, and high-performance technical solutions.
- Architect data lakes, data warehouses, and real-time streaming platforms on AWS using services such as S3, Redshift, Glue, EMR, Lambda, and Kinesis.
- Design and build microservices-based architectures using Java, Spring Boot, and containerization technologies (Docker, Kubernetes).
- Drive best practices for CI/CD, DevOps, and infrastructure-as-code (Terraform, CloudFormation).
- Lead design reviews, architecture evaluations, and performance tuning across systems.
- Provide technical leadership, mentorship, and architectural governance to engineering teams.
- Ensure solutions adhere to security, compliance, scalability, and cost-efficiency standards.
Requirements
Required Skills and Experience:
- 15-20 years of experience in enterprise software architecture, with deep expertise in data engineering, cloud solutions, and backend development.
- Strong hands-on skills in Python, PySpark, and Java (Spring Boot, microservices).
- Proven experience designing solutions on AWS (e.g., S3, Glue, Lambda, Redshift, EMR, Step Functions, Kinesis).
- Strong understanding of ETL/ELT pipelines, real-time data processing, and streaming architectures.
- Experience in designing and deploying microservices and RESTful APIs using Java.
- Solid understanding of data modeling, data partitioning, performance tuning, and security best practices.
- Experience with containerization (Docker) and orchestration platforms like Kubernetes.
- Strong knowledge of CI/CD tools, DevOps pipelines, and infrastructure automation.
- Excellent communication and interpersonal skills; ability to engage with technical and business stakeholders.
Benefits
Standard Company Benefits
Education
Bachelor's or Master's degree in Computer Science, Engineering, Mathematics, or a related field.