About Us
Capco, a Wipro company, is a global technology and management consulting firm. We were awarded Consultancy of the Year at the British Bank Awards and have been ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100 clients across the banking, financial, and energy sectors, and we are recognized for our deep transformation execution and delivery.
WHY JOIN CAPCO
You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers, and other key players in the industry. These are the projects that will transform the financial services industry.
MAKE AN IMPACT
We bring innovative thinking, delivery excellence, and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing the energy and financial services sectors.
#BEYOURSELFATWORK
Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.
CAREER ADVANCEMENT
With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.
DIVERSITY & INCLUSION
We believe that diversity of people and perspectives gives us a competitive advantage.
Job Title: Senior AWS Data Engineering Lead/SME
Location: Pune / Bangalore (preference: Pune)
Job Type: Full Time
Experience Level: Senior
Job Summary:
We are looking for a highly experienced and motivated Senior AWS Data Engineering Lead to join our growing team. This role blends deep AWS data engineering expertise with strong infrastructure and technical leadership skills. You'll be responsible for designing and delivering scalable, cloud-native data solutions while working closely with customers and internal teams. Ideal candidates are hands-on builders who excel in clean architecture, automation, and team coordination.
Key Responsibilities:
- This is a leadership role heading a growing team of AWS data engineers.
- Act as lead SME, providing hands-on guidance to AWS data engineering teams.
- Translate business and functional requirements into well-architected AWS technical solutions.
- Own end-to-end solution design, architecture, and delivery of data platforms and pipelines.
- Provide technical leadership to the team and drive adherence to best practices and standards.
- Design and implement robust, scalable data pipelines using AWS services such as Glue, S3, Lambda, Iceberg, Athena, SQS, and EventBridge (see the first sketch after this list).
- Develop large-scale data transformations using PySpark and Python, ensuring efficient processing and performance.
- Build infrastructure using Infrastructure as Code tools like Terraform and AWS CloudFormation (see the second sketch after this list).
- Implement and maintain CI/CD pipelines with tools such as Jenkins and GitLab CI/CD.
- Use Git for version control and manage collaborative code development workflows.
- Work directly with customers to understand their needs and translate them into technical deliverables.
- Coordinate across teams to ensure smooth delivery timelines and system integrations.
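
To give candidates a flavor of the day-to-day work, here is a minimal PySpark sketch of the kind of pipeline described above: reading raw data from S3, cleaning it, and writing partitioned Parquet for downstream querying (e.g. via Athena). The bucket names, paths, and columns are hypothetical placeholders, not details of any actual project.

```python
# Minimal PySpark sketch of an S3-to-S3 batch transformation.
# Bucket names, paths, and columns are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("orders-daily-transform")  # hypothetical job name
    .getOrCreate()
)

# Read raw JSON events landed in S3 (placeholder bucket/prefix).
raw = spark.read.json("s3://example-raw-bucket/orders/2024-01-01/")

# Clean and enrich: drop bad rows, normalize a timestamp, add a partition column.
orders = (
    raw.filter(F.col("order_id").isNotNull())
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("dt", F.to_date("order_ts"))
)

# Write partitioned Parquet for downstream querying (e.g. via Athena).
(
    orders.write
          .mode("append")
          .partitionBy("dt")
          .parquet("s3://example-curated-bucket/orders/")
)

spark.stop()
```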
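The role names Terraform and AWS CloudFormation; to keep all examples in Python, the Infrastructure as Code sketch below uses AWS CDK (which synthesizes CloudFormation) to stand in for those tools. The stack, bucket, and function names are assumptions for illustration only.

```python
# Minimal AWS CDK sketch (synthesizes CloudFormation) for a tiny data stack.
# Stack/bucket/function names are hypothetical; CDK stands in here for raw
# Terraform/CloudFormation purely to keep the example in Python.
from aws_cdk import App, Stack, aws_s3 as s3, aws_lambda as _lambda
from constructs import Construct

class DataPlatformStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Versioned S3 bucket for curated data (placeholder name).
        bucket = s3.Bucket(self, "CuratedBucket", versioned=True)

        # Small Lambda that could react to new objects or events.
        fn = _lambda.Function(
            self, "IngestFn",
            runtime=_lambda.Runtime.PYTHON_3_11,
            handler="index.handler",
            code=_lambda.Code.from_inline(
                "def handler(event, context):\n    return {'status': 'ok'}"
            ),
        )

        # Grant the function read access to the bucket.
        bucket.grant_read(fn)

app = App()
DataPlatformStack(app, "DataPlatformStack")
app.synth()
```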
Required Skills and Experience:
- 7-10 years of hands-on experience in AWS data engineering and solution delivery.
- Strong expertise with core AWS services, including Glue, S3, Lambda, Iceberg, Athena, SQS, and EventBridge.
- Advanced proficiency in Python, PySpark, and SQL for data processing and ETL workloads.
- Proven experience in ETL/ELT pipeline development, performance tuning, and large-scale data handling.
- In-depth knowledge of event-driven architectures and AWS messaging services (see the sketch after this list).
- Solid infrastructure knowledge and hands-on experience with Terraform and/or CloudFormation.
- Experience implementing CI/CD pipelines using Jenkins, GitLab, or similar tools.
- Familiarity with Git and collaborative version control workflows.
- Excellent communication skills with a proven ability to gather customer requirements and convert them into scalable technical solutions.
- Demonstrated experience leading and coordinating engineering teams in delivery-focused environments.
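
As a minimal illustration of the event-driven pattern referenced in the list above, this boto3 sketch publishes a custom event to EventBridge and long-polls an SQS queue that a routing rule is assumed to target. The event bus, source, detail type, and queue URL are hypothetical placeholders.

```python
# Minimal boto3 sketch of an event-driven flow: publish to EventBridge,
# then consume from an SQS queue that a rule is assumed to route into.
# Bus name, source, detail type, and queue URL are hypothetical.
import json
import boto3

events = boto3.client("events")
sqs = boto3.client("sqs")

QUEUE_URL = "https://sqs.ap-south-1.amazonaws.com/123456789012/example-orders-queue"

# Publish a custom domain event to a (hypothetical) event bus.
events.put_events(
    Entries=[{
        "EventBusName": "example-bus",
        "Source": "example.orders",
        "DetailType": "OrderCreated",
        "Detail": json.dumps({"order_id": "o-123", "amount": 42.5}),
    }]
)

# Long-poll the queue the EventBridge rule is assumed to target.
resp = sqs.receive_message(
    QueueUrl=QUEUE_URL,
    MaxNumberOfMessages=10,
    WaitTimeSeconds=10,
)

for msg in resp.get("Messages", []):
    body = json.loads(msg["Body"])
    print("received:", body.get("detail-type"), body.get("detail"))
    # Delete after successful processing so the message is not redelivered.
    sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
```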
Nice to Have:
- AWS certifications (AWS Well-Architected, AWS Solutions Architect, AWS Data Analytics, AWS DevOps).
- Experience working with modern data lakehouse patterns and optimized data formats (Parquet, Avro, etc.).
- Familiarity with Docker, ECS, or EKS for containerized deployments.
- Exposure to monitoring and logging tools (e.g., CloudWatch, Prometheus, Grafana).