Description
We have an opportunity to impact your career and provide an adventure where you can push the limits of what's possible.
As a Lead Software Engineer - Cloud Data/Python/Java/Spark at JPMorgan Chase within Consumer and Community Banking's Home Lending and Servicing team, you will play a crucial role in an agile team dedicated to enhancing, building, and delivering reliable, market-leading technology products that are secure, stable, and scalable.
Job responsibilities
- Executes creative software solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems
- Develops secure, high-quality production code, and reviews and debugs code written by others
- Identifies opportunities to eliminate or automate remediation of recurring issues to improve overall operational stability of software applications and systems
- Leads evaluation sessions with external vendors, startups, and internal teams to drive outcomes-oriented probing of architectural designs, technical credentials, and applicability for use within existing systems and information architecture
- Leads communities of practice across Software Engineering to drive awareness and use of new and leading-edge technologies
- Adds to team culture of diversity, opportunity, inclusion, and respect
- Architects, develops, and deploys scalable data pipelines and solutions on AWS using services such as S3, Redshift, Glue, EMR, Lambda, and Athena (see the pipeline sketch after this list)
- Creates and optimizes data models, builds robust ETL processes, and ensures efficient data ingestion, transformation, and storage
- Monitors, troubleshoots, and tunes data pipelines and cloud resources for optimal performance, reliability, and cost efficiency
- Maintains comprehensive documentation of data architectures, processes, and best practices; mentors junior engineers and shares knowledge within the team
- Stays current with AWS advancements, evaluates new services and tools, and drives continuous improvement in cloud data engineering practices
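To give a concrete, though hypothetical, flavor of the pipeline work described above, the sketch below shows a minimal PySpark batch ETL job: it ingests raw CSV loan records from S3, applies a simple cleanup and derivation step, and writes partitioned Parquet back to S3 for downstream querying (e.g., via Athena). The bucket names, paths, and column names are illustrative assumptions, not part of this role's actual systems; a production job would run on EMR or Glue with proper configuration, monitoring, and error handling.

```python
# Minimal PySpark batch ETL sketch. Buckets, paths, and columns are hypothetical.
# On EMR or Glue, s3:// paths resolve through the managed S3 connector; running
# locally would require hadoop-aws and s3a:// paths instead.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("home-lending-etl-sketch").getOrCreate()

# Ingest: read raw CSV records from the (hypothetical) landing bucket.
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("s3://example-raw-bucket/loans/")
)

# Transform: enforce types, drop unusable rows, and derive a partition column.
cleaned = (
    raw
    .withColumn("loan_amount", F.col("loan_amount").cast("double"))
    .filter(F.col("loan_amount").isNotNull())
    .withColumn(
        "origination_month",
        F.date_format(F.col("origination_date"), "yyyy-MM"),
    )
)

# Store: write partitioned Parquet to the curated zone for Athena or
# Redshift Spectrum queries.
(
    cleaned.write
    .mode("overwrite")
    .partitionBy("origination_month")
    .parquet("s3://example-curated-bucket/loans/")
)

spark.stop()
```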
Required qualifications, capabilities, and skills
- Formal training or certification on software engineering concepts and 5 years of applied experience
- Hands-on practical experience delivering system design, application development, testing, and operational stability
- Hands-on professional experience in one or more programming languages, including Java or Python; proficiency in Python, SQL, and at least one additional language (e.g., Java or Scala) for data engineering tasks
- Hands-on experience using Apache Spark for large-scale data processing, including developing and optimizing data pipelines, performing real-time and batch analytics, and leveraging Spark's libraries for machine learning and data transformation to drive actionable business insights
- Experience with infrastructure automation tools such as AWS CloudFormation or Terraform (a minimal infrastructure-as-code sketch follows this list)
- Proficiency in all aspects of the Software Development Life Cycle
- Advanced understanding of agile methodologies such as CI/CD, Application Resiliency, and Security
- Demonstrated experience in system design, including architecting scalable and reliable solutions, selecting appropriate technologies, defining system components and their interactions, and ensuring alignment with business requirements and performance goals through detailed documentation and collaborative design reviews
- Experience with AWS services related to data storage and management, such as Aurora, DynamoDB, S3, and RDS
- Demonstrated proficiency in software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.)
- AWS Certifications
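As a hedged illustration of the infrastructure-automation qualification above, the sketch below uses the AWS CDK for Python, which synthesizes CloudFormation templates from code, to declare a single versioned, encrypted S3 bucket; hand-written CloudFormation YAML or Terraform HCL would express the same resource. The stack and bucket names are hypothetical.

```python
# Infrastructure-as-code sketch using the AWS CDK v2 for Python, which
# synthesizes a CloudFormation template. Stack and bucket names are hypothetical.
from aws_cdk import App, RemovalPolicy, Stack, aws_s3 as s3
from constructs import Construct


class DataLakeStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Versioned, server-side-encrypted bucket for raw pipeline data.
        s3.Bucket(
            self,
            "RawDataBucket",
            versioned=True,
            encryption=s3.BucketEncryption.S3_MANAGED,
            removal_policy=RemovalPolicy.RETAIN,
        )


app = App()
DataLakeStack(app, "data-lake-sketch")
app.synth()  # writes the CloudFormation template to cdk.out/
```

Running `cdk synth` produces the template, and `cdk deploy` would provision the stack through CloudFormation.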
Preferred qualifications, capabilities, and skills
- Extensive experience working with relational databases such as Oracle, MySQL, and SQL Server, including designing schemas, writing complex queries, optimizing performance, and ensuring data integrity for a variety of business applications
- Snowflake experience is a plus
- Databricks experience is a plus