Description
You're ready to gain the skills and experience needed to grow within your role and advance your career, and we have the perfect software engineering opportunity for you.
As a Software Engineer II - Data Engineer - Spark, Python, Databricks or AWS EMR at JPMorgan Chase within the Commercial & Investment Bank, you'll be part of an agile team that works to enhance, design, and deliver the software components of the firm's state-of-the-art technology products in a secure, stable, and scalable way. As an emerging member of a software engineering team, you execute software solutions through the design, development, and technical troubleshooting of multiple components within a technical product, application, or system, while gaining the skills and experience needed to grow within your role.
Job responsibilities
- Design, develop, and maintain scalable data pipelines and ETL processes.
- Work with large datasets using Spark on Databricks or AWS EMR.
- Write efficient SQL queries for data extraction, transformation, and analysis.
- Collaborate with data scientists, analysts, and other engineering teams to deliver high-quality data solutions.
- Implement data processing workflows on AWS services such as S3, ECS, Lambda, EMR, and Glue.
- Develop and maintain Python scripts for data processing and automation.
- Ensure data quality, integrity, and security across all data engineering activities.
- Troubleshoot and resolve data-related issues in a timely manner.
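As an illustration only, the extract-transform-load pattern behind these responsibilities can be sketched in plain Python. This is a minimal stdlib sketch, not production code: real pipelines of the kind described above would run on Spark via Databricks or AWS EMR, and the table and column names here are invented for the example.

```python
import sqlite3

def extract():
    # Extract: in practice this would read from S3 or another source system.
    return [
        {"trade_id": 1, "amount": "1500.00", "currency": "USD"},
        {"trade_id": 2, "amount": "250.50", "currency": "EUR"},
        {"trade_id": 3, "amount": None, "currency": "USD"},  # bad record
    ]

def transform(rows):
    # Transform: enforce basic data quality by dropping rows with
    # missing amounts and casting amounts to floats.
    return [
        (r["trade_id"], float(r["amount"]), r["currency"])
        for r in rows
        if r["amount"] is not None
    ]

def load(rows, conn):
    # Load: write the cleaned rows to a target table.
    conn.execute(
        "CREATE TABLE trades (trade_id INTEGER, amount REAL, currency TEXT)"
    )
    conn.executemany("INSERT INTO trades VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
total = conn.execute("SELECT SUM(amount) FROM trades").fetchone()[0]
print(total)  # sum of the two valid amounts
```

On Spark the same shape appears as DataFrame reads, transformations, and writes, with the quality filter expressed as a `filter`/`dropna` step instead of a list comprehension.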
Required qualifications, capabilities, and skills
- Formal training or certification on software engineering concepts and 2 years of applied experience.
- Proven expertise in Data Engineering with Spark.
- Hands-on experience with Databricks or AWS EMR.
- Strong knowledge of SQL and database concepts.
- Experience in ETL and data processing workflows.
- Proficiency in AWS services: S3, ECS, Lambda, EMR/Glue.
- Advanced skills in Python programming.
- Excellent problem-solving and analytical abilities.
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience).
- Strong communication and collaboration skills.
- Ability to work independently and as part of a team.
Preferred qualifications, capabilities, and skills
- Experience with Infrastructure as Code (IaC) using Terraform or CloudFormation.
- Familiarity with writing unit test cases for Python code.
- Knowledge of version control systems such as Bitbucket or GitHub.
- Understanding of CI/CD pipelines and automation tools.
Required Experience:
IC