Description
We have an opportunity to impact your career and provide an adventure where you can push the limits of what's possible.
As a Lead Software Engineer at JPMorgan Chase within the Chief Technology Office - Identity and Access Management - Data Service Engineering team, you are an integral part of an agile team that works to enhance, build, and deliver trusted, market-leading technology products in a secure, stable, and scalable way. As a core technical contributor, you are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.
Job responsibilities
- Execute creative software solutions, design, development, and technical troubleshooting, with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems
- Architect and deliver high-throughput, low-latency data pipelines on Databricks and Apache Spark (Core, SQL, Structured Streaming)
- Develop secure, high-quality production code, and review and debug code written by others
- Define and implement Lakehouse patterns with Delta Lake (ACID transactions, schema evolution, time travel, Z-ordering, compaction) to achieve performance at scale (see the illustrative sketch after this list)
- Manage Databricks compute and cluster configurations: runtime selection, autoscaling, driver/executor sizing, Spark configs, init scripts, cluster policies, pools, and instance profiles; orchestrate jobs with Databricks Workflows to integrate with AWS
- Design secure data ingestion and transformation on AWS using S3 (data lake and lifecycle), Glue (catalog/ETL), IAM and Secrets Manager (RBAC/credentials), CloudWatch (logging/metrics/alerting), Lambda (serverless utilities), and Kinesis/Kafka/MSK (streaming)
- Implement data quality, lineage, and governance with Unity Catalog and/or Glue Catalog; embed expectations and validations within pipelines
- Optimize Spark performance and cost via partitioning strategies, file sizing, AQE, broadcast joins, shuffle tuning, caching, spill/memory management, and job right-sizing
- Lead evaluation sessions with external vendors, startups, and internal teams to drive outcomes-oriented probing of architectural designs, technical credentials, and applicability for use within existing systems and information architecture
- Collaborate with platform, security, and networking teams to enforce encryption, network controls, and least-privilege access; ensure compliance with organizational policies
- Add to team culture of diversity, opportunity, inclusion, and respect
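To give context for the Delta Lake and Spark tuning features named in the responsibilities above, here is a minimal, purely illustrative PySpark sketch of a Delta write with schema evolution, Z-order compaction, time travel, and a few AQE settings. The S3 paths, table, and column names are hypothetical placeholders, not part of any actual pipeline.

```python
# Illustrative sketch only: hypothetical paths and columns. On Databricks a
# SparkSession with Delta support already exists; the builder below is for
# completeness when running the sketch elsewhere.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder.appName("iam-data-pipeline-sketch")
    # Adaptive Query Execution: coalesce shuffle partitions and handle skew at runtime
    .config("spark.sql.adaptive.enabled", "true")
    .config("spark.sql.adaptive.coalescePartitions.enabled", "true")
    .config("spark.sql.adaptive.skewJoin.enabled", "true")
    # Broadcast small dimension tables; the threshold is workload-dependent
    .config("spark.sql.autoBroadcastJoinThreshold", str(64 * 1024 * 1024))
    .getOrCreate()
)

# Hypothetical source: raw access events landed in S3 as JSON
events = spark.read.json("s3://example-bucket/raw/access_events/")

# Simple transformation: derive a partition-friendly date column
curated = events.withColumn("event_date", F.to_date("event_ts"))

# Delta Lake write with schema evolution (mergeSchema) into a partitioned location
(
    curated.write.format("delta")
    .mode("append")
    .option("mergeSchema", "true")
    .partitionBy("event_date")
    .save("s3://example-bucket/curated/access_events_delta/")
)

# Compaction and Z-ordering typically run as separate maintenance jobs
spark.sql(
    "OPTIMIZE delta.`s3://example-bucket/curated/access_events_delta/` "
    "ZORDER BY (user_id)"
)

# Time travel: read an earlier version of the table for audits or backfills
previous = (
    spark.read.format("delta")
    .option("versionAsOf", 0)
    .load("s3://example-bucket/curated/access_events_delta/")
)
```

Cluster-level concerns such as autoscaling, pools, instance profiles, and cluster policies would normally be applied through cluster configuration or Databricks Workflows rather than in pipeline code.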
Required qualifications, capabilities, and skills
Preferred qualifications, capabilities, and skills
- Experience with Databricks: Delta Live Tables and advanced governance (catalogs, grants, auditing)
- Practical cloud-native experience using AWS networking and egress: VPCs, subnets, routing, security groups, and data egress controls
- Experience with IaC, CI/CD, and test automation: Terraform for infrastructure deployments; Git workflows and artifact management; testing frameworks such as pytest and JUnit (see the sketch after this list)
- Knowledge of cost optimization: autoscaling strategies, spot vs. on-demand instances, auto-termination, storage layouts, and compaction
- Experience driving real-time ingestion with Kafka/MSK or Kinesis Data Streams/Firehose
- Understanding of observability for data systems: metrics, lineage, SLAs, and alerting
- Experience in financial services or similarly regulated environments
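As a minimal illustration of the testing-framework qualification above, the following pytest sketch exercises a small, hypothetical PySpark transformation against a local SparkSession; the function and column names are assumptions made for the example.

```python
# Illustrative sketch only: hypothetical transformation and columns.
import pytest
from pyspark.sql import SparkSession
from pyspark.sql import functions as F


def add_event_date(df):
    """Hypothetical transformation under test: derive a date column from a timestamp string."""
    return df.withColumn("event_date", F.to_date("event_ts"))


@pytest.fixture(scope="session")
def spark():
    # Small local session for unit tests; no cluster required
    return SparkSession.builder.master("local[1]").appName("unit-tests").getOrCreate()


def test_add_event_date(spark):
    df = spark.createDataFrame([("u1", "2024-01-15 10:30:00")], ["user_id", "event_ts"])
    result = add_event_date(df)
    assert "event_date" in result.columns
    assert result.first()["event_date"].isoformat() == "2024-01-15"
```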
Required Experience:
IC