Join Us at 55ip & Help the Wealth Management Industry Move Forward
Working at 55ip, a separately branded subsidiary of J.P. Morgan, means standing at the intersection of finance and technology, and at the cutting edge of wealth and asset management. We've been making rapid progress on our mission: to break down barriers to financial progress for financial advisors and their clients. Our Boston-, New York-, and Mumbai-based teams have built and brought to market a tax-smart investment strategy engine, portfolio trading and rebalancing, and advisor transition services, all delivered through an intuitive experience and intelligent automation. Driven by strategic partnerships with world-class wealth and asset management firms, we've experienced breakthrough growth over the last two years. Today, over 400 financial advisor firms have trusted 55ip with over $100 billion in assets under supervision.
As 55ip continues its rapid growth trajectory, we are on the path to managing over $1 trillion in assets under supervision (AUS) by 2030. This scale brings significant challenges and opportunities, including supporting exponential increases in accounts, net flows, revenue, and pre-tax income. Our technology platform must evolve to handle this growth, ensuring robust performance, reliability, and automation for tax-smart investing portfolios across diverse product mixes and client segments.
Lead Data Engineer
55ip's Quantitative Research & Development team is looking for a lead data engineer to take part in building its research data platform. The ideal candidate has a background in implementing effective large-scale data solutions in a cloud-based environment, and is a motivated problem solver, a team player, and an effective engineer looking to make a significant impact.
Responsibilities
- Build high-performance, cloud-based data solutions to support quantitative research capabilities: backtesting, simulations, machine learning, and other advanced analytics and algorithms
- Implement, schedule, and oversee ETL, ELT, and monitoring processes
- Oversee the loading of data from vendor and internal data sources
- Implement and support automation solutions to improve platform scalability
- Monitor system performance through regular testing, troubleshooting, and integration of new features
- Provide L2 support for data systems to stakeholder teams
- Collaborate with technical teams to design and implement data solutions
- Recommend solutions to improve new and existing data systems
Requirements
- Bachelor's degree in information systems, information technology, computer science, or a similar field
- Expertise in Structured Query Language (SQL); experience with PostgreSQL specifically is a plus
- Expertise in Python, Pandas/Polars, and NumPy
- Clear understanding of OOP and software design constructs within a Python context
- Expertise with AWS cloud services and technologies such as RDS, S3, Lambda, and Secrets Manager
- Expertise in all aspects of the software development lifecycle, especially Agile Scrum
- Strong experience implementing and monitoring ETL processes
- Strong experience with database technologies, architecture, performance tuning, and scaling
- Strong experience in database design and modeling
- Experience with Airflow or similar orchestration tools
- Experience with CI/CD tools for application and database development
- Excellent communication (written and oral) and presentation skills
- Ability to work with quant researchers & developers to understand requirements
- Strong attention to detail, pride in delivering high-quality work, and willingness to learn
- Agility: Able to shift gears and react quickly to timely requests
Preferred Qualifications
- Experience working with Data Lakes (Iceberg/DeltaTable)
- Experience working with DuckDB, Spark, and EMR
- Experience supporting development of AI/ML models
- Exposure to financial capital markets data and trading applications; experience working with investment data
- AWS certifications are a plus
- Experience with project workflow tools such as Jira in an Agile-Scrum environment
- Knowledge of Linux
Required Experience:
IC