About our client:
With a focus on Africa, China, the UK and the US, this global investment firm offers you an opportunity to be involved with a unique approach to responsible investing whilst driving performance and innovation. This client is guided by a philosophy of investing with care and seeks out top achievers who are looking to redefine and shape the future of the investment industry. Our client holds a firm belief that their work goes beyond financial gains and numbers, recognising the impact of their actions on the lives and futures of those they serve. With a specialised focus on emerging markets and a passion for Africa and China, our client offers independent advice, analysis and reporting services to a diverse range of asset owners, investment managers, hedge funds, private equity firms, service providers and brokers.
About the role:
Our client is looking for a keen and motivated Graduate Data Engineer to join their Investment Analytics & Reporting team. You'll be right in the middle of things, working with the analytics, development and investment teams to make sure data flows smoothly for all their performance attribution, exposure and risk reports across global multi-asset portfolios. You'll get to tackle data issues, automate workflows, build solid data pipelines, boost data quality and get hands-on experience with tools like Python, SQL, Databricks and various cloud platforms.
What you will be doing:
- Ensure the accurate, timely and reliable flow of investment data, meeting daily and monthly SLAs. This includes supporting the full data lifecycle - ingestion, validation, transformation and delivery - as well as monitoring, running and validating daily ETL processes across key domains (pricing, holdings, transactions), troubleshooting data breaks and implementing timely fixes.
- Design and contribute to modular, scalable data pipelines, participate in architecture discussions for new investment-data workflows, and partner with development teams on automation, rule refinement and data-model logic. Support the migration of legacy processes to modern platforms (Databricks / Delta Lake) and build scripts and tools to reduce manual work.
- Strengthen operational resilience, improve data quality (via rules-based checks; see the illustrative sketch after this list) and drive automation and optimisation across investment analytics. Support data lineage tracking, metadata management and governance for audits, and maintain comprehensive documentation of data flows and dependencies.
- Work with investment analytics to support reporting cycles and bespoke client deliverables, and collaborate with external vendors (fund administrators, prime brokers) using APIs and file feeds.
- Develop a strong understanding of investment data: financial instruments, portfolio holdings, return calculations and risk metrics.
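By way of illustration only - this is not the client's actual tooling or data model - here is a minimal Python sketch of the kind of rules-based data-quality check on a daily pricing feed described above. The column names, rules and sample data are assumptions for the example.

    import pandas as pd

    # Hypothetical daily pricing extract; column names are illustrative assumptions.
    REQUIRED_COLUMNS = ["security_id", "price_date", "price", "currency"]

    def validate_pricing(df: pd.DataFrame) -> list[str]:
        """Run simple rules-based checks on a daily pricing feed and return any breaks found."""
        breaks = []

        # Rule 1: all expected columns are present.
        missing = [c for c in REQUIRED_COLUMNS if c not in df.columns]
        if missing:
            breaks.append(f"Missing columns: {missing}")
            return breaks  # remaining rules assume the columns exist

        # Rule 2: no missing or non-positive prices.
        bad_prices = df[df["price"].isna() | (df["price"] <= 0)]
        if not bad_prices.empty:
            breaks.append(f"{len(bad_prices)} rows with missing or non-positive prices")

        # Rule 3: no duplicate security/date combinations.
        dupes = df.duplicated(subset=["security_id", "price_date"]).sum()
        if dupes:
            breaks.append(f"{dupes} duplicate security/date rows")

        return breaks

    if __name__ == "__main__":
        sample = pd.DataFrame(
            {
                "security_id": ["ABC", "ABC", "XYZ"],
                "price_date": ["2024-01-31", "2024-01-31", "2024-01-31"],
                "price": [101.5, 101.5, None],
                "currency": ["USD", "USD", "ZAR"],
            }
        )
        for issue in validate_pricing(sample):
            print("DATA BREAK:", issue)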
What our client is looking for:
- A relevant tertiary degree (Engineering, Computer Science, Applied Mathematics, Statistics, etc.)
- Understanding of ETL and data structures, coding ability (Python and SQL preferred), strong Excel skills and familiarity with orchestration tools (e.g. Airflow, Azure Data Factory).
- Strong attention to detail, analytical thinking, a curious mindset with a passion for creative problem-solving, and a willingness to learn and embrace emerging technologies (automation and AI).
- Proactive and organised, able to prioritise multiple tasks and work under pressure during reporting cycles, and comfortable working with large datasets and technical workflows.
- Good communication skills (especially with development and analytics teams) and the ability to document pipelines, data contracts and workflows for reproducibility and team resilience.
- Ability to balance structured analysis with open-ended exploration.
Job ID:
Required Skills:
Graduate Data Engineer, Python, SQL, ETL, Data Pipelines, Databricks, Investment Analytics, Automation, Data Quality