Contract & Full-Time Opportunity Available - No Visa Sponsorship for this role!
Hybrid Work: 2-3 days per week from the client office in Downtown Toronto or Mississauga
KDB/q Developer
Our client is seeking a hands-on developer with experience in KDB/q to support real-time and historical data processing needs within its Markets Technology group. This role focuses on building high-performance analytics platforms to support trading surveillance, market data, and regulatory reporting using in-house and open-source Flink deployments (not Confluent).
Key Responsibilities
- Design, implement, and maintain streaming data pipelines using Apache Flink (open-source, non-Confluent) to process high-volume trade and market data.
- Develop and optimize KDB/q applications for storing and analysing tick data, order books, and market microstructure analytics.
- Collaborate with traders, quants, and risk teams to deliver solutions for real-time decision support and post-trade analysis.
- Integrate Flink-based services with Kafka, internal APIs, and downstream data consumers.
- Support internal infrastructure teams on deployment, monitoring, and tuning of clusters.
Required Qualifications & Experience
- 3 years of experience in KDB/q development for real-time and historical analytics platforms.
- 2 years of experience in query optimization and schema design for large time-series datasets, real-time data processing, and memory and CPU tuning.
- Strong understanding of trading workflows, market data feeds, and risk systems in investment banking.
- Experience with Kafka for stream ingestion and distribution.
- Familiarity with low-latency system design, performance tuning, and distributed computing.
- Strong communication skills and ability to work with global teams across Markets and Risk.
- Solid understanding of Agile methodologies and CI/CD processes.
- Strong problem-solving skills, with the ability to prioritize multiple tasks, set goals, and meet deadlines.
- Excellent communication skills, capable of articulating complex technical concepts in a multicultural team environment.
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
- Prior experience in Equities, FX, or Fixed Income trading technology.
- Familiarity with CI/CD tools and infrastructure-as-code frameworks (e.g., Ansible, Terraform).