Salary: Not Disclosed
1 Vacancy
Job Overview:
We are seeking a skilled Python Developer with hands-on experience in Databricks and Kafka to join our technology team.
The ideal candidate will design, develop, and optimize large-scale data processing pipelines and real-time data streaming solutions to support our trading, risk, and compliance functions.
You will collaborate with business stakeholders and data teams to deliver high-performance data solutions in a fast-paced financial environment.
Responsibilities:
Develop, test, and maintain scalable ETL/ELT data pipelines using Python, PySpark, and Databricks on cloud platforms (a brief pipeline sketch follows this list).
Build and manage real-time data streaming solutions with Kafka to support low-latency data feeds (see the consumer sketch after this list).
Collaborate with quantitative analysts, traders, and risk managers to understand data requirements and deliver effective solutions.
Optimize existing data workflows for performance, reliability, and efficiency.
Implement data quality checks and monitoring mechanisms.
Participate in code reviews, documentation, and knowledge sharing within the team.
Ensure compliance with financial data governance and security standards.
Stay updated with emerging technologies and propose innovative solutions for data processing challenges.
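For illustration only, here is a minimal sketch of the kind of PySpark/Databricks pipeline work the first responsibility describes. All names are hypothetical placeholders invented for this sketch (the /mnt/raw/trades/ input path, the trades_clean output table); none of them come from the role itself.

# Minimal PySpark ETL sketch (all paths and table names are placeholders).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("trades-etl").getOrCreate()

# Extract: read raw trade records from a placeholder landing path.
raw = spark.read.json("/mnt/raw/trades/")

# Transform: deduplicate and apply a simple data-quality filter.
clean = (
    raw.dropDuplicates(["trade_id"])
       .filter(F.col("notional") > 0)
       .withColumn("trade_date", F.to_date("trade_timestamp"))
)

# Load: append to a Delta table, the default table format on Databricks.
clean.write.format("delta").mode("append").saveAsTable("trades_clean")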
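Similarly, a minimal consumer sketch for the Kafka streaming responsibility, using the open-source kafka-python client (one of several possible clients). The topic name, broker address, consumer group, and message fields are assumptions made for the sketch.

# Minimal Kafka consumer sketch using kafka-python (hypothetical names).
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "market-data",                        # placeholder topic
    bootstrap_servers="localhost:9092",   # placeholder broker
    group_id="risk-feed",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for message in consumer:
    tick = message.value
    # Low-latency downstream processing (e.g. risk checks) would go here.
    print(tick["symbol"], tick["price"])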
Required Skills & Qualifications:
10 years of experience in Python development.
Strong experience with Databricks platform and cloud-based data engineering.
Proven expertise in Kafka for building scalable real-time streaming applications.
Knowledge of relational and NoSQL databases (e.g., SQL, Cassandra, MongoDB).
Familiarity with investment banking processes, trading systems, risk management, or financial data workflows.
Good understanding of distributed computing concepts and the big data ecosystem.
Experience with version control systems (e.g. Git) and Agile development methodologies.
Excellent problem-solving skills, attention to detail, and the ability to work under tight deadlines.
Preferred Qualifications:
Experience with other big data tools such as Hadoop, Spark SQL, or Flink.
Knowledge of financial data standards and regulations.
Certification in cloud platforms (AWS, Azure, or GCP).
Previous experience working in a regulated financial environment.
Job Type: Full-time