About Citco:
Since the 1940s, Citco has provided specialist financial services to alternative investment funds, investors, multinationals and private clients worldwide. With over 6,000 employees in 45 countries, we pioneer innovative solutions that meet our clients' evolving needs and deliver exceptional service.
Our continuous investment in learning means our people are among the best in the industry. And our corporate social responsibility programs provide meaningful and fulfilling work in the community.
A career at Citco isn't just a job; it's an opportunity to excel in an environment that genuinely supports your personal and professional development.
About the Role:
You will be working in a cross-functional team using agile methodologies to build and maintain data pipelines on the Databricks Lakehouse Platform for the financial services industry. As a Data Engineer, you'll collaborate with data scientists, analysts and business stakeholders to transform raw financial data into actionable insights. Using modern data engineering practices, you'll develop scalable ETL/ELT processes, implement data quality controls and ensure data governance standards are met. Working within our AWS cloud environment, you'll help build robust data solutions that power critical business operations while maintaining the highest standards of data security and compliance.
Your Role:
Participate in and contribute to all team activities, such as Sprint Planning, Sprint Execution and Daily Scrum
Develop and maintain data pipelines using Databricks Lakehouse and Delta Lake
Implement ETL/ELT workflows using Spark (Python) in the Databricks environment
Work with AWS services (S3, Glue) for data lake storage and catalog management
Create and optimize Spark jobs for efficient data processing and cost management
Build and maintain data quality checks and monitoring systems
Configure and manage Databricks notebooks and jobs
Implement proper security and access controls using Unity Catalog
Participate in code reviews and documentation efforts
Stay current with Databricks features and data engineering best practices
Support real-time data processing using structured streaming when required
About You:
Bachelor's degree in Computer Science, Engineering or a related field
4 years of experience in data engineering
1+ years of hands-on experience with the Databricks platform
Strong programming skills in Python
Experience with Spark and distributed computing
Working knowledge of AWS services (S3, Glue, Lambda)
Experience with Delta Lake and Lakehouse architecture
Familiarity with data modelling and SQL
Understanding of ETL/ELT principles and patterns
Experience with version control systems (Git)
Good communication and collaboration abilities
Experience with CI/CD for data pipelines
Familiarity with Agile development methodologies
Experience with real-time data processing is a plus
Self-motivated with ability to work independently
Our Benefits:
Your well-being is of paramount importance to us and central to our success. We provide a range of benefits, training and education support, and flexible working arrangements to help you achieve success in your career while balancing personal needs. Ask us about specific benefits in your location.
We embrace diversity, prioritizing the hiring of people from diverse backgrounds. Our inclusive culture is a source of pride and strength, fostering innovation and mutual respect.
Citco welcomes and encourages applications from people with disabilities. Accommodation is available upon request for candidates taking part in all aspects of the selection process.
At Citco, we don't just provide bespoke solutions and better results. We’re a true partner dedicated to developing rich, long-term relationships through gold standard services.