We are a leading trading platform that is ambitiously expanding to the four corners of the globe. Our top-rated products have won prestigious industry awards for their cutting-edge technology and seamless client experience. We deliver only the best, so we are always in search of the best people to join our ever-growing, talented team.
We are seeking a Data Engineer to join our Financial Analytics team, developing and maintaining data pipelines, transformation logic, and data quality checks in a complex, multi-jurisdiction financial data platform (PostgreSQL DWH, Airflow orchestration).
Responsibilities
- Implement enhancements and changes to existing reporting processes to improve accuracy, performance, and usability
- Design and develop new reporting pipelines and datasets aligned with business requirements
- Automate data delivery processes
- Identify and implement automated data quality checks (see the sketch after this list)
- Resolve data quality issues
- Collaborate with business stakeholders to gather reporting requirements, clarify logic, and ensure outputs meet expectations
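To give a concrete flavour of the quality-check work above, here is a minimal sketch in Python against PostgreSQL. It is illustrative only, not our production tooling: the schema, table, and column names (core.trades, trade_date, trade_id) and the connection string are hypothetical.

```python
import psycopg2  # assumes the psycopg2 PostgreSQL driver is installed

# Hypothetical checks: fail the pipeline if yesterday's load is empty
# or contains NULL business keys. All object names are placeholders.
CHECKS = {
    "non_empty_load": """
        SELECT COUNT(*) FROM core.trades
        WHERE trade_date = CURRENT_DATE - 1
    """,
    "no_null_keys": """
        SELECT COUNT(*) FROM core.trades
        WHERE trade_date = CURRENT_DATE - 1 AND trade_id IS NULL
    """,
}

def run_checks(dsn: str) -> None:
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute(CHECKS["non_empty_load"])
        if cur.fetchone()[0] == 0:
            raise ValueError("quality check failed: no rows loaded for yesterday")
        cur.execute(CHECKS["no_null_keys"])
        if cur.fetchone()[0] > 0:
            raise ValueError("quality check failed: NULL trade_id values found")

if __name__ == "__main__":
    run_checks("dbname=dwh user=etl")  # placeholder DSN
```

Raising an exception, rather than logging and continuing, is what lets an orchestrator such as Airflow mark the task failed and alert on it.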
Requirements
- 4 years in analytics engineering or similar data-focused roles
- Advanced PostgreSQL: stored procedures and functions, complex CTEs, window functions, SCD2 patterns, query plan analysis and optimisation
- Strong understanding of data warehouse architecture: staging, core, and data mart layers; incremental load patterns; slowly changing dimensions
- Hands-on experience with Apache Airflow: DAG authoring, scheduling, dependency management, and failure handling (a minimal sketch follows this list)
- Proficiency with Git (GitLab or GitHub) and JIRA
- Experience designing and evolving data warehouse architecture and data models
- Track record of building robust maintainable ELT/ETL pipelines in production
- Experience implementing automated data quality checks
- Domain fluency in financial and trading concepts, with the ability to understand requirements and clearly explain implemented logic to business stakeholders
- High degree of autonomy: able to reverse-engineer undocumented systems, identify root causes, and take end-to-end ownership of pipelines and calculation logic
- Comfortable using AI-assisted development tools (e.g. Claude, Copilot, Cursor) to improve productivity
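As a rough illustration of the Airflow skills listed above, the following is a minimal DAG sketch, not a description of our actual pipelines. It assumes Airflow 2.4+ (for the schedule parameter); the dag_id, task names, and placeholder callables are invented for illustration.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("extract: pull raw trade data (placeholder)")

def transform():
    print("transform: apply reporting logic (placeholder)")

def check_quality():
    print("check: validate row counts and null keys (placeholder)")

default_args = {
    "retries": 2,                      # failure handling: retry transient errors
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="daily_trades_reporting",   # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    quality_task = PythonOperator(task_id="quality_check", python_callable=check_quality)

    # dependency management: a simple linear chain
    extract_task >> transform_task >> quality_task
```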
Nice to have
- Hands-on experience with dbt, particularly in the context of migration or adoption initiatives
- Exposure to Snowflake or strong interest in working with it as part of a target data architecture
- Proficiency in Python for scripting automation and data pipeline tooling
- Background in fintech or financial services in any capacity
What you will get in return
- Competitive compensation
- A generous paid leave policy supporting a healthy work-life balance
- Two additional paid days per year dedicated to volunteering and giving back
- Private medical insurance for your peace of mind
- An additional flexible benefits budget allowing you to tailor benefits to your needs
- Flexible working arrangements
- The opportunity to work from almost anywhere in the world for up to 30 days per year
- Annual company-wide events held in locations around the globe
- In-office massages to support wellbeing
Be a key player at the forefront of the digital assets movement, propelling your career to new heights! Join a dynamic and rapidly expanding company that values and rewards talent and initiative, and work alongside one of the most brilliant teams in the industry.
Required Experience:
IC