Our Client
Our client is a fintech software company, part of a leading European financial group, delivering advanced software solutions and data-driven services. They specialise in banking-tech platforms and data engineering, supporting the group's digital-transformation ambitions.
Responsibilities:
Data Pipeline Development
- Design and implement cloud-based data pipelines to transform raw data from multiple sources (e.g. APIs, FTP, SFTP) into structured, consumable formats.
- Architect and deploy high-performance data pipelines (batch and streaming), including monitoring and alerting mechanisms for data analysts and business users.
Data Management
- Build and manage large-scale, complex datasets that meet functional and non-functional business requirements.
- Support data governance initiatives, including metadata cataloguing and documentation.
- Implement functional and non-functional data tests to ensure quality and reliability.
Data Modelling and Analysis
- Design data models and create appropriate database objects such as tables, views, procedures, and scripts.
- Analyze and optimize queries to ensure performance and scalability.
Collaboration & Production Support
- Contribute to monitoring, troubleshooting, and user support activities.
- Collaborate with cross-functional teams to gather requirements and deliver effective solutions.
Continuous Improvement
- Develop code in line with established architectural standards, continuously improving scalability and performance.
- Stay current with industry trends and emerging technologies to evolve data engineering practices.
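The pipeline responsibilities above boil down to transforming raw source data into a structured, consumable format with built-in quality tests. As a purely illustrative sketch of what such a step can look like in practice (the field and function names here are hypothetical, not taken from the posting):

```python
import json
from datetime import date

def transform_records(raw_json: str) -> list[dict]:
    """Transform a raw API payload into a structured, consumable format."""
    records = json.loads(raw_json)
    structured = []
    for rec in records:
        structured.append({
            "account_id": str(rec["id"]),
            "amount_eur": round(float(rec["amount"]), 2),
            "booked_on": rec.get("date", date.today().isoformat()),
        })
    return structured

def validate(records: list[dict]) -> None:
    """A simple functional data test: fail fast on missing keys or bad values."""
    for rec in records:
        assert {"account_id", "amount_eur", "booked_on"} <= rec.keys()
        assert rec["amount_eur"] >= 0, f"negative amount for {rec['account_id']}"

raw = '[{"id": 7, "amount": "19.99", "date": "2024-05-01"}]'
rows = transform_records(raw)
validate(rows)
# rows → [{'account_id': '7', 'amount_eur': 19.99, 'booked_on': '2024-05-01'}]
```

In a production setting, steps like these would typically be wrapped in orchestrated tasks (e.g. Airflow or Prefect, both named in the requirements below) with monitoring and alerting around them.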
Requirements:
Mandatory Skills
- Strong experience developing solutions on Snowflake.
- Hands-on experience with CI/CD practices.
- Experience using orchestration tools (e.g. Step Functions, Airflow, Prefect).
- Familiarity with version control systems such as GitHub or GitLab.
Additional Assets
- Exposure to data visualization tools (e.g. Tableau) is a plus.
- Knowledge of Infrastructure as Code (e.g. CloudFormation, Terraform).
- AWS certification (e.g. Solutions Architect or equivalent) is preferred.
- Snowflake certification (e.g. SnowPro Core) is preferred.