GSCF
GSCF is the leading global provider of working capital solutions. The company empowers companies and their financial institution partners to accelerate growth, unlock liquidity, and manage the risk and complexity of the end-to-end working capital cycle. GSCF's innovative Working Capital-as-a-Service offering combines the power of an end-to-end connected capital technology platform with expert managed services and alternative capital solutions.
GSCF's team of working capital experts operates in over 75 countries, offering a truly global and holistic perspective on solving working capital efficiency challenges.
Visit to learn more.
The Role
We are expanding and elevating our data infrastructure, and we're looking for a highly skilled Data Platform Engineer to join our growing Data Platform team in Budapest.
This is a hands-on, high-impact role where you will design, build, and operate scalable data infrastructure, data warehousing, and lakehouse solutions that power decision-making and client-facing data products across the entire company.
Our Data Platform team is fully internal, colocated in Budapest, and works closely with Product, Operations, Finance, Engineering, and leadership. You won't be siloed; you will be part of a core function that supports the whole business. To succeed here, technical excellence is only half the story. The other half is developing a strong understanding of how our business works and how data can drive our strategy.
If you enjoy ownership, solving complex data challenges, and shaping a modern data ecosystem, we'd love to hear from you.
How You Will Make an Impact:
Data Infrastructure & Architecture
- Build and optimize scalable data warehousing and lakehouse solutions, ensuring data integrity end-to-end
- Create and own robust ELT pipelines connecting multiple internal and external data sources
- Design and implement data models aligned with industry best practices
- Contribute to architectural decisions and bring senior-level technical depth
- Track, structure, and maintain key business metrics and KPIs
Data Governance & Quality
- Build and maintain governance processes: data lineage, metadata, and quality monitoring
- Implement automated data quality checks, validation, and testing
- Define and enforce data standards, naming conventions, and documentation practices
Collaboration & Leadership
- Partner with Product, Operations, Finance, and other teams to understand their data needs
- Mentor and guide team members to elevate engineering standards
- Translate business requirements into robust technical solutions
- Ensure compliance with data governance, privacy, and security standards
- Drive adoption of best practices across the organization
What You Bring to the Team:
Required Experience
- 5 years of hands-on SQL experience (analytical queries, performance tuning)
- 3 years of experience designing and operating large-scale data warehouse systems
- Strong knowledge of data modeling: dimensional modeling, schema design, architecture patterns
- 3 years of experience with a major cloud provider (AWS / Azure / GCP)
- Experience with orchestration tools (Airflow, Dagster, etc.)
- 5 years of professional programming experience (e.g., Python, Java, Scala)
- Experience with modern cloud data platforms: Snowflake, Redshift, Databricks, BigQuery
Preferred Experience
- Deep dbt experience (modeling, testing, CI/CD, performance tuning)
- Hands-on experience integrating dbt with orchestration tools and cloud workflows
Nice to Have
- Ability to understand the business and translate needs into solutions
- Snowflake expertise (data loading, table design, permission models, DDL/DML, views)
- Experience with BI tools (Power BI, Tableau, etc.; DAX, Power Query)
- Infrastructure as code (Terraform, AWS CDK, KDE)
- Streaming/Kafka experience (event-driven architectures, real-time data pipelines, schema design, and operating streaming systems at scale)
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, age, disability, sexual orientation, national origin, or any other category protected by law.
Privacy Notice
Please note that your personal data will be processed in accordance with the GSCF Privacy Notice for Job Candidates. By submitting your application, you acknowledge and confirm that you understand that your personal data will be processed in accordance with the above-mentioned Privacy Notice. Should you have any questions regarding the processing of your personal data by GSCF, please contact us at: .
*Please note: Internally, this position is referred to as Senior Software Engineer - Data & Funding.
Required Experience:
Senior IC