Finmatics is a fast-growing software as a service (SaaS) company that provides artificial intelligence (AI) solutions to thousands of businesses of all sizes. Our mission is to automate financial & accounting processes in order to eliminate unnecessary manual tasks.
As our Data Engineer, you'll be the architect of Finmatics' data infrastructure, bringing together product usage data, transactional data, and CRM insights from HubSpot into a unified, analytics-ready environment.
You'll work hand in hand with the Product, Data Science, Sales, and Marketing teams to build a deep understanding of our customers, measure product impact, and unlock data-driven decision-making across the company.
Your Key Responsibilities
Software Development and Strategy:
- Establish a secure internal data platform supporting analytics across departments
- Build ETL/ELT pipelines that integrate product usage data, Finmatics transaction data, HubSpot CRM, and other sources.
- Ensure data accuracy, quality, and compliance (including GDPR).
- Collaborate with Product to track feature adoption and usage patterns.
- Partner with Data Science to provide clean structured datasets for modeling.
- Support Sales with customer insights for growth and retention strategies.
- Monitor, troubleshoot, and continuously improve data pipelines and infrastructure performance.
- Use modern tools, including LLMs and other high-productivity technologies.
Collaboration & Sharing:
- Mentor and coach junior colleagues in technical competencies.
- Collaborate with product managers and designers to define and prioritize features.
- Foster a culture of continuous learning and knowledge sharing within the team.
Agile Processes and Delivery:
- Follow agile practices, participating actively in sprint planning, reviews, and retrospectives.
- Ensure timely project delivery while maintaining and promoting high-quality standards.
Required Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent practical experience.
- 3 years of experience as a Data Engineer or in a similar role.
- Proven expertise with data warehouse technologies (Snowflake, BigQuery, Redshift, Databricks, or similar).
- Strong skills in SQL and Python (or another scripting language).
- Experience with ETL/ELT tools (dbt, Airflow, Fivetran, Stitch, etc.).
- Cloud platform experience (AWS, GCP, or Azure).
- Strong understanding of data modeling for analytics and customer data platforms.
- Knowledge and active use of modern LLM-supported development tools.
- Excellent communication skills in English; German business fluency is a strong bonus.
Nice to Have
- SaaS scale-up experience.
- Knowledge of real-time data streaming (Kafka, Flink).
- Familiarity with customer data platforms and churn/retention/product analytics.
- Expertise in DevOps practices and tools such as Docker, Kubernetes, and Jenkins.
- Strong communication and interpersonal skills.
- Experience in coaching and developing junior team members.
What We Offer
Be part of shaping the future of financial workflows using the latest technical possibilities.
- Pleasant working atmosphere in the middle of Vienna
- Possibility to work remotely
- Open and honest corporate culture
- Independent work and short decision-making paths
- Motivated dynamic team and flat hierarchies
- Additional benefits, including a subsidized lunch menu in our office (Schrankerl), a selection of free fruit and snacks, and company and team events
- Flexible working hours
- Job ticket for Wiener Linien
- The minimum salary for this position is EUR 65,000; overpayment is possible depending on qualifications
Join us and help shape the future of data engineering!