Salary Not Disclosed
1 Vacancy
We're looking for a Data Engineer who is not only technically strong but also thrives in ambiguity, can work through legacy constraints, and is passionate about making data reliable, timely, and actionable.
As a Data Engineer, you'll be responsible for designing, maintaining, and evolving our modern data stack - including Snowflake, Airbyte, dbt, and orchestration tools such as Mage and N8N. You'll work closely with data scientists, analysts, and business stakeholders to improve how we collect, model, and serve data across the organization.
This role is ideal for someone who enjoys building reliable systems, working with cross-functional teams, and ensuring data is timely, accurate, accessible, and actionable.
Design and maintain scalable, reliable data pipelines for batch and real-time ingestion using Airbyte, REST APIs, SFTP, and other sources.
Develop, maintain, and optimize data models in SQL and NoSQL databases and in Snowflake using dbt, with a focus on robust data models (e.g., dimensional modeling, Data Vault, OBT strategies).
Write clear documentation, implement tests to ensure data quality, and keep code under version control.
Build and manage data orchestration workflows with Jenkins, Mage, and N8N to ensure reliable automation and timely data availability.
Monitor and manage data quality, integrity, and performance (a minimal illustrative check is sketched after this list).
Administer and optimize Snowflake, including IAM management, cost optimization, and ensuring data integrity and security.
Develop and actively contribute to the architecture of the future-state data platform.
Work with stakeholders to define data requirements and ensure models and pipelines meet business and compliance needs.
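To give a concrete flavor of the data-quality monitoring mentioned above, here is a minimal sketch of a freshness check against Snowflake in Python. It is purely illustrative: the table (fct_orders), column (loaded_at), warehouse/database/schema names, and the six-hour threshold are all hypothetical, and the Snowflake Python connector is just one of several reasonable ways to run such a check.

```python
# Minimal, illustrative freshness check against Snowflake.
# All object names and the SLA threshold below are hypothetical.
import os
import snowflake.connector

FRESHNESS_LIMIT_HOURS = 6  # hypothetical SLA for the example table


def check_freshness() -> None:
    # Credentials come from environment variables rather than being hard-coded.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",  # hypothetical warehouse
        database="ANALYTICS",      # hypothetical database
        schema="MARTS",            # hypothetical schema
    )
    try:
        cur = conn.cursor()
        # Hours since the newest record landed in the (hypothetical) orders table.
        cur.execute(
            "SELECT DATEDIFF('hour', MAX(loaded_at), CURRENT_TIMESTAMP()) "
            "FROM fct_orders"
        )
        lag_hours = cur.fetchone()[0]
        if lag_hours is None or lag_hours > FRESHNESS_LIMIT_HOURS:
            raise RuntimeError(f"fct_orders is stale: lag={lag_hours}h")
        print(f"fct_orders is fresh: lag={lag_hours}h")
    finally:
        conn.close()


if __name__ == "__main__":
    check_freshness()
```

In practice a check like this would typically live in a dbt test or an orchestration workflow (Mage, N8N, Jenkins) rather than a standalone script, so failures alert the team automatically.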
A chance to help shape the future of data in a company where your work will have immediate impact.
Real-world technical challenges - not everything is perfect, but you'll have the opportunity to fix and improve things.
Your pipelines and models will directly support critical workflows such as regulatory reporting, financial reconciliation, and machine learning operations.
Be part of a collaborative, impact-driven team of experienced and supportive professionals.
Hybrid work model (we have nice offices in Prague - Karlín).
Home office reimbursement.
Informal and pleasant atmosphere - we all know and respect each other, and we also have a pack of dogs.
Promoting a healthy lifestyle - we offer flexible working hours, drinks and fruit daily in the office, team events, up to 34 days of vacation, an additional 5 days of paid leave for parents after childbirth, etc.
5 years in Data Engineering or a similar role in a modern data environment (not necessarily a perfect one, but one moving in the right direction).
Expertise in:
SQL (PostgreSQL, MySQL, MariaDB; complex queries, performance tuning)
Python for data transformation, automation, and integration workflows
dbt (including testing, macros, and packages)
NoSQL databases (MongoDB, DynamoDB, Cassandra, or similar) for flexible, high-volume data storage.
Snowflake (or similar cloud data warehouse)
Airbyte or similar ingestion tools (Fivetran, Stitch)
Orchestration tools like Mage, N8N, Airflow, or similar
Experience working with cloud platforms (e.g., GCP, AWS, or similar)
Familiarity with data governance, version control (Git), CI/CD, and documentation
Strong understanding of data architecture principles and scalable design patterns
Experience in a regulated environment (e.g., fintech, banking)
Exposure to financial data (PnL, credit provisioning, etc.)
Understanding of data security IAM and compliance requirements in cloud environments
Full-Time