Highlights
Remote-first role open to candidates from Brazil / South America, Turkey, and Northern Africa
Part of a data-driven SaaS platform focused on large-scale data collection, integration, and quality
Fully remote
Flexible working hours within a small international team
About the company
Our client is a Berlin-based remote-first SaaS company developing data-driven products for international clients.
They combine cutting-edge data technologies with a culture of freedom, ownership, and collaboration.
You'll join a small but highly skilled international team that values innovation, transparency, and impact.
The company offers an environment where engineers can shape architecture decisions, work closely with product and data teams, and directly see the results of their work in production.
Role Overview
We're looking for a Data Engineer / Python Developer who enjoys solving complex problems, building reliable data systems, and working in a hands-on, collaborative environment.
You'll be part of a data team responsible for data collection, integration, and quality, contributing directly to the evolution of the company's core data infrastructure.
Responsibilities
Develop high-quality, maintainable Python code for web scraping and data processing
Scale and maintain asynchronous data collection systems in production
Build and support REST APIs and internal data interfaces
Contribute to DevOps tasks (Docker, CI/CD, Linux) and conduct code reviews
Work with large datasets, ensuring data accuracy and consistency
Participate in planning, estimation, and technical decisions
Continuously improve workflows and implement automation
Requirements
Solid hands-on experience with Python, Git, and object-oriented programming
Experience working with SQL and NoSQL databases
Familiarity with web scraping frameworks (Scrapy, Playwright, BeautifulSoup, or similar)
Experience with REST APIs (FastAPI, Flask, or Django REST)
Understanding of DevOps tools: Docker, CI/CD, Linux
Comfortable working in agile, collaborative environments
Fluent in English (written and spoken)
Nice to have:
Experience with asynchronous systems (asyncio, Celery, etc.)
Basic familiarity with cloud environments (AWS, GCP, or similar)
Pragmatic mindset and focus on delivering value
What's in it for you
Competitive compensation
Flexible working hours
Fully remote work within the listed regions
Modern tech stack and real ownership over your work
Regular team meetups (both online and on-site)
Strong learning culture and support for your personal and professional development
Interested?
Apply now with your CV and salary expectations.
We'll be happy to share more details about the company and the hiring process during our initial conversation.