Founded in 2013, Voodoo is a tech company that creates mobile games and apps with a mission to entertain the world. With 800 employees, 7 billion downloads, and over 200 million active users, Voodoo is the #3 mobile publisher worldwide in terms of downloads, after Google and Meta. Our portfolio includes chart-topping games like Mob Control and Block Jam, alongside popular apps such as BeReal and Wizz.
Team
The Engineering & Data team builds innovative tech products and platforms to support the impressive growth of Voodoo's gaming and consumer apps, allowing the company to stay at the forefront of the mobile industry.
Within the Data team, you'll join the AdNetwork Team, an autonomous squad of around 30 people. The team is composed of top-tier software engineers, infrastructure engineers, data engineers, mobile engineers, and data scientists (including 3 Kaggle Masters). Its goal is to enable Voodoo to monetize its inventory directly with advertising partners, relying on advanced technological solutions to optimize advertising in a real-time bidding environment. It is a strategic topic with significant impact on the business.
This role can be performed fully remotely from any EMEA country.
Role
Design, develop, and maintain scalable, secure, and high-performance data platforms.
Build and manage data pipelines (ETL/ELT) using tools such as Apache Airflow, DBT, SQLMesh, or similar.
Architect and optimize lakehouse solutions (e.g. Iceberg).
Lead the design and implementation of data infrastructure components (streaming, batch processing, orchestration, lineage, observability).
Ensure data quality, governance, and compliance (GDPR, HIPAA, etc.) across all data processes.
Automate infrastructure provisioning and CI/CD pipelines for data platform components using tools like Terraform, CircleCI, or similar.
Collaborate cross-functionally with data scientists, analytics teams, and product engineers to understand data needs and deliver scalable solutions.
Mentor experienced data engineers and set best practices for code quality testing and platform reliability.
Monitor and troubleshoot performance issues in real-time data flows and long-running batch jobs.
Stay ahead of trends in data engineering, proactively recommending new technologies and approaches to keep our stack modern and efficient.
Profile (Must have)
Extensive experience in data engineering or platform engineering roles.
Strong programming skills in Python and Java.
Strong experience with modern data stacks (e.g. Spark, Kafka, DBT, Airflow, Lakehouse).
Deep understanding of distributed systems, data architecture, and performance tuning.
Experience with cloud platforms (AWS, GCP, or Azure) and Infrastructure-as-Code tools (Terraform, CloudFormation, etc.).
Solid experience operating data services in Kubernetes, including Helm, resource tuning, and service discovery.
Strong understanding of data modeling, data governance, and security best practices.
Knowledge of CI/CD principles and DevOps practices in a data environment.
Excellent problem-solving, communication, and leadership skills.
Nice to Have
Experience with real-time data streaming and event-driven architectures.
Familiarity with ML model deployment and MLOps practices.
Exposure to data cataloging, lineage tools, and observability platforms.
Contributions to open-source data tools or platforms.
Benefits
Best-in-class compensation
Other benefits depending on your country of residence