IN A NUTSHELL
Do you want autonomy to solve problems that matter? To work with a modern data stack (that actually works)? To have direct impact on a global brand's success?
This role has all three. Our data infrastructure (including Kafka, AWS/GCP, Python, R) is solid, our problems are interesting, and your work will directly accelerate how we operate and make decisions. This isn't about maintaining pipelines; it's about turning data into a strategic enabler for the entire business.
So what's the actual work? Diving deep into complex pipeline issues while contributing to our strategic direction and architecture. Bridging engineering excellence and business impact through collaboration across every level of the organization. Strengthening our engineering function alongside a talented team that brings both deep institutional knowledge and fresh perspectives, in a team and company culture that's (very) low on politics and ego. The goal: building systems that don't just serve the business, but propel it forward.
If you're the kind of engineer who gets genuinely excited about turning chaos into order, who sees tech debt reduction as a strategic opportunity, and who believes the best engineering happens when you're embedded with the business teams who depend on your work, then we should talk. If you prefer pure technical work without collaborating with the stakeholders who will use your data, or pure management without getting your hands dirty in the code, this role isn't for you. We need someone who thrives in both worlds and owns the results.
Join a team that thinks deeply about craft, chooses long-term excellence over quick fixes, and solves problems that genuinely matter. Join our culture of smart people with good intentions who get s**t done. This isn't just another data job; it's where technical depth meets business impact in ways that actually matter.
THESE ARE SOME QUALITIES YOU MUST POSSESS
- At least 4 years of experience in data engineering roles with demonstrated progression toward technical leadership
- Advanced SQL expertise and hands-on experience building and optimising data pipelines using modern orchestration tools (Apache NiFi, Airflow, or similar)
- Strong debugging skills with the ability to perform root cause analyses on complex data pipeline issues
- Proven ability to maintain and evolve production data infrastructure in cloud environments (AWS/GCP)
- Track record of improving data quality, reliability, and operational excellence at scale
- Experience working directly with business stakeholders to understand their needs and translate them into data solutions
- Demonstrated ownership mindset: you see problems through to resolution rather than passing them along. Our systems are stable enough that critical out-of-hours issues come up only a handful of times a year, and we want to keep it that way
- Exceptional precision in your work, knowing how to verify correctness even when dealing with unreliable data
WE'D BE EXCITED IF YOU HAVE (but please apply even if you don't):
- Hands-on experience with our specific stack: Kafka, Python, R, PostgreSQL, AWS Aurora, Google BigQuery
- Background in physical product companies - you understand e-commerce, inventory, and supply chain data challenges
- A track record of successfully mentoring engineers and helping teams level up their technical capabilities
- Project management experience with data initiatives including prioritization frameworks
- Implementation knowledge of data governance, privacy standards, and security best practices
- Proven ability to build and maintain data science infrastructure (feature stores, ML pipelines)
- A genuine passion for data warehouse organization that borders on the obsessive (in the best way)
- Contributions to data engineering communities, open source projects, or technical writing
IF YOU WERE HERE LAST WEEK YOU MIGHT HAVE:
- Paired with an engineer to debug a data lag issue, using it as an opportunity to document the solution approach in the team wiki
- Noticed Amazon data quality alerts recurring, dug into the root causes, and built better monitoring to catch issues earlier, because we'd rather invest time improving our defenses than constantly firefighting
- Triaged the engineering backlog with your team and identified a solution that fixes supplier data quality issues while migrating a NiFi flow to dbt
- Partnered with Finance to define requirements for a new reporting initiative, negotiating for an extra week to build it right rather than rushing a fragile solution that would break later
- Coached an engineer through prioritisation challenges during your one-on-one, helping them communicate a clear plan to stakeholders
- Shaped the data engineering roadmap with other managers, balancing big upcoming projects with critical infrastructure improvements
- Demoed your BigQuery MCP experiments for lineage analysis at the monthly Data Carousel, then learned from the data science team's causal inference work on marketing spend
WHY WORK FOR BELLROY
At Bellroy, it takes a wonderfully diverse crew to make everything tick. We're a close-knit group of thinkers and makers from over 25 different countries, each contributing unique skills to achieve our shared vision. We believe that embracing diverse backgrounds and perspectives is key to staying agile and resilient. So even if your experience isn't an exact match but you feel you have something special to contribute, we'd love to hear from you.
Bellroy is committed to making our hiring process accessible to everyone, including individuals with disabilities. If you need reasonable accommodations at any stage, whether it's applying, interviewing, completing pre-employment testing, or otherwise participating in the selection process, please contact us at Include your full name, the best way to reach you, and the type of accommodation you need to support you throughout the application process. We're here to help and ensure you have the best possible experience.