Our client's people-first mindset has led them to invest in technology and teamwork so they can take care of the environment while also supporting the growth and success of their employees. If you're passionate about sustainability and looking for a place where your work makes a real difference, you'll feel right at home with them.
Are you excited by the idea of shaping modern data ecosystems? We're searching for a motivated Data Engineer who will play a key role in building scalable analytics infrastructure. This position offers both technical challenges with large, complex datasets and the creative autonomy of a startup setting.
Your Impact
Architect and maintain resilient data pipelines handling both real-time events and large batch jobs.
Enhance our data lake platform to support efficient data storage, discoverability, and retrieval.
Collaborate closely with product managers, analysts, and engineers to translate requirements into scalable solutions.
Apply Infrastructure as Code practices using Terraform to automate provisioning and resource management.
Monitor, troubleshoot, and tune performance to ensure smooth pipeline operations.
Stay informed on industry trends and adopt new tools or frameworks that make our platform more powerful.
Take full ownership of initiatives, driving projects from concept through delivery.
Requirements
What You'll Bring
3-5 years of professional experience as a Data Engineer or similar role.
Solid background in AWS services.
Strong programming ability in Python and SQL.
Skills in handling big data workloads with Apache Spark and Delta Lake.
Proficiency in Terraform (Infrastructure as Code).
Experience with CI/CD and DevOps workflows using GitHub, Azure DevOps, or similar tools.
Extra points for working knowledge of Kafka (real-time streaming) and geospatial datasets.
Why You'll Love Working Here
Tackle challenging large-scale datasets that keep you learning and growing.
Contribute directly to shaping technical strategy in a high-energy startup-like culture.
Enjoy autonomy on projects while collaborating with a skilled, passionate team.
Expand your career by leveraging the latest cloud and data technologies.
Be part of an environment that values innovation, professional growth, and your unique perspective.
AWS (Glue, Lambda, EC2, S3), Cloud Platforms (AWS, Azure), Python, SQL, Apache Spark, Delta Lake, Big Data Processing, Terraform, Infrastructure as Code (IaC), CI/CD, DevOps Tools (GitHub, Azure DevOps, Bitbucket, Jenkins), Data Pipelines (Batch, Streaming, Real-Time), Kafka, Event Streaming, Geospatial Data, Data Lake Architecture, ETL Development, Cloud Infrastructure Automation