Salary: Not Disclosed
Vacancies: 1
Location: Anywhere in LATAM
Job Type: Remote
Project: Data Engineering for US-based Health Client
Time Zone: GMT-3 to GMT-5 preferred
English Level: B2 / C1
At Darwoft we build digital products with heart. We're a Latin American tech company focused on creating impactful, human-centered software in partnership with companies around the globe. Our remote-first culture is based on trust, continuous learning, and collaboration.
We're passionate about tech, but even more about people. If you're looking to join a team where your ideas matter and your impact is real, welcome to Darwoft.
You'll be joining a fast-moving, collaborative environment where your role will focus on designing and optimizing data pipelines using Python, Snowflake, Airflow, and AWS. You'll work closely with data scientists and analysts to build scalable solutions that support critical business decisions.
70% Data Pipeline Development
Design and optimize ingestion, storage, and transformation pipelines using Python, SQL, Snowflake, and Snowpark (see the sketch after this list)
Build and enhance real-time data pipelines with AWS Lambda and Snowpipe
Collaborate with data scientists and analysts to deliver business-ready datasets
Create internal and external data views (logical, materialized, and secure)
Test and evaluate new features in Snowflake, Airflow, and AWS for proofs of concept
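To give a feel for this kind of work, here is a minimal sketch of an Airflow DAG that copies staged S3 files into Snowflake. It is illustrative only: the DAG id, stage, table, schedule, and connection settings are assumptions, not details of this role or client.

```python
# Illustrative sketch only. All names (dag_id, stage, table, connection
# settings) are assumptions, not part of the job description.
from datetime import datetime

import snowflake.connector
from airflow import DAG
from airflow.operators.python import PythonOperator


def load_events_to_snowflake():
    # Hypothetical connection parameters; in practice these would come from
    # Airflow connections/secrets management, never hard-coded.
    conn = snowflake.connector.connect(
        account="my_account",
        user="etl_user",
        password="***",
        warehouse="ETL_WH",
        database="ANALYTICS",
        schema="RAW",
    )
    try:
        # COPY INTO loads files landed in an external S3 stage into a raw table.
        conn.cursor().execute(
            "COPY INTO RAW.EVENTS FROM @RAW.S3_EVENTS_STAGE "
            "FILE_FORMAT = (TYPE = JSON)"
        )
    finally:
        conn.close()


with DAG(
    dag_id="s3_to_snowflake_events",
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",  # batch micro-ingestion; Snowpipe would cover true streaming
    catchup=False,
) as dag:
    PythonOperator(
        task_id="copy_events",
        python_callable=load_events_to_snowflake,
    )
```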
15% Code Review
Participate in peer code reviews and provide constructive feedback
Maintain clean, efficient, and scalable code
10% Agile Collaboration
Join sprint ceremonies (planning, stand-ups, reviews, retrospectives)
Ensure alignment with stakeholders on deliverables and timelines
5% Release Support
Coordinate deployments with PMs and IT
Ensure smooth release cycles with minimal downtime
5 years of experience as a Data Engineer
Strong expertise with Snowflake and orchestration tools like Airflow
Advanced Python and SQL programming skills
Hands-on experience with AWS services: Lambda, S3, and real-time data streaming
Solid understanding of ELT pipelines, data modeling, and efficient storage strategies
Great communication and collaboration skills
Experience working with healthcare data in the US
Familiarity with data privacy regulations (HIPAA, GDPR)
Experience with Snowpark and Snowflake's data sharing capabilities
Bachelor's degree in Computer Science, Engineering, or a related field
Full Time