Location: Anywhere in LATAM
Job Type: Remote Full Time Contractor
Project: Life Sciences & Healthcare Data Intelligence
Time Zone: EST overlap required
English Level: B2 / C1
At Darwoft, we partner with cutting-edge companies around the world to build digital products that create real impact. One of our clients is a leading Life Sciences and Healthcare data intelligence company that is transforming decision-making by empowering global organizations with advanced analytics, real-time insights, and AI-driven platforms.
Their technology is used by some of the world's top life sciences organizations to gain visibility into the efficacy and impact of scientific engagement, helping to improve patient outcomes at scale.
By joining this team, you'll be contributing to a mission-driven environment where technology, data, and healthcare meet to create meaningful change.
We're seeking a Technical Operations Engineer who will take ownership of the stability, scalability, and reliability of complex data pipelines and infrastructure. This role is highly autonomous and hands-on, requiring someone who thrives in a fast-paced, collaborative environment and has strong expertise in Python, cloud platforms, and data operations.
You'll work across teams to operationalize data architectures, optimize pipelines, and deliver robust solutions that drive both internal efficiency and customer satisfaction.
Responsibilities:
Operational Reliability: Own and ensure the health, stability, and performance of AI-driven data platforms, pipelines, and infrastructure.
Pipeline Optimization: Monitor, troubleshoot, and optimize complex ETL/ELT workflows, ensuring data quality and availability.
Automation: Develop Python-based scripts and tools to automate deployments, workflows, and system maintenance.
Advanced Support: Collaborate with Engineering, Product, and Customer Success to resolve complex operational issues and ensure seamless data delivery.
Documentation: Build and maintain detailed operational runbooks, incident playbooks, and system guides.
Collaboration: Work in an Agile environment with engineers, product managers, and data scientists to operationalize analytics for Life Sciences & Healthcare datasets.
Requirements:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
5 years in Technical Operations, DevOps, or SRE, with a focus on data platforms.
Proven experience managing enterprise-grade data services (data pipelines, data lakes, warehouses).
Expert Python skills for automation and operational tooling.
Strong cloud experience (AWS or GCP), including compute, storage, databases, containerization, and orchestration.
SQL proficiency with BigQuery, Redshift, or Snowflake.
Deep knowledge of ETL/ELT best practices, data governance, and compliance.
Ability to diagnose complex distributed-systems issues, with strong root cause analysis (RCA) skills.
Excellent verbal and written communication, and experience creating technical documentation.
Collaborative, proactive mindset with strong ownership.
Ability to work with stakeholders in the EST time zone.
Nice to have:
Experience in regulated industries (healthcare, finance) and with compliance frameworks (HIPAA).
Experience in the Life Sciences / Healthcare data domain.
Knowledge of MLOps and deploying AI/ML models.
Familiarity with data visualization tools (Looker, Power BI, etc.).
Experience with Elasticsearch or other search technologies.
Understanding of ML frameworks (TensorFlow, PyTorch, scikit-learn, MLflow).
Benefits:
Contractor agreement with payment in USD
100% remote work
Argentina's public holidays
English classes
Referral program
Access to learning platforms