Important Information
Location: Costa Rica, Colombia, Bolivia, Peru, and Argentina
Work Mode: Remote
Job Summary
As a Senior Data Engineer (16579), you will play a key role in building and maintaining our data stack, from ingestion to modeling to warehouse optimization. You will work closely with data scientists, backend engineers, and stakeholders to ensure that our data is clean, reliable, and actionable.
Responsibilities and Duties
- Build and maintain scalable pipelines for ingesting and transforming data from diverse sources.
- Collaborate with product, data science, and engineering to define and prioritize data requirements.
- Implement robust ETL/ELT workflows using Python, SQL, and workflow orchestration tools.
- Design and evolve data models in relational (PostgreSQL) and non-relational environments.
- Maintain and optimize our cloud-based data warehouse (e.g., Redshift, Snowflake, or equivalent).
- Implement data quality checks, monitoring, and validation logic to ensure trusted outputs.
- Help productionize ML model inputs/outputs and support reproducible experimentation.
- Monitor and improve performance of data infrastructure with observability and alerting.
- Ensure secure handling of sensitive data and compliance with relevant data policies.
- Document data flows, architecture, and transformation logic for team transparency.
Qualifications and Skills
- Bachelor's degree in Computer Science, Software Engineering, or a related field.
- 5 years of data engineering experience in production environments.
- Proficiency in Python and SQL, with experience in workflow orchestration (Step Functions, Airflow, Dagster, etc.).
- Hands-on experience with Spark, dbt, or similar tools for scalable transformations.
- Comfort with cloud-native tooling, especially AWS services such as S3, Lambda, RDS, and Glue.
- Understanding of how to model and structure data for analytics, forecasting, and reporting.
- Deep care for data quality, testing, and the long-term maintainability of pipelines.
- Ability to work independently on scoped projects and communicate clearly across time zones.
- Design, build, and manage data pipelines and ETL workflows.
- Develop and maintain core data models, schemas, and warehouse structures.
- Optimize data systems for scalability, reliability, and performance.
- Support downstream consumers, including ML pipelines, dashboards, and internal APIs.
Additional Requirements
- Experience working in a modern data stack with tools like dbt, Snowflake, or Fivetran.
- Exposure to production ML workflows or feature engineering pipelines.
- Familiarity with infrastructure-as-code tools (Terraform, Pulumi) or containerized environments (Docker).
- Background in real estate pricing or time-series forecasting.
About Encora
Encora is a global company that offers Software and Digital Engineering solutions. Our practices include Cloud Services, Product Engineering & Application Modernization, Data & Analytics, Digital Experience & Design Services, DevSecOps, Cybersecurity, Quality Engineering, and AI & LLM Engineering, among others.
At Encora, we hire professionals based solely on their skills and do not discriminate based on age, disability, religion, gender, sexual orientation, socioeconomic status, or nationality.
Required Experience:
Senior IC