- Design, develop, and maintain scalable ETL/ELT data pipelines for AI and analytics applications.
- Optimize data architectures and storage solutions using Databricks, Snowflake, and cloud-based platforms.
- Develop big data processing jobs using PySpark, Spark SQL, and distributed computing frameworks.
- Ensure data quality, governance, and security best practices across all environments.
- Implement CI/CD workflows for automated deployment and infrastructure as code (IaC).
- Collaborate with cross-functional teams (data scientists, software engineers, and analysts) to build end-to-end data solutions.
- Lead troubleshooting and performance tuning efforts for high-volume data processing systems.
- Develop and maintain Python-based backend services to support data infrastructure.
- Implement Apache Airflow, Dataplane, or similar orchestration tools to automate and monitor workflows.
Qualifications:
- Strong proficiency in SQL for data processing and transformation.
- Strong knowledge of object-oriented programming in at least one language (Python, Scala, or Java).
- Hands-on experience deploying and managing large-scale data pipelines in production environments.
- Expertise in workflow orchestration tools such as Apache Airflow, Dataplane, or equivalent.
- Deep understanding of cloud-based data platforms such as Databricks and Snowflake (Databricks preferred).
- Knowledge of CI/CD pipelines and infrastructure as code for data workflows.
- Familiarity with cloud environments (AWS preferred; Azure or GCP) and cloud-native data processing.
- Expertise in Spark, PySpark, and Spark SQL, with a solid understanding of distributed computing frameworks.
- Proven ability to lead projects and mentor junior engineers in a fast-paced, collaborative environment.
What about languages?
Excellent written and verbal English is a must for clear and effective communication!
How much experience must I have?
We're looking for someone with 4 years of experience working as a Data Engineer or in related positions.
Additional Information:
Our Perks and Benefits:
Learning Opportunities:
- Certifications in AWS (we are AWS Partners), Databricks, and Snowflake.
- Access to AI learning paths to stay up to date with the latest technologies.
- Study plans, courses, and additional certifications tailored to your role.
- English lessons to support your professional communication.
Mentoring and Development:
- Career development plans and mentorship programs to help shape your path.
Celebrations & Support:
- Anniversary and birthday gifts.
- Company-provided equipment for remote work.
Regional Flexibility:
- Other benefits may vary according to your location in LATAM.
Remote Work:
Yes
Employment Type:
Full-time