Salary Not Disclosed
1 Vacancy
We're looking for a skilled Data Engineer with strong experience in Python, Terraform, and AWS to help design and implement scalable data architectures for AI and analytics applications. This role is part of a strategic initiative to enable robust data pipelines, infrastructure automation, and cloud-based analytics capabilities for enterprise clients.
In this role, you'll work closely with Blend's internal teams and client stakeholders to build and optimize ETL/ELT workflows, develop backend services, and ensure data quality, governance, and security across all environments.
This position is ideal for engineers with a strong data foundation who are looking to apply their skills in modern cloud ecosystems, big data processing, and infrastructure as code, while gaining exposure to tools like Databricks and Snowflake.
You will:
Design, develop, and maintain scalable ETL/ELT data pipelines for AI and analytics applications.
Optimize data architectures and storage solutions using Databricks, Snowflake, and cloud-based platforms.
Develop big data processing jobs using PySpark, Spark SQL, and distributed computing frameworks.
Ensure data quality, governance, and security best practices across all environments.
Implement CI/CD workflows for automated deployment and Infrastructure as Code (IaC).
Collaborate with cross-functional teams (data scientists, software engineers, analysts) to build end-to-end data solutions.
Lead troubleshooting and performance tuning efforts for data processing systems.
Develop and maintain Python-based backend services to support data infrastructure.
Qualifications:
4 years of experience in data engineering.
Strong proficiency in Python for building data-driven solutions.
Hands-on experience with Terraform for provisioning and managing cloud infrastructure.
Strong experience with Infrastructure as Code (IaC) principles and CI/CD pipelines (Terraform, GitHub Actions).
Experience working in AWS cloud environments (infrastructure and services oriented to data solutions).
Hands-on experience with data processing using Pandas or PySpark.
Deep understanding of cloud-based data platforms such as Databricks and Snowflake (Databricks preferred).
Strong communication and documentation skills.
Ability to thrive in collaborative client-facing environments.
Preferred Qualifications
SQL experience.
Hands-on work with Databricks.
Familiarity with modern CI/CD practices and tooling.
Experience in consultancy or client-service environments.
Language Requirements
You must have excellent written and verbal English communication skills.
Additional Information:
Our Perks and Benefits:
Learning Opportunities:
Mentoring and Development:
Career development plans and mentorship programs to help shape your path.
Celebrations & Support:
Flexible working options to help you strike the right balance.
Other benefits may vary according to your location in LATAM. For detailed information regarding the benefits applicable to your specific location please consult with one of our recruiters.
Remote Work:
Yes
Employment Type:
Full-time