Data Engineer

GM2

Job Location:

Buenos Aires - Argentina

Monthly Salary: Not Disclosed
Posted on: 30+ days ago
Vacancies: 1 Vacancy

Job Summary

  • You will demonstrate sound engineering principles and a good understanding of modern CI/CD toolsets.
  • Design and build data extraction, loading, and transformation pipelines and data products across on-prem and cloud platforms.
  • Perform application impact assessments and requirements reviews, and develop work estimates.
  • Develop test strategies and site reliability engineering measures for data products and solutions.
  • Lead resolution of critical operations issues, including post-implementation reviews.
  • Perform technical data stewardship tasks, including metadata management, security, and privacy by design.
  • Being action-oriented, ensuring accountability, driving results, collaborating, and communicating effectively will be essential.
  • Research, construct, develop, and test complex ETL pipelines.
  • Apply automation and monitoring tools using a test-driven approach and mindset.
  • Develop and maintain scalable data infrastructure that provides an outstanding experience for internal stakeholders and enables consistent value extraction from data.
  • Innovate data ingestion and processing methods to support strategic priorities and facilitate decision-making throughout the organization.
  • Build and scale data infrastructure that powers batch and real-time processing of thousands of records daily.
  • Interface with data scientists, analysts, product managers, and all other data stakeholders to understand their needs and promote best practices.
  • Architect data pipelines that provide fast, optimized, and robust end-to-end solutions.
  • Drive and maintain a culture of quality, innovation, and experimentation.


Requirements

  • 6 years of data engineering experience.
  • 4 years of data platform solution architecture and design experience.
  • You are a collaborative, creative, open-minded individual who possesses a natural curiosity and desire to experiment.
  • Proficiency in:
  • Programming languages (Java/Python), including low-code tools such as Informatica, DataStage, or equivalent.
  • SQL and database proficiency (minimum 3 years).
  • Task automation: setting up DAGs in Control-M, Apache Airflow, Prefect, etc. (see the sketch after this list).
  • Unix scripting.
  • Quality assurance and site reliability engineering.
  • Infrastructure as code (e.g., Terraform, Pulumi, Puppet, Ansible).
  • Data modeling.
  • Working knowledge of MLOps.
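
Below is a minimal, illustrative sketch (not part of the original posting) of the kind of DAG-based task automation named in the requirements, using Apache Airflow's Python API. It assumes a recent Airflow 2.x release; the DAG id, task ids, and placeholder callables are hypothetical.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract(**context):
        # Placeholder: pull raw records from a source system.
        print("extracting records")


    def transform(**context):
        # Placeholder: clean and reshape the extracted records.
        print("transforming records")


    def load(**context):
        # Placeholder: write the transformed records to the warehouse.
        print("loading records")


    # Hypothetical daily ETL DAG; task ordering is declared with the >> operator.
    with DAG(
        dag_id="example_daily_etl",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        load_task = PythonOperator(task_id="load", python_callable=load)

        extract_task >> transform_task >> load_task

The same extract/transform/load ordering could equally be expressed as jobs in Control-M or as a Prefect flow; the sketch only illustrates the dependency-graph idea.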


Company Industry

IT Services and IT Consulting

Key Skills

  • Apache Hive
  • S3
  • Hadoop
  • Redshift
  • Spark
  • AWS
  • Apache Pig
  • NoSQL
  • Big Data
  • Data Warehouse
  • Kafka
  • Scala