We are looking for a Data Engineer responsible for building and maintaining the infrastructure that supports the organization's data architecture.
The role involves creating and managing data pipelines in Airflow for data extraction, processing, and loading, and ensuring their maintenance, monitoring, and stability.
The engineer will work closely with data analysts and end users to provide accessible and reliable data.
Main Tasks and Responsibilities:
- Responsible for maintaining the infrastructure that supports the current data architecture
- Responsible for creating data pipelines in Airflow for data extraction, processing, and loading
- Responsible for data pipeline maintenance, monitoring, and stability
- Responsible for providing data access to data analysts and end users
- Responsible for DevOps infrastructure
- Responsible for deploying Airflow DAGs to the production environment using DevOps tools
- Responsible for code and query optimization
- Responsible for code review
- Responsible for improving the current data architecture and DevOps processes
- Responsible for delivering data in useful and appealing ways to users
- Responsible for performing and documenting analyses, reviews, and studies on specified regulatory topics
- Responsible for understanding business changes and requirement needs, and assessing their impact and cost
Qualifications:
Technical skills:
- Advanced Python (Mandatory)
- Experience in creating APIs in Python, at least with Flask (Mandatory)
- Experience in documenting and testing in Python (Mandatory)
- Advanced SQL skills and relational database management (Oracle is Mandatory; SQL Server and PostgreSQL are desirable)
- Experience with Data Warehouses
- Hadoop ecosystem: HDFS, YARN (Mandatory)
- Spark environment architecture (Mandatory)
- Advanced PySpark (Mandatory)
- Experience in creating and maintaining distributed environments using Hadoop and Spark
- Data lakes: experience in organizing and maintaining data lakes (Mandatory); S3 is preferred
- Experience with the Parquet file format (Mandatory); Avro is a plus
- Apache Airflow: experience in both pipeline development and deploying Airflow in a distributed environment (Mandatory)
- Containerization: Docker (Mandatory)
- Kubernetes (Mandatory)
- Apache Kafka (Mandatory)
- Experience in automating application deployment using DevOps tools (Jenkins is Mandatory; Ansible is a plus)
- Agile methodologies (at least Scrum)
Language Skills:
- Fluent in English (Mandatory)
Remote Work:
No
Employment Type:
Full-time