We are seeking a skilled Data Engineer to join our dynamic team. In this role, you will be responsible for designing, building, and maintaining robust data infrastructure that supports our organization's data-driven decision-making processes. You will work with cutting-edge technologies to ensure efficient data flow and accessibility for analytical and machine learning applications.
Job Purpose
The primary purpose of this role is to develop and maintain scalable, reliable data infrastructure that enables efficient data collection, storage, processing, and accessibility for analysis and machine learning applications, supporting data-driven decision making across the organization.
Job Duties and Responsibilities
- Data Infrastructure Design
- Data Collection & Storage
- Python Programming
- SQL Development
- Scala Programming
- Apache Spark Implementation
- Kafka Integration
- Airflow Workflow Management
- AWS Cloud Platform
- GCP Cloud Platform
- Data Pipeline Development
- ETL/ELT Processes
- Data Warehousing
- Data Processing Optimization
- System Reliability Maintenance
Qualifications:
Required Qualifications
- Python Proficiency
- SQL Expertise
- Scala Knowledge
- Apache Spark Experience
- Kafka Experience
- Airflow Experience
- AWS Cloud Platform
- GCP Cloud Platform
- Data Infrastructure Design
- Data Pipeline Development
- Database Systems Understanding
- Data Warehousing Knowledge
- ETL/ELT Process Experience
- Cloud Platform Experience
- Programming Language Proficiency
Remote Work:
No
Employment Type:
Full-time