Job Title:
Big Data Platform Developer
Location:
Qatar
Industry:
Technology / Data Engineering / Digital Transformation
Employment Type:
Full-Time
Job Summary:
We are seeking a skilled and motivated Big Data Platform Developer with 4–6 years of experience in designing, developing, and maintaining scalable data platforms and distributed processing systems. The ideal candidate should have strong expertise in big data technologies, cloud environments, and data pipeline development, with the ability to work on high-volume, high-performance data solutions.
The candidate will be responsible for building robust data architectures, integrating multiple data sources, optimizing processing frameworks, and supporting reporting and analytics requirements across enterprise systems.
Key Responsibilities:
- Design, develop, and maintain scalable big data platforms and distributed data processing systems.
- Build and optimize ETL/ELT pipelines for structured and unstructured data.
- Develop data ingestion and streaming solutions using technologies such as Hadoop, Spark, and Kafka.
- Work with SQL and NoSQL databases for data storage, retrieval, and analytics.
- Develop backend services and APIs using Python and Java.
- Integrate data from multiple formats, including JSON, CSV, and XML.
- Design and maintain REST APIs and microservices-based architectures.
- Monitor and optimize data workflows, system performance, and platform scalability.
- Collaborate with data analysts, data scientists, and business stakeholders to support reporting and dashboard requirements.
- Implement automation pipelines for deployment, monitoring, and operational efficiency.
- Ensure data quality, integrity, security, and governance standards are maintained.
- Support cloud-based infrastructure and distributed computing environments.
- Troubleshoot and resolve platform, integration, and performance-related issues.
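To illustrate the kind of ETL pipeline work described above, here is a minimal sketch in plain Python (standard library only). The field names and target schema are hypothetical, chosen only to show the pattern of normalizing JSON and CSV sources into one common structure:

```python
import csv
import io
import json

def normalize(record: dict) -> dict:
    """Map a raw record onto a common (hypothetical) target schema."""
    return {
        "id": str(record["id"]),          # unify IDs as strings
        "name": record["name"].strip(),   # trim stray whitespace
        "amount": float(record["amount"]) # unify numerics as floats
    }

def extract_json(raw: str) -> list:
    """Parse a JSON array of objects into normalized records."""
    return [normalize(obj) for obj in json.loads(raw)]

def extract_csv(raw: str) -> list:
    """Parse CSV text (with a header row) into normalized records."""
    return [normalize(row) for row in csv.DictReader(io.StringIO(raw))]

# Two heterogeneous sources, one unified output
json_src = '[{"id": 1, "name": " Alice ", "amount": "10.5"}]'
csv_src = "id,name,amount\n2,Bob,20\n"
records = extract_json(json_src) + extract_csv(csv_src)
print(records)
```

In a production platform the same normalize-per-source pattern would typically run inside a distributed framework such as Spark rather than a single process; the sketch only shows the transformation logic itself.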
Required Qualifications:
- Bachelor’s degree in Computer Science, Software Engineering, Mathematics, Physics, Data Science, or a related field
- 4–6 years of hands-on experience in big data platform development or data engineering.
Required Technical Skills:
Programming Languages:
- Python
- Java
Big Data Technologies:
- Hadoop
- Apache Spark
- Apache Kafka
Databases:
- SQL Databases
- NoSQL Databases
Data Formats:
- JSON
- CSV
- XML
Other Technical Skills:
- REST APIs
- Microservices Architecture
- Cloud Platforms
- Distributed Systems
- Automation Pipelines
- Reporting & Dashboard Support
Preferred Skills:
- Experience with cloud-native big data platforms.
- Knowledge of CI/CD and DevOps practices.
- Familiarity with containerization technologies such as Docker or Kubernetes.
- Exposure to data governance and security standards.
- Strong analytical and problem-solving abilities.
Documents / Evaluation Criteria:
Candidates should be prepared to provide:
- Updated CV/Resume
- Code samples or project evidence
- Examples of data platform implementations
- ETL/Data pipeline development experience
- Reporting/dashboard project exposure
Soft Skills:
- Strong communication and collaboration skills
- Ability to work in fast-paced environments
- Problem-solving and analytical mindset
- Attention to detail and quality-focused approach
- Ability to work independently and within cross-functional teams