Job Description
Role Overview
We are looking for a skilled and driven Big Data Developer to join our client's Global Data Platform team. In this role, you will design, build, and optimize scalable data platforms that support enterprise analytics, data science, and business intelligence initiatives. You will work closely with cross-functional stakeholders to translate business needs into robust data solutions while ensuring performance, scalability, and reliability.
Key Responsibilities:
Data Engineering & Platform Development
- Design, develop, and maintain scalable big data pipelines and platforms.
- Build and optimize data ingestion, transformation, and storage solutions across distributed systems.
- Develop and manage data workflows using modern big data technologies.
Requirements Translation & Solution Design
- Collaborate with business and technical stakeholders to gather requirements.
- Translate business requirements into Functional Specification Documents (FSD) and technical solutions.
- Ensure alignment between data architecture and business objectives.
Engineering Excellence & Best Practices
- Implement and promote best practices such as Test-Driven Development (TDD) and Continuous Integration/Continuous Deployment (CI/CD).
- Enforce coding standards, data governance, security, and performance optimization.
- Conduct code reviews and contribute to continuous improvement initiatives.
Operational Stability & Delivery
- Monitor and maintain production systems to ensure high availability and reliability.
- Proactively identify risks, issues, and bottlenecks, and drive resolution with stakeholders.
- Provide timely updates on project progress and delivery milestones.
Collaboration & Stakeholder Management
- Work closely with data scientists, analysts, architects, and IT teams.
- Communicate effectively with both technical and non-technical stakeholders.
- Support the organization's transition toward a data-driven culture.
Requirements:
Technical Skills
- Bachelor's degree in Computer Science, Information Technology, or a related discipline.
- Minimum 5 years of experience in big data engineering or data platform development.
- Strong hands-on experience with big data technologies such as:
- HDFS, Trino, AWS S3 (or equivalent object storage)
- Proficiency in Python for data engineering and scripting.
- Solid experience with SQL (MSSQL, Oracle, PostgreSQL).
- Experience with CI/CD pipelines, code quality, and vulnerability management.
- Familiarity with Linux/Unix environments and shell scripting.
- Strong understanding of:
- Data architecture and data modeling
- Distributed systems and data processing frameworks
- Experience with development tools and IDEs.
Professional Skills
- Strong understanding of Software Development Life Cycle (SDLC) and Agile methodologies.
- Experience with version control systems (e.g., Git).
- Excellent analytical and problem-solving abilities.
- Ability to manage complex technical challenges and deliver scalable solutions.
Preferred / Nice-to-Have
- Experience with cloud platforms (e.g., AWS, Azure, GCP).
- Exposure to data governance, data quality frameworks, and metadata management.
- Experience working in global or regional teams.
- Familiarity with data visualization or BI tools.
Required Experience:
IC