NCS is a leading technology services firm that operates across the Asia Pacific region in over 20 cities, providing consulting, digital services, technology solutions, and more. We believe in harnessing the power of technology to achieve extraordinary things, creating lasting value and impact for our communities, partners, and people. Our diverse workforce of 13,000 has delivered large-scale, mission-critical, and multi-platform projects for governments and enterprises in Singapore and the APAC region.
As a Senior Data Engineer, you are responsible for designing, building, and maintaining the data infrastructure that enables efficient data collection, transformation, storage, and access across systems. The role ensures that data pipelines are scalable, secure, and optimized to support analytics, machine learning, and business intelligence use cases. The Senior Data Engineer collaborates closely with data scientists, analysts, software engineers, and architects to deliver reliable, high-quality data solutions.
What you will do:
- Design and implement scalable data pipelines to ingest, transform, and deliver data across platforms
- Build and maintain data lakes, data warehouses, and analytical data stores
- Ensure data quality, security, and consistency through validation, monitoring, and automated checks
- Collaborate with data scientists and analysts to make data accessible and analysis-ready
- Optimize data workflows for performance, reliability, and cost-efficiency
- Implement data governance practices such as metadata management, cataloging, and lineage tracking
- Troubleshoot data pipeline issues and perform root cause analysis
- Document data flows, data definitions, and engineering processes
The ideal candidate should possess:
- Proficiency in data pipeline development using Python, SQL, or Scala
- Experience with ETL/ELT frameworks and orchestration tools (e.g. Apache Airflow, AWS Glue, dbt)
- Familiarity with data warehousing and big data technologies (e.g. Snowflake, Redshift, BigQuery, Databricks, Hadoop, Spark)
- A strong understanding of data modelling, partitioning, indexing, and schema design
- Knowledge of cloud platforms (AWS, Azure, GCP) and their native data services
- Experience with batch processing and real-time data streaming (e.g. Kafka, Kinesis)
- Proficiency in data governance, data quality, and data lineage practices
- Familiarity with DevOps, CI/CD, version control, and infrastructure as code (e.g. Terraform)
We are driven by our AEIOU beliefs: Adventure, Excellence, Integrity, Ownership, and Unity. We seek individuals who embody these values in both their professional and personal lives. We are committed to our Impact: Valuing our clients, Growing our people, and Creating our future.
Together we make the extraordinary happen.
Learn more about us at and visit our LinkedIn career site.
Required Experience:
Senior IC