Role Overview
We are seeking a seasoned Data Consultant to serve as the operational backbone of our Data Quality (DQ) practice. In this role you will take full ownership of our Data Quality ecosystem, managing both the platform's health (Day 2 Operations) and the integrity of the data residing within it.
You will lead the engineering and operations for a major Customer 360 Data Lakehouse, sitting on the client side as the bridge between the Singapore business and our global delivery team. You are not just overseeing delivery; you are the Lead Data Architect responsible for the solution's integrity, stability, and evolution.
We need a leader who can walk into a boardroom to discuss Data Strategy with Group Heads, then immediately sit with the engineering team to debug complex Kubernetes/Airflow ingestion issues or optimize Snowflake clustering keys.
Key Responsibilities
1) Platform Technical Architecture (The Design)
You are the Technical Design Authority for the account, ensuring the solution is architected for scale, performance, and future readiness.
- Solution Architecture: Own the roadmap for the C360 Data Lakehouse. You will drive the evolution of the data model ensuring it supports advanced use cases like GenAI (Cortex) and CDC streaming.
- Snowflake Engineering: Act as the Lead Snowflake Architect. You will personally oversee the optimization of micro-partitions, clustering keys, and virtual warehouse sizing to ensure high performance.
- Ingestion Modernization: Lead the architectural improvements of the Data Pipes ingestion layer. You will guide the engineering team on re-architecting DAGs in Apache Airflow and optimizing Kubernetes resource allocation.
- Technical Quality: Enforce a Zero-Defect mentality. You are responsible for the architectural review of all code before it reaches production.
2) Snowflake Platform Operations & Management (The Run)
You serve as the ultimate Technical Authority for this account, ensuring the solution runs at maximum scale and optimal performance, with long-term viability.
- DQ Framework Administration: Manage the execution of Data Quality checks within Snowflake (using Tasks, Streams, or dbt). Ensure validation rules run on schedule and use compute resources efficiently.
- Day 2 Operations: Oversee the health of the Snowflake environment. This includes managing Virtual Warehouses (scaling/resizing), monitoring credit consumption, managing Role-Based Access Control (RBAC), and optimizing storage costs.
- Observability & Monitoring: Implement and maintain monitoring dashboards (using Snowsight, Resource Monitors, or external tools like Datadog) to track pipeline latency, failed tasks, and warehouse load.
- Performance Tuning: Analyze Query Profiles to identify slow-running DQ scans and optimize them via clustering keys or materialized views.
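As an illustrative sketch of one piece of this operational logic, the credit-monitoring responsibility above boils down to comparing each warehouse's consumption against a budget, which is what a Snowflake Resource Monitor enforces natively. The warehouse names and budget figures below are hypothetical:

```python
def over_budget(usage, budgets):
    """Return the warehouses whose month-to-date credit consumption
    exceeds their allotted budget. A warehouse with no configured
    budget is treated as unlimited. This mirrors, in pure Python,
    the threshold check a Snowflake Resource Monitor performs."""
    return [wh for wh, used in usage.items()
            if used > budgets.get(wh, float("inf"))]

# Hypothetical month-to-date credit usage per virtual warehouse
usage = {"WH_ETL": 312.5, "WH_BI": 80.0, "WH_DQ": 45.2}
budgets = {"WH_ETL": 300, "WH_BI": 100, "WH_DQ": 50}

print(over_budget(usage, budgets))  # → ['WH_ETL']
```

In practice the usage figures would come from `SNOWFLAKE.ACCOUNT_USAGE` views rather than a hard-coded dict, and the flagged warehouses would feed an alerting dashboard.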
3) Incident Investigation & Data Triage (The Fix)
You act as the operations lead for the Data Lakehouse. When data integrity is compromised, you lead the rapid-response triage and remediation effort, using advanced diagnostic tools to restore trust and stability to the platform.
- First-Responder for Data Issues: Act as the first line of defense when DQ thresholds are breached. Triage alerts regarding schema drift, null values, duplication, and freshness violations within Snowflake tables.
- Root Cause Analysis (RCA): Deep dive into data issues using Snowflake and the Data Quality Framework to trace when and how data corruption occurred.
- Remediation: Apply fixes to SQL pipelines or coordinate with upstream teams. Use Zero-Copy Cloning to create safe sandbox environments for testing data fixes before applying them to production.
- False Positive Reduction: Continuously tune DQ thresholds (e.g. standard deviation variances) to minimize alert fatigue and ensure high-fidelity notifications.
- Snowflake Cortex: Work with Snowflake Cortex AI-enabled search and agents to help diagnose issues.
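The false-positive-reduction work described above is, at its core, statistical threshold tuning: a DQ alert should fire only when a metric drifts well outside its historical variance. A minimal pure-Python sketch of that logic, using a hypothetical table's daily row counts:

```python
import statistics

def is_anomalous(history, today, k=3.0):
    """Flag a metric value that deviates more than k standard
    deviations from its recent history. Raising k trades alert
    sensitivity for fewer false positives (less alert fatigue)."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history)
    if stdev == 0:
        return today != mean
    return abs(today - mean) / stdev > k

# Hypothetical daily row counts for a Snowflake table
counts = [10_120, 10_045, 9_980, 10_210, 10_130, 10_060, 10_175]

print(is_anomalous(counts, 10_190))  # → False (normal variance)
print(is_anomalous(counts, 4_300))   # → True (likely a partial load)
```

In a real deployment the history would be read from a metrics table and the threshold `k` tuned per check, which is exactly the "standard deviation variances" tuning the bullet refers to.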
4) Process Engineering & Automation
You champion efficiency by transforming manual operational tasks into automated, scalable processes. You are responsible for closing the loop between operations and architecture, ensuring that every incident drives systemic improvement and reduces future toil.
- Automate Operational Toil: Use Snowpark (Python) or Stored Procedures to automate repetitive maintenance tasks (e.g. archiving old tables, granting permissions, generating DQ reports).
- Incident Management Frameworks: Define and execute the playbook for Data Incidents. Ensure all data outages are logged with clear severity levels and post-mortem documentation.
- Feedback Loops: Create a feedback loop between the Operations team and Data Architects to ensure that frequent data issues are addressed at the architectural level.
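One of the automatable tasks named above, generating DQ reports, can be sketched as a small aggregation over per-check outcomes. The check names and statuses below are hypothetical; in practice a scheduled Snowpark procedure would read these rows from a results table and publish the summary:

```python
from collections import Counter

def dq_summary(results):
    """Aggregate per-check outcomes into the kind of summary a
    scheduled DQ-report job would publish to stakeholders."""
    by_status = Counter(r["status"] for r in results)
    failed = [r["check"] for r in results if r["status"] == "FAIL"]
    return {
        "passed": by_status["PASS"],
        "failed": by_status["FAIL"],
        "failing_checks": failed,
    }

results = [
    {"check": "customer_email_not_null", "status": "PASS"},
    {"check": "order_amount_non_negative", "status": "FAIL"},
    {"check": "customer_id_unique", "status": "PASS"},
]

print(dq_summary(results))
# → {'passed': 2, 'failed': 1, 'failing_checks': ['order_amount_non_negative']}
```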
5) Stakeholder & Client Engagement
You serve as the bridge between technical complexity and business value. You cultivate trust by providing transparent visibility into data health and empowering business teams to understand, govern, and utilize their data assets effectively.
- Trust & Transparency: Publish regular Data Quality Reports (via Snowsight Dashboards or BI tools) to business stakeholders.
- Advisory: Consult with Data Stewards to translate functional business rules into technical SQL checks or Snowflake Data Metric Functions.
- Training: Mentor internal teams on Data Quality Principles and Practices.
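The advisory work above, turning a Data Steward's functional rule into an executable check, can be illustrated with a deliberately simple, hypothetical rule ("every active customer must have a contact email and a signup date no later than today"). The pure-Python predicate below is the analogue of the SQL assertion or Snowflake Data Metric Function you would actually deploy:

```python
import datetime

def check_active_customer(row, today=datetime.date(2024, 1, 1)):
    """True if the row satisfies the steward's rule; rows the rule
    does not apply to (non-active customers) pass by definition."""
    if row["status"] != "ACTIVE":
        return True
    return bool(row["email"]) and row["signup_date"] <= today

rows = [
    {"status": "ACTIVE", "email": "a@example.com",
     "signup_date": datetime.date(2023, 5, 2)},
    {"status": "ACTIVE", "email": "",
     "signup_date": datetime.date(2023, 7, 9)},   # missing email
    {"status": "CLOSED", "email": None,
     "signup_date": datetime.date(2022, 1, 1)},   # rule not applicable
]

violations = [r for r in rows if not check_active_customer(r)]
print(len(violations))  # → 1
```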
Qualifications:
Required Skills & Experience
1) Architecture / Consulting Experience.
- Total Professional Experience: 12 years in Data Engineering, Software Development, or Solution Architecture.
- Specialist Experience: 8 years specifically designing and managing large-scale Data Lake or Data Warehousing solutions.
- Platform Expertise: 5 years of deep hands-on experience with Snowflake and AWS, including proven experience with complex migrations or rescue projects.
- Consulting Leadership: 4 years in a client-facing technical advisory role, leading squads or pods to deliver outcome-based solutions.
- Executive Presence: You can explain Kubernetes resource fragmentation to a CIO in terms of Business Risk and Reliability.
- Ownership: You view yourself as the owner of the outcome. You are comfortable navigating ambiguity and turning distressed engagements into referenceable success stories.
2) Core Technical Skills (Snowflake & DataOps)
- Advanced Snowflake SQL: Expert-level mastery of Snowflake SQL, including Window Functions, UDFs, and Stored Procedures.
- Apache Airflow: Strong experience with Apache Airflow, including orchestrating Snowflake Tasks and dbt jobs via Kubernetes pods.
- Snowpark: Experience building scalable pipelines and implementing custom functions, especially for advanced data quality checks and transformations in Snowflake.
- Data Quality Tools: Experience running DQ frameworks on Snowflake (e.g. dbt tests, Great Expectations, or Snowflake native data quality functions).
- Strong Python Experience: Specifically interacting with Snowflake via Snowpark or the Python Connector.
3) Delivery & Soft Skills
- Investigative Mindset: You enjoy the detective work of tracing a data error back to its source.
- Communication: Ability to explain technical data failures to non-technical business stakeholders clearly and concisely.
Preferred Certifications
- Snowflake SnowPro Core Certification (Highly Desirable).
- Snowflake SnowPro Advanced: Data Engineer.
- AWS Cloud Certifications.
Additional Information:
BENEFITS & PERKS FOR WORKING AT OLLION
Our employees multiply their potential because they have opportunities to: Create a lasting Impact, Learn and Grow professionally & personally, Experience great Culture, and Be your Whole Self!
Beyond an amazing collaborative work environment, great people, and inspiring, innovative work, we have some great benefits and perks:
- Benchmarked competitive in-market total rewards package including (but not limited to): base salary & short-term incentive for all employees
- Fully remote-first, small but global organization; learning wherever, whenever frees our people from a rigid view of learning and growth
- Retirement planning (e.g. CPF, EPF, company-matched 401(k))
- Globally, we build benefit plans that offer choices for whatever stage in life our employees are in and allow for flexibility as life happens. Employees have access to a fully comprehensive benefits package and can choose the medical, dental, and vision insurance plan that best fits their needs. In addition to great healthcare coverage, we also offer all employees mental health resources and additional wellness programs.
- Generous time off and leave allowances
- And more!
Ollion is an equal opportunity employer. We celebrate diversity and we are committed to creating an inclusive environment for all employees. Ollion does not discriminate in employment on the basis of race, color, religion, sex (including pregnancy and gender identity), national origin, political affiliation, sexual orientation, marital status, disability, genetic information, age, membership in an employee organization, parental status, military service, or other non-merit factor.
Remote Work:
No
Employment Type:
Full-time