Finance and Supply Chain Senior Data Engineer

WVI

Job Location: Pasig City - Philippines

Monthly Salary: Not Disclosed
Posted on: 10 hours ago
Vacancies: 1 Vacancy

Job Summary

With 75 years of experience, our focus is on helping the most vulnerable children overcome poverty and experience fullness of life. Inspired by our Christian faith, we help children of all backgrounds, even in the most dangerous places.

Come join our 33,000 staff working in nearly 100 countries and share the joy of transforming vulnerable children's life stories!

Key Responsibilities:

PURPOSE OF POSITION:
This position is responsible for the design, development, and operationalization of data ingestion, transformation, and loading processes across multiple data sources within the Finance Data Warehouse (DW) and Data Lake platforms. The position requires a strong understanding of business and analytical requirements, along with the capability to design and develop Finance data warehouse data models, ETL/ELT pipelines, and data governance processes that ensure the data quality, security, and scalability of data solutions.

QUALIFICATIONS:

Required Education, Training, License, Registration, and/or Certification

  • Candidate must possess a Bachelor's degree in IT, Computer Science, or a related course/field.

Required Professional Experience

  • At least 1 year of professional experience as a Data Engineer working with Microsoft Fabric, specifically in data lake and data warehouse implementations.

  • 5-7 years of overall experience in ETL/ELT development, including hands-on design, development, and support of data warehouse/data lake implementations.

  • Strong understanding of data warehouse concepts and data modelling, including dimensional modelling, medallion architecture (Bronze-Silver-Gold), and end-to-end (ETL/ELT) pipeline design (see the sketch after this list).

  • Proven hands-on experience in building and maintaining scalable data pipelines (ETL/ELT) using Apache Spark or cloud-native services (e.g., Azure Data Factory).

  • Proficiency in Python, Apache Spark, SQL, and cloud-native data integration tools, with solid working knowledge of Microsoft Fabric components (Lakehouse, Warehouse, Dataflows Gen2, Pipelines, Notebooks).

  • Experience in providing technical leadership, including mentoring junior developers or leading a small development team in delivering data solutions.

  • Demonstrated ability to collaborate effectively with cross-functional teams, including data scientists, analysts, and software engineers, to deliver end-to-end data solutions.

  • Experience implementing data quality, validation, and governance practices to ensure data integrity and compliance with organizational, regulatory, and security standards. Ability to apply AI skills.
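
As an illustration of the medallion layering and Microsoft Fabric Lakehouse work referenced above, here is a minimal PySpark sketch of a Bronze-to-Silver transform; the table names, columns, and schema layout are hypothetical assumptions, not details from this posting.

```python
# Minimal Bronze -> Silver sketch on a Fabric-style Lakehouse with Delta
# tables. Table names, columns, and the "bronze"/"silver" schemas are
# hypothetical, not taken from the posting.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("finance-bronze-to-silver").getOrCreate()

# Raw journal entries landed as-is in the Bronze layer (hypothetical table).
bronze = spark.read.table("bronze.finance_journal_entries")

# Silver layer: typed, de-duplicated, lightly conformed records.
silver = (
    bronze
    .withColumn("posting_date", F.to_date("posting_date", "yyyy-MM-dd"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .dropDuplicates(["journal_id", "line_number"])
    .filter(F.col("amount").isNotNull())
)

# Full overwrite for simplicity; a production load would be incremental.
silver.write.format("delta").mode("overwrite").saveAsTable("silver.finance_journal_entries")
```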

KEY RESPONSIBILITIES:

% of time

20% - Data Architecture and Design

Design and implement scalable ETL/ELT pipelines for the Finance Data Warehouse, Data Lakes, and system integration processes. Design and optimize Finance DW data models (e.g., star schema, snowflake, and semantic layer) aligned with business and analytical needs and cost optimization. Collaborate with GFS stakeholders, the IT Cloud Architect, and Security teams to ensure alignment with enterprise and GFS architecture and security policies.
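
As a rough illustration of the star-schema modelling described above, the following PySpark sketch derives one dimension and one fact table from a conformed Silver table; all table and column names are illustrative assumptions.

```python
# Illustrative derivation of a small star schema (one dimension, one fact)
# from a conformed Silver table. All names are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("finance-gold-star-schema").getOrCreate()

silver = spark.read.table("silver.finance_journal_entries")

# Account dimension: one row per account with a surrogate key.
dim_account = (
    silver.select("account_code", "account_name").distinct()
    .withColumn("account_key", F.monotonically_increasing_id())
)

# Fact table at journal-line grain, keyed to the dimension.
fact_journal = (
    silver.join(dim_account, ["account_code"], "left")
    .select("account_key", "posting_date", "amount", "currency_code")
)

dim_account.write.format("delta").mode("overwrite").saveAsTable("gold.dim_account")
fact_journal.write.format("delta").mode("overwrite").saveAsTable("gold.fact_journal")
```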

35% - Design Develop and Deploy ETL/ELT Pipelines

Work closely with Finance Business Users/SMEs and Business and Data Analysts to gather and translate business requirements into technical specifications and data solutions. Design, develop, and deploy medium to complex ETL/ELT pipelines for data ingestion, transformation, and loading. Implement automation for data workflows, job orchestration, and dependency management for both batch and real-time data workflows. Design and perform tuning, partitioning, and archiving strategies. Participate in sprint ceremonies, release planning, development effort assessment, and documentation creation and reviews.
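
One possible shape for the batch ingestion work described above is sketched below in PySpark, using a simple watermark-driven incremental load; the control table, column names, and partitioning choice are assumptions for illustration only.

```python
# Watermark-driven incremental batch load into a fact table.
# The control table, columns, and partitioning choice are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("finance-incremental-load").getOrCreate()

# Last successfully loaded posting date, tracked in a small control table.
last_loaded = (
    spark.read.table("control.load_watermarks")
    .filter(F.col("table_name") == "fact_journal")
    .agg(F.max("watermark_value"))
    .collect()[0][0]
)

# Only rows newer than the watermark are picked up in this run.
incremental = (
    spark.read.table("silver.finance_journal_entries")
    .filter(F.col("posting_date") > F.lit(last_loaded))
)

# Append new rows; assumes the target table was created partitioned by
# posting_date so downstream queries can prune partitions.
incremental.write.format("delta").mode("append").saveAsTable("gold.fact_journal")
```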

15% - Platform Operations and Support

Provide Level 2/3 support for Finance DW pipelines, which entails performing root cause analysis and code fixes/enhancements across data pipelines, storage layers, and downstream integration/analytics requirements.

Perform tuning of ETL/ELT workloads and database/query performance.
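
The tuning work might, for example, include compacting small Delta files and checking partition pruning on common queries, as in the hedged sketch below; the OPTIMIZE command assumes a Delta-based platform such as Fabric, and the table names are illustrative.

```python
# Routine tuning sketch: compact small files and inspect a query plan.
# OPTIMIZE assumes a Delta-based platform (e.g. Fabric); names are illustrative.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("finance-dw-tuning").getOrCreate()

# Compact the small files produced by frequent incremental appends.
spark.sql("OPTIMIZE gold.fact_journal")

# Check the physical plan of a typical query before releasing a change.
spark.sql("""
    SELECT account_key, SUM(amount) AS total_amount
    FROM gold.fact_journal
    WHERE posting_date >= '2024-01-01'
    GROUP BY account_key
""").explain()
```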

10% - Data Quality and Data Governance

Collaborate with Finance SMEs, Data Stewards, and Data Quality Analysts to define and implement business rules, data validation, data integration rules, and data quality/anomaly detection. Develop data validation and cleansing to ensure data accuracy and consistency. Ensure compliance with data governance standards and maintain metadata, data lineage, and documentation.
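
One way the validation and anomaly-detection rules described above could be expressed is as boolean checks with a quarantine table, as in the following illustrative PySpark sketch; the rules and table names are assumptions, not requirements from this posting.

```python
# Rule-based validation with a quarantine table for failing rows.
# The rules and table names are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("finance-data-quality").getOrCreate()

df = spark.read.table("silver.finance_journal_entries")

# Business rules expressed as boolean flag columns (hypothetical examples).
checks = {
    "missing_account": F.col("account_code").isNull(),
    "zero_amount": F.col("amount") == 0,
    "future_posting": F.col("posting_date") > F.current_date(),
}

flagged = df
for name, condition in checks.items():
    flagged = flagged.withColumn(name, condition)

fail_expr = " OR ".join(checks.keys())

# Quarantine failing rows for review by Data Stewards; pass the rest on.
flagged.filter(fail_expr).write.format("delta").mode("append").saveAsTable("quality.quarantine_journal")
clean = flagged.filter(f"NOT ({fail_expr})").drop(*checks.keys())
```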

10% - Leadership & Mentoring

Mentors Data Engineers and leads code reviews and quality assurance to ensure alignment with coding standards, data architecture principles, and best practices in delivering secure, efficient, and high-performing data solutions. Leads the knowledge base and shares best practices in Data Engineering.

10% - Training and Others

Complete the technical and soft skills learning path and conduct knowledge sharing sessions. Attend and participate in meetings, conferences, workshops, chapel services, devotions, etc. Perform other additional tasks that may be assigned.

Applicant Types Accepted:

Local Applicants Only

Required Experience:

Senior IC

About Company

World Vision International is a Christian relief and development organisation dedicated to helping the most vulnerable children overcome poverty and experience fullness of life.
