Full Stack Developer

Next-Link


Job Location: Budapest, Hungary

Monthly Salary: Not Disclosed
Experience Required: 6 years
Posted on: 12 hours ago
Vacancies: 1

Job Summary

This is a remote position.

Role Overview


Job Title: Full Stack Developer
Level: D Support
Department: Data Unit, Digital Transformation Department (DTD)
Location: IFRC Global Services Centre, Budapest, Hungary
Reporting To: Manager, Data Unit



Organizational Context

The International Federation of Red Cross and Red Crescent Societies (IFRC) is the world's largest humanitarian network, comprising 191 National Societies. IFRC delivers critical humanitarian support globally, focusing on disaster response, health emergencies, and community resilience.

The Digital Transformation Department (DTD) drives digital innovation and data transformation across the organization, enabling better decision-making through modern data platforms and analytics.


Role Purpose

The Full Stack Developer is responsible for delivering end-to-end data solutions, including data engineering, data modeling, and analytics/reporting.

The role focuses on developing scalable data products on Microsoft Fabric and Azure Synapse, supporting IFRC's operations across multiple domains such as Finance, Logistics, HR, and Project Management.


Key Responsibilities

1. Data Analytics & Reporting

  • Analyze business processes, data models, and reporting requirements

  • Collaborate with business stakeholders to gather and refine requirements

  • Design and develop Power BI dashboards and reports

  • Ensure data accuracy, completeness, and consistency in reporting

  • Support UAT cycles and resolve identified issues

  • Publish and maintain reports in production environments


2. Data Engineering & Development

  • Design and implement data models (Data Vault and Dimensional modeling)

  • Develop data pipelines using SQL and PySpark

  • Build and maintain data solutions on Microsoft Fabric and Azure Synapse

  • Implement data quality validation and monitoring processes

  • Optimize performance of queries and data pipelines

  • Ensure compliance with data security and privacy standards

  • Document data flows, architecture, and engineering processes


3. Platform & Technical Support

  • Work with data lakes, lakehouse architectures, and big data platforms

  • Support ETL/ELT processes and data pipeline orchestration

  • Troubleshoot performance issues and optimize workflows

  • Ensure scalable and efficient data processing solutions


4. Collaboration & Stakeholder Engagement

  • Work closely with architects, data engineers, analysts, and business teams

  • Support cross-functional global projects

  • Provide guidance on data best practices and tools

  • Promote adoption of self-service analytics across teams


5. General Responsibilities

  • Support the Data Unit Manager with progress tracking and reporting

  • Contribute to continuous improvement of data processes and tools

  • Build partnerships across global teams and regions


Qualifications

Education

  • Bachelor's or Master's degree in Computer Science, Information Systems, or a related field


Experience

  • 5 years of experience in data engineering and development

  • 3 years of hands-on experience with Microsoft Fabric / Azure Synapse and Power BI

  • Strong experience in data modeling (Data Vault & Dimensional modeling)

  • Experience with data lakes, lakehouse architectures, and big data platforms

  • Proficiency in Python, SQL, or Java

  • Hands-on experience with Apache Spark

  • Experience with Azure Data Factory and Microsoft data stack

  • Familiarity with CI/CD pipelines (Azure DevOps preferred)

  • Experience working with large-scale data pipelines and datasets

  • Understanding of data governance, security, and compliance (GDPR, etc.)

  • Experience in the humanitarian/non-profit sector (preferred)


Technical Skills

  • Microsoft Fabric, Azure Synapse, Azure Data Factory

  • Power BI (including Paginated/SSRS reports)

  • Data modeling (Data Vault, Dimensional modeling)

  • SQL, Python, PySpark

  • Big data technologies (Spark, Hadoop)

  • Data lakes and lakehouse architectures

  • ETL/ELT development

  • Cloud platforms (Azure preferred; AWS/GCP knowledge is a plus)

  • Data security and governance fundamentals


Core Competencies

  • Strong analytical and problem-solving skills

  • Effective communication and collaboration

  • Ability to work in cross-functional and global teams

  • Attention to detail and quality focus

  • Proactive and solution-oriented mindset

  • Adaptability in dynamic environments


Languages

  • Fluent English (mandatory)

  • Additional language (French, Spanish, or Arabic) preferred


Competencies Framework

Values:

  • Integrity

  • Professionalism

  • Accountability

  • Respect for diversity

Core Competencies:

  • Communication

  • Teamwork and collaboration

  • Decision-making and judgment

  • Creativity and innovation

  • Building trust

Functional Competencies:

  • Strategic thinking

  • Relationship building

  • Learning agility

  • Execution excellence




Required Skills:

  • 5 years of progressively responsible experience in data engineering with a focus on data modelling, and 3 years in data engineering (Microsoft Fabric or Azure Synapse) and report development (Power BI)

  • Proven track record as a Data Engineer or in a similar role, and in Power BI development, including Paginated or SSRS reports

  • Core expertise in the Microsoft data stack, in particular proficiency with Fabric, Azure Data Factory, and Azure Synapse Analytics

  • Experience in Data Lake and Data Lakehouse implementation (e.g. Microsoft Fabric, Azure Synapse, Databricks, Snowflake, Microsoft SQL Server, Apache Spark/Hadoop, or other similar big data or SQL databases)

  • Experience in Data Vault and dimensional data modeling techniques

  • Proficiency in programming languages such as Python, SQL, or Java

  • Proficiency in the Apache Spark framework

  • Knowledge of Finance/Accounting logic in ERP, ideally with D365 F&O; an understanding of humanitarian accounting logic is a benefit (appeals, pledges, projects, etc.)

  • Experience in data governance, architecture, and handling large datasets and data pipelines

  • Strong knowledge of Azure Cloud architecture and networking principles

  • Familiarity with CI/CD pipelines for data workflows (e.g. using Azure DevOps)

  • Proficiency in cloud platforms and technologies such as AWS, Azure, or Google Cloud

  • Experience with big data technologies such as Hadoop, Spark, and distributed storage systems

  • Familiarity with data governance, data security, and data privacy regulations (e.g. GDPR, CCPA)

  • Experience within the RC/RC Movement and/or international humanitarian or development organisations

