Data Platform DevOps Engineer
Job Summary
This is a remote position.
Role Overview
Organizational Context
The International Federation of Red Cross and Red Crescent Societies (IFRC) is the world's largest humanitarian organization, operating through a network of 191 National Societies. IFRC delivers humanitarian assistance across disasters, health emergencies, and crises globally.
The Digital Transformation Department (DTD) leads the organization's digital strategy, enabling innovation, digital services, and data-driven decision-making across the IFRC network.
The AI & Data Unit is responsible for:
Data platform management
Data governance and strategy
AI enablement and analytics
Data product lifecycle management
Role Purpose
The Data Platform DevOps Engineer is responsible for designing, implementing, and managing IFRC's enterprise data platform using Microsoft Fabric and the Azure ecosystem.
This role combines DevOps, platform engineering, and cloud infrastructure expertise to ensure a secure, scalable, and high-performing data platform supporting global humanitarian operations.
Key Responsibilities
1. Platform Engineering & Architecture
Design, build, and maintain Microsoft Fabric platform components (OneLake, Lakehouse, Data Warehouse, Data Factory, Power BI, Real-Time Intelligence)
Architect scalable multi-region data platform solutions
Develop Infrastructure as Code (IaC) using Terraform, ARM templates, or similar tools
Optimize OneLake storage structures, shortcuts, mirroring, and data lake performance
Support workloads across data engineering, analytics, and business intelligence
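To illustrate the Infrastructure-as-Code parameterization this area involves, here is a minimal sketch of environment-aware resource naming of the kind typically fed into Terraform or ARM templates. The naming convention, environment list, and example values are hypothetical, not an IFRC standard.

```python
# Minimal sketch of environment-aware resource naming for IaC templates.
# Convention and environments are illustrative assumptions.
ENVIRONMENTS = ("dev", "test", "prod")

def resource_name(resource_type: str, workload: str, env: str, region: str) -> str:
    """Build a deterministic resource name for use as an IaC variable."""
    if env not in ENVIRONMENTS:
        raise ValueError(f"unknown environment: {env}")
    # e.g. "st-fabricdata-prod-weu" for a storage account in West Europe
    return f"{resource_type}-{workload}-{env}-{region}"

print(resource_name("st", "fabricdata", "prod", "weu"))  # st-fabricdata-prod-weu
```

Deterministic names like this keep Dev/Test/Prod deployments reproducible and make drift between environments easy to spot.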
2. CI/CD & Deployment Automation
Build and manage CI/CD pipelines using Azure DevOps and Fabric deployment pipelines
Implement automated deployment strategies using Fabric REST APIs and Git integration
Define branching strategies and environment workflows (Dev, Test, Prod)
Automate provisioning, configuration, and deployment of platform components
Manage environment-specific configurations and deployment rules
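The environment-specific configuration and deployment rules mentioned above can be sketched as a base configuration merged with per-environment overrides. The keys and values below are hypothetical examples, not actual Fabric deployment-pipeline settings.

```python
# Sketch of environment-specific deployment configuration (Dev/Test/Prod).
# All keys and values are illustrative assumptions.
BASE = {"capacity_sku": "F2", "require_approval": False, "data_source": "dev-lakehouse"}

OVERRIDES = {
    "dev":  {},
    "test": {"data_source": "test-lakehouse"},
    "prod": {"capacity_sku": "F64", "require_approval": True, "data_source": "prod-lakehouse"},
}

def deployment_config(env: str) -> dict:
    """Merge base settings with environment-specific overrides."""
    return {**BASE, **OVERRIDES[env]}

cfg = deployment_config("prod")
print(cfg["capacity_sku"], cfg["require_approval"])  # F64 True
```

Keeping overrides minimal and explicit makes it obvious how each environment diverges from the baseline, which simplifies promotion from Dev through Test to Prod.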
3. Platform Operations & Maintenance
Monitor platform performance, health, and resource utilization
Implement observability frameworks using tools such as Azure Monitor, Prometheus, and Grafana
Manage capacity planning, cost optimization, and resource allocation
Perform platform upgrades, patching, and lifecycle management
Ensure disaster recovery, backup, and business continuity readiness
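As a sketch of the capacity-monitoring side of this work, the snippet below classifies average utilization against alert thresholds. The sample values and thresholds are made up; real metrics would come from Azure Monitor or a capacity metrics tool.

```python
# Sketch of a capacity-utilization check of the kind that feeds alerting.
# Thresholds and samples are illustrative assumptions.
def utilization_alert(samples: list[float], warn: float = 0.75, crit: float = 0.90) -> str:
    """Classify average capacity utilization against warning/critical thresholds."""
    avg = sum(samples) / len(samples)
    if avg >= crit:
        return "critical"
    if avg >= warn:
        return "warning"
    return "ok"

print(utilization_alert([0.80, 0.78, 0.82]))  # warning
```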
4. Security, Governance & Compliance
Implement security and governance using Microsoft Purview
Configure RBAC, row/column-level security, and access controls
Manage Microsoft Entra ID, service principals, and managed identities
Enforce data protection policies (DLP, sensitivity labels, encryption)
Ensure compliance with global standards (e.g., GDPR, HIPAA, ISO)
Implement network security (private endpoints, encryption keys, secure data sharing)
Monitor and respond to security incidents and vulnerabilities
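The RBAC responsibilities above can be illustrated with a toy role-to-permission mapping and an access check. The role and action names are illustrative only, not Fabric's built-in workspace roles.

```python
# Toy sketch of role-based access control (RBAC): map roles to allowed
# actions and check a request against them. Names are illustrative.
ROLE_PERMISSIONS = {
    "viewer":      {"read"},
    "contributor": {"read", "write"},
    "admin":       {"read", "write", "manage_access"},
}

def is_allowed(role: str, action: str) -> bool:
    """Return True if the given role is permitted to perform the action."""
    return action in ROLE_PERMISSIONS.get(role, set())

print(is_allowed("viewer", "write"))         # False
print(is_allowed("admin", "manage_access"))  # True
```

Defaulting unknown roles to an empty permission set keeps the check fail-closed, which is the usual posture for access control.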
5. Automation & Scripting
Develop scripts using Python, PowerShell, Bash, and Azure CLI
Automate pipeline orchestration, monitoring, and incident response
Build internal tools to improve developer productivity and platform usability
Enable self-service capabilities while maintaining governance
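A pattern that recurs across this kind of automation, for example when calling throttled REST endpoints, is retry with exponential backoff. The sketch below shows the pattern in isolation; attempt counts and delays are illustrative defaults, not values from any specific API.

```python
import time

# Sketch of a retry-with-exponential-backoff helper, a common pattern when
# automating REST APIs. Defaults are illustrative assumptions.
def with_retries(call, attempts: int = 4, base_delay: float = 1.0):
    """Invoke `call`, retrying on exception with exponential backoff."""
    for attempt in range(attempts):
        try:
            return call()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the last error
            time.sleep(base_delay * 2 ** attempt)  # 1s, 2s, 4s, ...
```

Wrapping transient operations this way keeps orchestration scripts resilient without hiding persistent failures, since the final error is re-raised.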
6. Collaboration & Support
Work with data engineers, analysts, and data scientists to optimize platform usage
Provide technical guidance on Fabric pipelines and best practices
Collaborate with security and compliance teams
Support incident management and root cause analysis
Promote DevOps culture and continuous improvement
7. Documentation & Knowledge Management
Maintain technical documentation, runbooks, and SOPs
Document architecture, deployment processes, and governance frameworks
Track platform inventory, dependencies, and integrations
Support audit and compliance documentation requirements
Qualifications
Education
Bachelor's or Master's degree in Computer Science, IT, Data Science, or a related field
Certifications (Preferred)
Microsoft Azure (Administrator / Data Engineer / Solutions Architect)
Microsoft Fabric and Power BI certifications
Experience
5 years in DevOps, Platform Engineering, or SRE roles
3 years of hands-on experience with Microsoft Azure and Azure DevOps
Strong experience with Microsoft Fabric or related platforms (Power BI, Synapse, Data Factory)
Expertise in Infrastructure as Code (Terraform, ARM, Ansible)
Strong scripting skills (Python, PowerShell, Bash, SQL)
Experience with CI/CD tools (Azure DevOps, GitHub Actions)
Experience with Docker and Kubernetes (AKS/EKS/GKE)
Knowledge of cloud storage (Azure Storage, AWS S3, GCP Cloud Storage) and databases
Experience in data lakes, data warehousing, and ETL/ELT pipelines
Hands-on experience with monitoring tools (Azure Monitor, Prometheus, Grafana, ELK)
Experience with Microsoft Fabric workspaces, Git integration, and deployment pipelines
Understanding of OneLake architecture and Fabric APIs
Experience with multi-region or multi-cloud environments (preferred)
Experience in humanitarian/non-profit sector (preferred)
Technical Skills
Data lakehouse architecture (medallion model)
Delta Lake, Parquet, and modern data formats
Data pipeline orchestration and automation
Cloud security (IAM, encryption, network security)
Microsoft Entra ID and identity management
Monitoring, observability, and performance optimization
Real-time and event-driven data processing
MLOps and AI integration (preferred)
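The medallion (bronze/silver/gold) layout named above can be illustrated with a small path helper that derives where a dataset lives at each refinement layer. The path convention is a common pattern for illustration only, not a OneLake requirement.

```python
# Toy illustration of the medallion (bronze/silver/gold) lakehouse layout.
# The path convention is an illustrative assumption.
LAYERS = ("bronze", "silver", "gold")

def layer_path(layer: str, domain: str, dataset: str) -> str:
    """Return the storage path for a dataset at a given medallion layer."""
    if layer not in LAYERS:
        raise ValueError(f"unknown layer: {layer}")
    return f"/lakehouse/{layer}/{domain}/{dataset}"

print(layer_path("silver", "health", "clinic_visits"))  # /lakehouse/silver/health/clinic_visits
```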
Core Competencies
Strong problem-solving and analytical skills
Effective communication with technical and non-technical stakeholders
Collaboration across cross-functional and global teams
Attention to detail and quality assurance
Continuous learning and adaptability
Ability to align technical solutions with business needs
Languages
Fluent English (mandatory)
Additional language (French, Spanish, or Arabic) preferred