Consultant: Independent evaluation of the project "Applied research in ecology and social sciences for sustainable management of Central Africa's forest ecosystems" (RESSAC, 2021–2026)
Job Summary
The evaluation will cover the full programme period (Nov 2021–Nov 2026) and all four expected results. It will include both programme-level performance and a purposive sample of research consortia as case studies to examine pathways from research outputs to uptake and outcome-level change.
Geographic scope will be the COMIFAC/CEEAC region and other countries covered by RESSAC-funded research activities. The evaluation team will propose a feasible sampling plan during inception, balancing country coverage with depth.
Cross-cutting dimensions that the evaluation must address include:
- Integration of biophysical (ecology) and social science: interdisciplinarity in research design, field implementation, analysis, and translation into usable recommendations.
- Research and knowledge uptake: pathways, mechanisms, and evidence of use in policy processes, operational decision-making, and practice by key actor groups.
- Capacity strengthening: individual and institutional capacities (scientific writing, project formulation, research supervision), including post-doctoral and Master's-level support and enabling administrative/financial capacities. The evaluation will also analyze the contribution of post-doctoral fellows to knowledge production, scientific animation, visibility, and the initiative's success.
- Equity and inclusion: engagement of Indigenous Peoples and local communities (IPLC) and other stakeholders in research and dissemination; gender responsiveness where relevant to the research portfolio.
Indicative key evaluation questions (organized by evaluation criteria)
The evaluation will be guided by criteria commonly used for research programme evaluations, including relevance, scientific quality, efficiency, effectiveness, impact (with an emphasis on outcome-level influence), and sustainability. This is a reduced list of indicative questions (to be finalized during inception) that maintains balanced coverage of themes and key evaluation priorities.
Relevance and coherence
- Relevance: To what extent did RESSAC address priority problems and evidence needs for sustainable management of Central Africa's forest ecosystems, as identified by key decision-makers and practitioners?
- Logic coherence: To what extent is the programme logic (research, capacity, ICF, uptake) coherent and plausible, and which assumptions/conditions proved decisive (or fragile)?
Scientific quality, interdisciplinarity, and knowledge production
- Scientific quality: What are the quality, rigor, and credibility of the research produced (biophysical and social sciences), and how is quality ensured at consortium and programme levels?
- Interdisciplinarity: To what extent did RESSAC effectively promote and operationalize interdisciplinary approaches (integrated questions, methods, syntheses, articulation across scales)?
Effectiveness, results, and uptake/use
- Achievement of expected results: To what extent were the expected results achieved, and what explains variations across consortia and countries?
- Outputs and usefulness: To what extent did funded research produce useful deliverables (publications, data, methods, tools, policy briefs), and are these products accessible and fit for use?
- Uptake and outcome-level change: What evidence exists of appropriation and use of RESSAC outputs by target groups, and what observable outcome-level changes resulted (decisions, practices, strategies, institutional processes)?
- ICF / last mile: To what extent was the ICF strategy effective in moving beyond publications toward dissemination, training, and uptake (portal, briefs, events, etc.)?
Capacities, post-docs, and unexpected outcomes
- Capacities and post-docs: To what extent did the programme strengthen the capacities of Central African institutions and researchers (including post-docs and Master's-level students), and what was the contribution of postdoctoral fellows to the visibility and success of the initiative (scientific production, mentoring/scientific animation, partnerships)?
- Unexpected outcomes: What unexpected outcomes (positive or negative) emerged (partnerships, policy windows, spillovers, reputation), and why?
Governance, efficiency, and implementation learning (including the mid-term evaluation, MTE)
1. Governance & management: To what extent did governance and management arrangements (programme and consortia) enable timely, high-quality implementation, as well as effective partner involvement in knowledge co-production and use of results?
2. Bottlenecks & MTE: What were the main bottlenecks (mobility/visas, administrative capacities, transfers, reporting), how were they managed, and to what extent were lessons/recommendations from the mid-term evaluation taken up?
Impact, sustainability, and forward-looking perspectives (RESSAC 2)
1. Credible influence: What credible contribution can be established between RESSAC-supported research and observed policy/practice influence (including early signals and pathways still unfolding)?
2. Sustainability & future options: How likely are results/capacities to be sustained beyond the project, and what design options/strategic choices should guide a potential RESSAC 2 (with what supporting evidence)?
Methodology and evaluation approach
The evaluation will use a mixed-methods, theory-based approach suited to research programmes, where outcomes may occur through multiple contribution pathways and with time lags. The team is expected to triangulate evidence across sources and stakeholder perspectives and to be explicit about the limits of attribution/contribution claims.
Overall design
- Portfolio-level assessment of programme results, governance, and enabling systems.
- Contribution-focused assessment of outcome-level change and uptake pathways (e.g. outcome harvesting and/or contribution analysis) for selected cases.
- Comparative case studies of a purposive sample of consortia to examine relevance, interdisciplinarity, quality, dissemination, and uptake.
Sampling strategy (to be finalized in inception)
The evaluation team will propose a sampling strategy that is feasible and defensible, balancing breadth and depth. At a minimum, the sample should:
- Cover a mix of thematic clusters and disciplinary profiles (ecology-heavy, social-science-heavy, and explicitly integrated consortia).
- Include consortia at different stages (those completed in 2024/2025 and those finalizing in 2026) to assess both early outcomes and emerging pathways.
- Include cases with early signals of uptake (e.g. engagement in national policy processes) as well as cases with weaker uptake to understand enabling and constraining factors.
- Ensure representation of IPLC-related themes and gender-relevant research where applicable.
Analysis and synthesis
- Develop a refined theory of change / results-pathway model during inception, including key assumptions and uptake pathways.
- Qualitative analysis (coding and thematic synthesis) of interview and document data.
- Quantitative descriptive analysis of portfolio indicators (e.g. outputs, trainings, dissemination metrics) and survey results.
- Cross-case comparison and triangulation to identify patterns, explanations, and actionable recommendations.
Limitations and mitigation
- The evaluation must transparently document limitations (e.g. time lags in policy influence, incomplete monitoring data, access constraints) and propose mitigation strategies (triangulation, careful case selection, explicit contribution claims).
Data availability and collection
The evaluation will draw on programme documentation and existing monitoring information, complemented by primary data collection with key stakeholders.
Data sources and methods (indicative)
- Document review: project design documents, annual reports, logframe and monitoring data, consortium final reports, publications, and knowledge products.
- Key informant interviews (remote and in-person): CIFOR-ICRAF team, EU stakeholders, research partners, post-docs and students, and intended users (field actors, authorities, CSOs, etc.).
- Surveys (if relevant): short structured surveys of consortium leads/post-docs and/or selected user groups to document uptake, capacity changes, and perceptions of usefulness.
- Research outputs and quality review: mapping of publications and products (including basic bibliometrics where relevant) and assessment of quality against defined criteria (relevance, rigor, credibility, accessibility).
- Policy and practice tracing: structured review of uptake evidence (citations, minutes, participation in trainings, adoption decisions) and contribution analysis/process tracing for case studies.
Data management and ethics
The evaluation team will apply informed consent procedures, ensure the confidentiality of interviewees, and comply with applicable safeguarding and data protection requirements.
A set of key documents will be made available to the evaluation team. The team may request additional materials, including consortium final reports, consolidated monitoring data, and evidence of uptake.
Required Skills:
The evaluation must be conducted by an independent team with no conflict of interest. As noted above, the available budget allows for mobilizing a maximum of two (2) consultants. Bidders are invited to propose a team and workplan consistent with this constraint. Collectively, the team should cover the following competencies:
- Evaluation expertise in research-for-development programmes, including theory-based approaches and contribution analysis/outcome harvesting.
- Strong understanding of forest ecology, sustainable forest management, and/or landscape approaches in Central Africa.
- Strong applied social science expertise (governance, political economy, rights/IPLC, incentives) and experience tracing policy influence.
- Experience assessing interdisciplinarity and the integration of biophysical and social science.
- Experience in capacity development evaluation (individual and institutional).
- Excellent facilitation and analytical writing skills; ability to work primarily in French, with English as an asset.
Evaluation management and quality assurance process
The evaluation will be managed by an appointed evaluation manager (CIFOR-ICRAF) and overseen by an Evaluation Reference Group including representatives of CIFOR-ICRAF, the EU, and selected partner institutions/users (to be confirmed).
Roles and responsibilities (indicative):
1. Commissioner / Evaluation manager: Manage the procurement process; provide documentation and contacts; ensure access; coordinate reviews; oversee quality assurance; facilitate dissemination and the management response.
2. Evaluation Reference Group: Provide strategic guidance, facilitate access, review the inception report and draft report, and contribute to validation and learning events (without influencing findings).
3. Evaluation Team Leader: Lead methodological design; ensure quality and ethics; manage the team; produce the inception report and final deliverables; present findings.
4. Evaluation team members: Lead thematic components (ecology/biophysical sciences, social sciences/policy, capacity development, communication/uptake) and contribute to analysis and reporting.
The evaluation must comply with high ethical standards, including informed consent, confidentiality, and do no harm. Special attention should be paid to respectful engagement with IPLC and to safeguarding considerations during fieldwork. Quality assurance measures should include: (i) an inception report review by the Evaluation Reference Group; (ii) peer review of the draft report (including scientific/technical review); and (iii) transparent documentation of methods and limitations. Bidders must declare any real or perceived conflicts of interest and describe mitigation measures.
Reporting requirements
Deliverables:
- Inception report, including refined evaluation questions; reconstructed theory of change; evaluation matrix; sampling strategy; data collection tools; detailed workplan and itinerary.
- PowerPoint presentation of preliminary findings to CIFOR-ICRAF and other members of the Evaluation Reference Group at the end of the inquiry phase.
- Draft evaluation report (in French; maximum 50 pages excluding annexes) submitted for review.
- Final evaluation report (in French; with an executive summary; maximum 50 pages excluding annexes) incorporating comments and including clear findings, evidence-based conclusions, and prioritized practical recommendations for (i) the final year of implementation and (ii) a potential RESSAC 2.
- Clean annexes: evaluation matrix, list of documents reviewed, list of people consulted, and limitations statement.
Report structure, language, and dissemination
The final report (maximum 50 pages) must be written in French and suitable for external audiences, in a concise, practical style with clear messages. An English executive summary (or a full English translation) may be requested depending on stakeholder needs (to be confirmed during contracting).
The report should include: background; methodology; findings organized by evaluation criteria; conclusions; lessons learned; prioritized recommendations; and annexes.
Minimum requirements for technical and financial proposals and proposal evaluation process
Technical proposal minimum contents:
- Understanding of the assignment and evaluation purpose, including a brief analysis of likely challenges and mitigation measures.
- Proposed approach and methodology, including theory-based elements, the sampling strategy, and a clear plan to assess outcome-level achievements and uptake pathways.
- Workplan and timeline aligned with Section 6 (Duration and phasing), including level of effort by team member and proposed travel/fieldwork (if applicable).
- Team composition, roles and responsibilities, CVs, and a statement of independence and absence of conflict of interest.
- Quality assurance and ethics plan, including data protection and safeguarding considerations.
- Writing sample(s) of comparable assignments (preferably research-for-development / policy influence evaluations) and at least three references.
Financial proposal minimum contents:
- Itemized budget in the requested currency/currencies, distinguishing professional fees, travel, subsistence, and other direct costs.
- Daily rates and number of days per team member, with a separate subtotal per deliverable/phase (inception, inquiry, reporting).
- Assumptions and main cost drivers (e.g. number of field missions, countries visited, workshops/webinars).
- Any proposed in-kind contributions or cost-sharing.
Proposal evaluation process (indicative)
CIFOR-ICRAF will assess proposals against compliance requirements and quality criteria. A shortlist may be invited for interviews. The commissioner reserves the right to negotiate technical and financial aspects with the top-ranked bidder. Indicative evaluation criteria for proposals: Methodological quality and feasibility, including a credible approach to