Innovation Hub Blog

Recap | SP Lab Mid-Term Review sessions

Although all Strategic Partners must conduct a Mid-Term Review, the emphasis and choices may differ. Do you opt for an accountability-focused or a learning-focused evaluation? Or perhaps a combination of the two? And what implications does this have for, for instance, working with your consultants? Furthermore, do your preferred methods for data collection correspond with the IOB criteria? And lastly, are we ready to implement recommendations, even if they would require radical adjustments to further programming and planning?

Well, these questions are certainly food for thought. Luckily, together with a group of committed PMEL officers, we tried to tackle all of them during four sessions of our Strategic Partnership Lab. This blog contains a short recap of each session and some valuable resources you might want to check out.

21 April 2023

Session 1: Dilemmas in working with the IOB guidelines

In the first session, we tackled several questions related to working with the IOB guidelines, such as:

  • Do you work with internal or external evaluators? 
  • What approaches can reduce bias in qualitative results, and do these approaches correspond with the methods approved by the IOB? 
  • What is the role of outcome harvesting during the MTR? 

“We ourselves do see the risks of relying only on our own outcomes (…) So we triangulate through, for example, contribution analysis and additional indicator measurements, so there are more sources around a certain outcome (…) Yes, you need to strengthen perhaps your overall method with additional methods and not rely only on outcome harvesting.”

Are you interested in learning more about what PMEL officers have to say regarding these challenges? Check out the following sources: 

Session 2: Dilemmas in working with consultants (for accountability and learning)

During this session, we exchanged knowledge on: 

  • How do you organise the selection process for national and global consultants? 
  • How do you ensure that consultants deliver evaluation results for accountability? 
  • How do you ensure that consultants help the programme partners to learn and improve? 

“Our programme mostly focusses on youth and giving youth the voice to be out there to exercise their rights. We really wanted it to be a learning kind of evaluation, involving the youth a lot (…) To ensure that everyone participated, because we wanted it to be all inclusive, we had the dilemma of keeping it (the evaluation) internally or having it (the evaluation) externally. After several consultations, we agreed on having it externally, but ensuring that internally programme staff can participate, so they are able to learn.”

Check out the following links to learn what other Strategic Partners have to say:

Session 3: Dilemmas in data collection

During the third session on the MTRs, we discussed questions such as: 

  • How can we make data that is collected via Outcome Harvesting workable for consultants, while at the same time being critical in terms of ‘marking your own homework’? 
  • What is the best approach to collecting data in communities without it being too time- and budget-consuming? 
  • How do you deal with the conflicting messages around using a sample when your partnership is active in a large number of countries, while at the same time having to measure results on MFA indicators in all countries? 

“With regards to the validation of outcome harvesting, one of the ideas is to have the external consultant validate them (outcomes), but also maybe to think about the possibility of the external consultant reaching others (…) who were not part of the programme or projects but have witnessed the changes we are claiming. That is mostly to avoid the bubble feeling. When you are in the same programme you feel we have changed this or this, while this is not that obvious or clear for someone that is not engaged in the programme or project.”

Are you curious about the materials for this session? Check out the following sources: 

Session 4: Dilemmas in learning from the Mid-Term Reviews

During this session, Right to Grow identified the following learning questions: 

  • How do we/you plan to implement key MTR recommendations to show that change is really taking place? 
  • How ready are we/you to manage adverse recommendations from the MTR that would require radical changes or adjustments? 
  • How do we plan to set up learning for third-tier partners (in general, the CSOs we collaborate with without contractual arrangements)? 

“Also to remind what is our purpose in terms of learning in Right 2 Grow: it is really about developing a learning culture, to change ways of working, and to provide tools, support and leadership to create natural and comprehensive space for learning.”

Are you interested in getting a better idea of how organisations try to connect the dots between country-level learning and global-level learning, and what they have to say about dilemmas in learning? Take a look at the following links: