Closing the Learning Loop – How to extend the ownership of evaluation findings to project beneficiaries? Y Care International
Original blog by Hur Hassain published on YCareinternational.org
Our Impact Results and Learning Manager, Hur, tells us about his recent experience of sharing the findings of a final evaluation with beneficiaries. The project evaluated is “Improving financial resilience and promoting gender equity of disadvantaged young women in marginalised communities of Umerkot, Pakistan”.
The monitoring and evaluation field of humanitarian aid has advanced from traditional approaches to new methods, such as using drones to collect data. However, we have not yet answered the key questions: who really benefits from the learning, and who is accountable for it? I have seen many evaluation wrap-up meetings held in capital cities, mainly targeted at public figures and leaders, but this was the first time I saw evaluators sharing findings with the people who had benefitted from the project and who will support the scaling up of that learning for deeper and longer-lasting impact on the ground.
Since the project was aimed at making people change-agents rather than just recipients of aid, the evaluation design purposely included a validation workshop to share the findings with communities, so they could learn from them and understand the importance of monitoring and evaluation in providing solid evidence of project results. This was especially important because we wanted to scale up the socio-economic achievements of the project to all communities: for example, 94% of selected households started living above the poverty line of $2 a day, compared to only 3% of households before the project. By the end of the meeting, community members knew what had worked and what hadn’t in the project, as well as the best possible ways for future improvements. One participant said, ‘this validation workshop was special, since it was the first time after a survey that the evaluation team shared the results with us’.
Below are eight simple steps and some lessons we learnt from this experience, which could be replicated in other contexts:
1) Budgeting validation workshops in communities:
Let us start with organisation and budgeting. If you do not have the money, you cannot organise a field-level evaluation workshop. I am not referring to big budgets, just enough set aside in the design of the project to pay for the travel of an evaluator and the logistics. A validation workshop is much more useful for beneficiaries than a 60-page evaluation report: the usefulness of such reports is limited, as they are usually not translated into local languages and seldom reach the project target groups.
2) Designing the validation workshop:
The starting point is preparing the workshop agenda. Gather the necessary tools, prepare sessions according to the key evaluation questions and adapt them to the participants’ differences and needs (e.g. culture, language). To ensure active participation we used energisers, ice-breakers and team-building techniques. We assigned note-takers and facilitators in advance.
3) Identification of venue:
We gathered participants from eight villages in two locations in Umerkot and invited them to one workshop with the frontline staff of the project. When selecting which village would host the meeting, we considered the availability of local transport from neighbouring villages and of a meeting space, as well as the interest and willingness of the community leaders.
4) Selection of evaluation results to be presented:
People in remote and poor rural villages are busy; therefore, you have a time constraint to address. Select your most relevant findings and present them concisely. We converted the data into reader-friendly graphs and designed simple questions to make the findings easy to understand (see the sketch below). We observed that people had fun and became engaged. The community’s voices helped the evaluators make further sense of the data.
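For teams preparing their own visuals, the snippet below is a minimal sketch of this kind of conversion. It assumes Python with matplotlib (not a tool named in the original post) and reuses the poverty-line figures quoted earlier (3% of households before the project, 94% after) purely as illustrative values; labels and styling are assumptions.

```python
# Minimal sketch: turning a single evaluation finding into a reader-friendly chart.
# Assumes Python 3 with matplotlib installed; labels and styling are illustrative.
import matplotlib.pyplot as plt

# Finding quoted in the post: share of selected households living above $2/day.
labels = ["Before the project", "After the project"]
values = [3, 94]  # percentages

fig, ax = plt.subplots(figsize=(5, 4))
bars = ax.bar(labels, values, color=["#c0c0c0", "#2a9d8f"])

# Print the percentage on top of each bar so the chart reads without a legend.
for bar, value in zip(bars, values):
    ax.text(bar.get_x() + bar.get_width() / 2, value + 2, f"{value}%",
            ha="center", fontsize=12)

ax.set_ylim(0, 110)
ax.set_ylabel("Households above the $2/day poverty line (%)")
ax.set_title("Selected households living above the poverty line")
plt.tight_layout()
plt.savefig("poverty_line_finding.png", dpi=150)
```

A single, labelled chart like this can then be paired with one simple discussion question per finding, which is the format that worked well in the workshop.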
5) Involve women, young people and the poorest:
We encouraged women, young people and the poorest in the community to participate in the workshop so that “no one is left behind”. As a result, we found that more women participated than men, and they were active in the dialogues on the best way forward.
6) Speak their own language:
To overcome the language barrier and engage everyone in the discussion, we organised the workshop in the local language, Sindhi. The lead organiser of the workshop also spoke the local language.
7) Use appropriate evaluation methods and technology: Outcome Harvesting and Sprockler:
The evaluation used Outcome Harvesting and Sprockler, respectively a method and a mobile data collection tool. The harvested outcomes were shared with the beneficiaries by the facilitators, and the use of Sprockler provided immediate, ready-to-use information. The graphs were made of small colour-coded dots that represent individuals and their most significant change stories.
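Sprockler generates these visuals itself, but a similar dot-per-respondent chart can be sketched with general-purpose tools. The example below is a hypothetical illustration in Python with matplotlib, using invented respondent data; it is not a reproduction of Sprockler’s output or API, only an illustration of the idea that every dot stands for one person and links back to their story.

```python
# Hypothetical sketch of a dot-per-respondent chart in the spirit of the graphs
# described above: each dot is one participant, colour-coded by group, and the
# dot's label points back to that participant's change story.
# Assumes Python 3 with matplotlib; all data below is invented for illustration.
import matplotlib.pyplot as plt

# Invented responses: (respondent id, reported change score 1-5, group)
responses = [
    ("R01", 4, "women"), ("R02", 5, "women"), ("R03", 3, "men"),
    ("R04", 4, "women"), ("R05", 2, "men"), ("R06", 5, "young people"),
]
colours = {"women": "#e76f51", "men": "#264653", "young people": "#2a9d8f"}

fig, ax = plt.subplots(figsize=(6, 3))
counts = {}  # stack dots that share the same score so each individual stays visible
for rid, score, group in responses:
    height = counts.get(score, 0)
    counts[score] = height + 1
    ax.scatter(score, height, s=200, color=colours[group])
    ax.annotate(rid, (score, height), ha="center", va="center",
                fontsize=7, color="white")

ax.set_xlabel("Reported significance of change (1 = low, 5 = high)")
ax.set_xticks(range(1, 6))
ax.set_yticks([])
ax.set_title("Each dot is one participant and links to their change story")
plt.tight_layout()
plt.savefig("dot_per_respondent.png", dpi=150)
```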
8) Exploring impact causalities with beneficiaries: what worked and what didn’t:
To make sense of the data gathered during the workshop sessions, we also asked the participants to comment on the results of the evaluation to better understand the causalities of impact. For instance, women from non-Muslim communities (which are the poorest) reported a change in how they are perceived in the community because of the project: they have more resources, better control over them and increased mobility, which in turn allowed better access to markets and productive resources. These women are now also viewed as role models in their villages.
Moreover, the workshop provided the opportunity to share lessons learnt from the project with the wider community, including non-beneficiaries. In the case of economic resilience, for example, it was beneficial for those who were not doing well to see why others were succeeding in improving their livelihoods and where there was still room for improvement.
In conclusion, a validation workshop in the field with beneficiaries allows us to close the learning loop by extending the ownership of evaluation findings, thus empowering communities and building trust with the implementing agencies. It draws attention to the importance of establishing solid Monitoring & Evaluation systems and to the learning those systems generate, which is key to informing the design of future development interventions towards better and deeper impact on the ground.