So, you’ve just finalized the evaluation report for your most recently completed project. The final draft is approved, and you and your staff are looking forward to no longer having an evaluator ‘all up in your business.’
But what will you do with the evaluation? How will the lessons, challenges, and opportunities in the report be learned, institutionalized and operationalized in future programming?
Evaluation utilization is something many individuals and organizations struggle with. For one, we are all busy, and so we hesitate to devote time to anything that does not have a deadline attached to it. Underlying this issue, however, is something deeper and more profound: the degree to which we value learning in our professional (perhaps even vocational) lives, and the presence of supporting structures at the individual, organizational and field-wide levels.
But before diving in, what is evaluation utilization?
Evaluation utilization is the process by which the findings of an evaluation are shared, learned and institutionalized.
It may occur at the individual, organizational or field-wide levels.
Michael Quinn Patton, one of the most well-known and influential evaluators in the world, has written extensively about evaluation utilization and the ways in which evaluators can increase the utilization of their work by clients.
Hot Resource! Check out Evaluation Utilization Checklist by Michael Quinn Patton
While it is important for the evaluator to take on the challenge of constructing a utilization-focused evaluation design, it is equally important for the commissioning organization to develop and devote time to utilization and learning processes. (Strictly speaking, the ‘evaluand’ is the program or project being evaluated, not the commissioner of the evaluation.)
Structures for Internal Learning
There is a wide range of ways in which an organization can learn from its evaluation reports. Cheyanne Church and Mark Rogers suggest assigning a formal Learning Facilitator role to the individual managing the evaluation. That person is then responsible for designing a facilitated process to ensure that the learning and recommendations captured in the evaluation report are acted upon.
Hot Resource! Check out Designing for Results: Integrating Monitoring and Evaluation in Conflict Transformation Activities by Cheyanne Church and Mark Rogers, Chapters 8-10.
Such a process might include creating a ‘learning group’ composed of key individuals connected to the project and the evaluation’s implications, and/or creating a ‘learning document’ that can be distributed widely throughout the organization, or, even better, throughout the field of peacebuilding.
Hot Resource! Check out Reflective Peacebuilding: A Planning, Monitoring and Learning Toolkit by John Paul Lederach, Reina Neufeldt and Hal Culbertson for Catholic Relief Services, pages 67-70 for more on learning documents.
Many organizations have processes for internal learning. The field of peacebuilding as a whole, however, is still figuring out (one might even say learning) how to responsibly share evaluations and their findings publicly, and more specifically with other like-minded organizations and individuals.
Pushing the Field of Peacebuilding Forward
Much of the learning that takes place in peacebuilding seems to occur at either the individual or organizational level. With a few exceptions, there are, by and large, few processes or structures, formal or informal, to facilitate field-wide learning and to drive peacebuilding’s methodological discourse, including M&E, forward.
But things are changing.
There is a growing realization of the need to share evaluation reports and to learn collectively, as a community of peacebuilders, what is effective, what isn’t, and under what conditions. Projects at the Alliance for Peacebuilding, such as the Peacebuilding Evaluation Project and the Women’s Empowerment Demonstration Project, evidence this fact. The Learning Portal for DM&E for Peacebuilding itself is founded on the premise that there is a real need to share and collectively learn from evaluation reports and findings.
One of the strongest findings of the Peacebuilding Evaluation Project is the need to shift the culture of peacebuilding evaluation towards one that is transparent and embraces open inquiry and shared learning. The evidence points to growing momentum behind such a shift.
The question is no longer if we as a global community of peacebuilders will share and collectively learn from our experiences in a collaborative and transparent spirit, but when.
Hot Resource! Check out the Evaluation Design Checklist by Daniel Stufflebeam
Hot Resource! Check out the Evaluation Needs Assessment by Johanna Morariu