Design, Monitoring and Evaluation for Peacebuilding

Resource Materials have a Role in Building Evaluation Capacity

In June we conducted a survey to assess the utility of design, monitoring and evaluation (DM&E) resource handbooks/manuals, and in particular the utility of Designing for Results, a manual for integrating monitoring and evaluation into conflict transformation programs. We would like to informally share with the peacebuilding DM&E community a few of the general findings emerging from that effort.

  • The bulk of the capacity building done by those responding to the survey centers on learning from peers and learning by doing. Less than 20% reported using formal education to build evaluation capacity. This suggests that relatively few people are learning from the field of evaluation or from evaluations in other fields. It also raises the risk of bad practices being spread unintentionally.
  • Greater interest was expressed in learning more about monitoring than evaluation, particularly monitoring in fluid and changing contexts. We hypothesize that this may be because monitoring is often seen as a more inescapable and routine internal function than evaluation.
  • Although some still want a one-size-fits-all, paint-by-numbers set of M&E instructions, most respondents appreciate non-prescriptive guidance that allows them to tailor M&E choices to meet their program management and learning needs.
  • Eighteen of the 29 development organizations that responded to the survey (62%) did not self-identify as practitioners of conflict sensitive development. This suggests there remains a lot of work to do in promoting conflict sensitivity.

Findings specific to Designing for Results included:

  • Online accessibility greatly contributed to the utility of Designing for Results. Its four most useful elements were reported to be its practitioner focus, free online availability, peacebuilding focus, and its examples.
  • The front-end or basic components of Designing for Results were deemed by respondents to be the most useful. The four most useful chapters of DfR were: understanding change, indicators, program design, and evaluation preparation.

The big takeaway is that there is still considerable need for high-quality, pragmatic and practical guidance, with actual examples, on basic peacebuilding evaluation designs, processes, tools and methods. There were also other more nuanced findings and recommendations about how resource materials can best address these needs. For example, 57% said that they would use a companion workbook offering exercises to deepen competency.

We’d like to express our gratitude to all those who responded to the survey, and particularly to those who labored through its many questions. A special thanks to Margot Isman and Meghan Mahoney for their work on this initiative.

Cheyanne Sharbatke-Church and Mark M. Rogers