LEARNING FROM PRACTICE: DEVELOPMENTAL EVALUATION PRACTICES: TIPS, TOOLS, AND TEMPLATES
USAID, Social Impact, William Davidson Institute, & Search for Common Ground
Tools and Templates
This document was produced through the Developmental Evaluation Pilot Activity (DEPA-MERL) project, which is funded through the U.S. Global Development Lab’s Monitoring, Evaluation, Research, and Learning Innovations program at USAID. The developmental evaluation practices in this document were created to evaluate innovative programs that operate in complex environments and are thus expected to adapt over time.
Recognizing that many of the available resources on developmental evaluation in practice are based on theory rather than practical experience, this “Tips, Tools, and Templates” document captures early lessons learned from our experience implementing two pilots and offers guidance for organizations, managers, and evaluators seeking to implement the developmental evaluation approach.
Conducting a developmental evaluation usually entails a substantial resource investment, so it is important to set the stage for success. This document includes guidance to help developmental evaluation managers preempt common pitfalls.
Interest in developmental evaluation is increasing among evaluators and non-evaluators alike. However, it is not right for all contexts. While developmental evaluation originated to serve complex, innovative programs, it can only do so successfully when the organizational context is appropriate. To determine whether this is the case, consider using resources such as the Spark Policy Institute’s Developmental Evaluation Readiness Assessment tool. Assess whether the contracting mechanism, organizational culture, personalities, and program scope are amenable to adaptation, or whether another evaluation approach is more suitable. If interested organizations are not sufficiently “ready,” the developmental evaluation may ultimately fail to serve its intended purpose(s). Be open to the possibility that developmental evaluation is not the right fit, and be ready to propose alternative evaluation approaches.
The DEPA-MERL consortium consists of Social Impact (prime), Search for Common Ground, and the William Davidson Institute at the University of Michigan. The consortium has documented early lessons learned from its experience and is pleased to offer this guidance for organizations, managers, and evaluators that seek to implement the developmental evaluation approach.
Consortium Contact: Gabrielle Plotkin, firstname.lastname@example.org
Lab Contact: Shannon Griswold, email@example.com