Designing evaluations to provide evidence to inform action in new settings

Authors: Davey C, Hargreaves J, Hassan S, Cartwright N, Humphreys M, Masset E, Prost A, Gough D, Oliver S, Bonell C

Policy and interventions should be informed by the best available evidence, but evaluations are not always optimally designed to inform decisions about policies and interventions in new contexts. Maximising what is learned from evaluations matters: evaluation is expensive, and policy makers need to be confident in their decisions. Using evidence from previous studies can lead to better policy decisions, but there have been cases where doing so has led to interventions that have not worked. Learning from evaluations for decisions elsewhere has generally been more successful for interventions that are simple and less context dependent (or context dependent in a simple way, such as depending on the severity of the problem). With increasing focus on complex, context-dependent interventions, we need to ensure that evaluations offer as much information as possible to guide decisions in other contexts. Consultation with DFID to inform this paper underscored these points.

Examples where DFID wants to learn more include:

  • What has been learned from the recent outbreak of Ebola in West Africa that could inform future outbreaks, outbreaks of other diseases, or more generally about how health promotion can be reconciled quickly with cultural norms and expectations (such as to attend funerals and lay hands on deceased relatives)? 
  • What can be learned from the peace-process in Northern Ireland that could be applicable in South Sudan? 
  • What can be learned across evaluations of programmes that use mobile phone technology to change behaviours, both for future mobile-based interventions but also as a platform for understanding how habits can be changed efficiently?
  • What can be learned from large-scale, multi-component initiatives to improve the education system in a single country, both for efforts to improve educational outcomes in other countries and for engaging with public and private organisational cultures to effect change?

The aim of this paper is to suggest possible ways to learn more from evaluations and to make recommendations for how CEDIL could advance this area in its programme of work. To achieve this aim, we consulted experts from a range of disciplines to identify key concepts and developed a framework of possible approaches. We summarised and contrasted the approaches and reflected on their potential to address DFID's needs.

Download Designing evaluations to provide evidence

 
