Colloquium for Systematic Reviews in International Development
BRAC CDM, Savar, Dhaka, 10th-14th December 2012

Systematic Reviews in International Development: The Way Forward
International Initiative for Impact Evaluation [3ie]
Evidence for International Development
- Effectiveness: identifying effective interventions against a counterfactual
- Efficiency: evidence of the cost-effectiveness and cost-benefit of interventions
- Diversity: for whom does the intervention work/not work?
- Implementation: how to make the intervention work in different contexts?
- Experiential: citizens' experiences of policies, programmes, services
Developing the Theory of Change
Most international development initiatives:
- Are complex (more than a single input)
- Operate at multiple levels (community, institutional, personal network, family, and individual)
- Operate in different contexts (geographical, regional, economic, ethnic, cultural, religious, etc.)
- Involve multiple stakeholders (government (national, provincial, local), NGOs, private, voluntary, urban/rural)
- Have competing theories of change
- Require sophisticated ToCs that reflect this complexity and multiple causal pathways
National Food Hygiene Rating System Theory of Change (General)
National Food Hygiene Rating System Theory of Change (Local Authorities)
National Food Hygiene Rating System Theory of Change (Businesses)
Oxford Evidentia - Making Evidence Accessible
National Food Hygiene Rating System Theory of Change (Consumers)
The Way Forward - Demand Side
- Demand for SRs by policy makers and development practitioners is at best embryonic
- Still a culture of opinion-based policy
- Or a selective approach to evidence
- Lack of a strategic approach to policy/practice and evidence
- Physical access to research evidence (Ouimet et al, 2009)
- Cognitive access to research evidence (Ouimet et al, 2009)
- Lack of research knowledge infrastructures (RKIs)
The Way Forward - Demand Side
- Organisational constraints/resistance to external evidence
- Other sources of evidence preferred (Campbell et al, 2007)
- Lack of trust between civil servants and researchers (Lavis et al, 2000, 2005)
- Communication gaps between civil servants and researchers (Lavis et al, 2000, 2005)
- Different time horizons (Ouimet et al, 2009)
Where Do UK Civil Servants Go For Evidence?
[Diagram of evidence sources, labelled from "Plankton" to "Sharks", with academic/evaluation research marked with a question mark]
(Source: Campbell et al, 2007: 22)
UK Policymakers' Views of Research Evidence
- Too long
- Too verbose
- Too detailed
- Too dense
- Too impenetrable
- Too much jargon
- Too methodological
- Untimely
- Irrelevant for policy
The Way Forward - Supply Side
Systematic reviews are (often):
- Too long
- Too verbose
- Too detailed
- Too dense
- Too impenetrable
- Too much jargon
- Too methodological
- Untimely
- Irrelevant for policy
The Way Forward - Supply Side
Lack of research knowledge infrastructures (RKIs). In LMICs, a lack of:
- Scientific databases
- Full-text downloads (free of charge)
- Information scientists
- Adequate broadband access/capacity
- Experience and expertise in SR methodologies
- Training in SR methodologies
The Way Forward - Supply Side
- Lack of trust between researchers and civil servants (Lavis et al, 2000, 2005)
- Communication gaps between researchers and civil servants (Lavis et al, 2000, 2005)
- Lack of access to the interpersonal networks of policy makers (Greenhalgh et al, 2004, 2005)
- Different time horizons
Different Notions of Evidence

Policy Makers' Evidence           Researchers' Evidence
--------------------------------  ----------------------------
Colloquial (narrative)            Scientific (generalisable)
Anything that seems reasonable    Proven empirically
Policy relevant                   Theoretically driven
Timely                            As long as it takes
Clear message                     Caveats and qualifications

Knowledge transfer bridges these two notions of evidence.
Source: J. Lomas et al, 2005
What Users of Evidence Would Like From Research Presentations
- Graded entry to presentations of research (1:3:25)
- Well-written summaries, with a clear message
- Indications of relevance for decision making, but not specific recommendations
- Contextual factors that affect local applicability (external validity is as important, if not more important, than internal validity)
- Information about the benefits, harms/risks, and costs of interventions
- Messages that are simple and unclouded by jargon
- Aligned to decision-making timescales
Sources: Petticrew et al, 2004; Lavis et al, 2005; Dobbins et al, 2007; Rosenbaum, 2010
How Evidence Influences Policy
- Instrumental use: acting on research results in specific, direct ways.
- Conceptual use: using research results for general enlightenment; results influence actions, but in less specific, more indirect ways than in instrumental use.
- Symbolic use: using research results to legitimate and sustain pre-determined positions.
Source: Weiss, C., 1980
"Rarely does research supply an answer that policy actors employ to solve a policy problem. Rather, research provides a background of data, empirical generalisations, and ideas that affect the way that policy makers think about a problem. Ideas from research are picked up in diverse ways and percolate through to office holders in many offices that deal with the issues."
Source: Weiss, C., 1980, Policy research in the context of diffuse decision making, Journal of Higher Education, 53, 6, 619-639
How Policy Gets Made
Decision making is a process, not an event:
"Seeing policy making as a rational process fails to do justice to the ethereal nature of that diffuse, haphazard, and somewhat volatile process called decision making. The unit of research transfer should rarely be the single study but should, rather, be the summary and synthesis of knowledge across the entire spectrum of stages in the process."
Source: Lomas, J., 2000, Canadian Journal of Policy Research, Spring, 140-144
Summary
- International development needs a broader range of evidence than "what works"
- There are demand- and supply-side challenges for SRs in international development
- These are surmountable (3ie, C2, DfID, NGOs, etc.)
- Need to develop research knowledge infrastructures
- And appreciate the indirect and diffuse influence of evidence on policy and practice
- Time to seize the opportunity!
Thank You
Email: pdavies@3ieimpact.org
Tel: +44 (0)207 958 8350
Visit