What I Learned from a Year Spent Studying How to Get Policymakers to Use Evidence
By zdgroff @ 2018-09-04T15:55 (+32)
Crossposted on zachgroff.com
For the past year, I was a senior research analyst at Northwestern University's Global Poverty Research Lab, working on a study of evidence-based policy. Specifically, our goal was to address a question often on researchers' minds: how can I get my ideas acted upon?
To do this, I dug through several bodies of evidence on how science influences policy. One area is what medicine calls "implementation science," which studies how to get doctors, nurses, and hospital administrators to adopt evidence-based practice. Another is a series of papers by social scientist Carol Weiss and her students on how policymakers in government agencies claim to use evidence. There is also a small literature on how to implement evidence-based policy in public schools, and a little work on policymaker numeracy. I've included a bibliography below that should be helpful for anyone interested in this topic.
Most of my year was spent delving into attempts to scale up specific policies, so this literature review is not as extensive as it could be. Still, while it yields no knock-down conclusions, the research on evidence-based policy does offer a few lessons for anyone trying to get others to act based on evidence. These have broad applicability to people working to help others effectively, such as:
—Anyone working to promote evidence-based policy
—Researchers and those working at research organizations who are trying to get others to listen to them or trying to figure out what research to do
—Managers in nonprofits looking to promote the use of evidence by employees
—Advocates promoting more rational behavior (e.g. giving to effective charities or considering others' interests)
Here is what I learned:
1) Happily, evidence does seem to affect policy, but in a diffuse and indirect way. Carol Weiss, mentioned above, finds that large majorities (65%-89%) of policymakers report being influenced by research in their work, and roughly half report being strongly influenced (Weiss 1980; Weiss 1977). It is rare for policymakers to pick up a study and implement an intervention directly. Instead, officials work evidence into their worldviews over time, as part of a gradual process Weiss calls "enlightenment" (Weiss 1995). Evidence also influences policy in more political but potentially still benign ways: justifying existing policies, warning of problems, suggesting new policies, or making policymakers appear self-critical (Weiss 1995; Weiss 1979; Weiss 1977).
2) There are a few methods that seem to successfully promote evidence-based policy in health care, education, and government settings where they have been tested. The top interventions are:
2a) Education—Workshops, courses, mentorship, and review processes change decision makers' behavior with regard to science in a few studies (Coburn et al. 2009; Matias 2017; Forman-Hoffman et al. 2017; Chinman et al. 2017; Hodder et al. 2017).
2b) Organizational structural changes—Building evidence into an organization's structure, such as having a research division and hotline, encouraging and reviewing employees based on their engagement with research, and providing funding based on explicit evidence, seems to improve the organization's use of evidence (Coburn and Turner 2011; Coburn 2003; Coburn et al. 2009; Weiss 1980; Weiss 1995; Wilson et al. 2017; Salbach et al. 2017; Forman-Hoffman et al. 2017; Chinman et al. 2017; Hodder et al. 2017).
A few other methods for promoting research-backed policies seem promising based on a bit less evidence:
2c) Increasing awareness of evidence-based policy—Sending employees reminders or newsletters seems to increase evidence-based medical practice, according to two high-quality review papers (Murthy et al. 2012; Grimshaw et al. 2012). Similarly, an all-around advocacy campaign to promote evidence-based practices among practitioners achieved substantial changes in one randomized controlled trial (Schneider et al. 2017).
2d) Access—Merely giving people evidence on effectiveness does not generally affect behavior, but when combined with efforts to motivate use of the evidence, providing access to research does improve evidence-based practice (Chinman et al. 2017; Wilson et al. 2017).
2e) External motivation and professional identities—Two recent RCTs and a number of reviews and qualitative studies find that rewarding people for using evidence and building professional standards around using research are helpful (Chinman et al. 2017; Schneider et al. 2017; Hodder et al. 2017; Forman-Hoffman et al. 2017; Weiss et al. 2005; Weiss 1995; Wilson et al. 2017; Weiss 1980; Weiss 1977; Matias 2017; Coburn 2005; Coburn 2003).
3) Interestingly, a few methods for promoting evidence-based practices that policymakers and researchers often champion do not have much support in the literature. The first is building collaboration between policymakers and researchers, and the second is creating more research in line with policymakers' needs. One of the highest-quality write-ups on evidence-based policy, Langer et al. (2016), finds that collaboration only works if it is deliberately structured to build policymakers' and researchers' skills. And when policymakers and researchers work together to produce research that is more relevant to policy, it seems to have little impact. This may be because, as noted in point (1), research influences policy in important but indirect ways, so making the connection more direct may not help much.
4) There is surprisingly and disappointingly little research on policymakers' cognition and judgment in general. The best research, by Philip Tetlock (1985; 1994; 2005; 2010; 2014; 2016) and Barbara Mellers (2015), is familiar to the effective altruism community; it gives little information on how decision-makers respond to scientific evidence, but it suggests that they are not very accurate at making predictions in general. Other research indicates that extremists are particularly prone to overconfidence and oversimplification, with conservatives somewhat more prone to these errors than liberals (Ortoleva and Snowberg 2015; Blomberg and Harrington 2000; Kahan 2017; Tetlock 1984; Tetlock 2000). Otherwise, a little research suggests that policymakers are susceptible to the same cognitive biases as everyone else, particularly loss aversion, which may make them irrationally unwilling to end ineffective programs or start proven but novel ones (Levy 2003; McDermott 2004). On the whole, little psychological research studies how policymakers react to new information.
Overall, this literature offers some broad classes of strategies that have worked in some contexts and that can be refined and selected based on intuition and experience. At the very least, I think those promoting reason and evidence can take heart that research and science do seem to matter, even if it's hard to see.
Bibliography:
Banuri, Sheheryar, Stefan Dercon, and Varun Gauri. "Biased policy professionals." (2017).
Blomberg, S. Brock, and Joseph E. Harrington. "A theory of rigid extremists and flexible moderates with an application to the US Congress." American Economic Review 90.3 (2000): 605-620.
Chinman, Matthew, et al. "Can implementation support help community-based settings better deliver evidence-based sexual health promotion programs? A randomized trial of Getting To Outcomes®." Implementation Science 11.1 (2016): 78.
Chinman, Matthew, et al. "Using Getting To Outcomes to facilitate the use of an evidence-based practice in VA homeless programs: a cluster-randomized trial of an implementation support strategy." Implementation Science 12.1 (2017): 34.
Choi, Bernard CK, et al. "Bridging the gap between science and policy: an international survey of scientists and policy makers in China and Canada." Implementation Science 11.1 (2016): 16.
Coburn, Cynthia E. "Rethinking scale: Moving beyond numbers to deep and lasting change." Educational researcher 32.6 (2003): 3-12.
Coburn, Cynthia E. "The role of nonsystem actors in the relationship between policy and practice: The case of reading instruction in California." Educational Evaluation and Policy Analysis 27.1 (2005): 23-52.
Coburn, Cynthia E., William R. Penuel, and Kimberly E. Geil. "Practice Partnerships: A Strategy for Leveraging Research for Educational Improvement in School Districts." William T. Grant Foundation (2013).
Coburn, Cynthia E., Judith Touré, and Mika Yamashita. "Evidence, interpretation, and persuasion: Instructional decision making at the district central office." Teachers College Record 111.4 (2009): 1115-1161.
DellaVigna, Stefano, and Devin Pope. Predicting experimental results: who knows what? No. w22566. National Bureau of Economic Research, 2016.
Detsky, Allan S. "Sources of bias for authors of clinical practice guidelines." Canadian Medical Association Journal 175.9 (2006): 1033-1033.
Fishman, Barry J., et al. "Design-based implementation research: An emerging model for transforming the relationship of research and practice." National Society for the Study of Education 112.2 (2013): 136-156.
Forman-Hoffman, Valerie L., et al. "Quality improvement, implementation, and dissemination strategies to improve mental health care for children and adolescents: a systematic review." Implementation Science 12.1 (2017): 93.
Grimshaw, Jeremy M., et al. "Knowledge translation of research findings." Implementation Science 7.1 (2012): 50.
Head, Brian W. "Reconsidering evidence-based policy: Key issues and challenges." (2010): 77-94.
Hodder, Rebecca Kate, et al. "Developing implementation science to improve the translation of research to address low back pain: A critical review." Best Practice & Research Clinical Rheumatology (2017).
Hoppe, Robert, and Margarita Jeliazkova. "How policy workers define their job: A Netherlands case study." The work of policy: An international survey (2006): 35-60.
Kahan, Dan M., et al. "Motivated numeracy and enlightened self-government." Behavioural Public Policy 1.1 (2017): 54-86.
Langer, L., J. Tripney, and D. Gough. The Science of Using Science: Researching the Use of Research Evidence in Decision-Making. London: EPPI-Centre, Social Science Research Unit, UCL Institute of Education, University College London, 2016.
Levi, Ariel, and Philip E. Tetlock. "A cognitive analysis of Japan's 1941 decision for war." Journal of Conflict Resolution 24.2 (1980): 195-211.
Levitt, Steven D. "How do senators vote? Disentangling the role of voter preferences, party affiliation, and senator ideology." The American Economic Review (1996): 425-441.
Matias, J. Nathan. "Governing Human and Machine Behavior in an Experimenting Society." Massachusetts Institute of Technology, 2017.
McDermott, Rose. "Prospect theory in political science: Gains and losses from the first decade." Political Psychology 25.2 (2004): 289-312.
Mellers, Barbara, et al. "The psychology of intelligence analysis: Drivers of prediction accuracy in world politics." Journal of Experimental Psychology: Applied 21.1 (2015): 1.
Murthy, Lakshmi, et al. "Interventions to improve the use of systematic reviews in decision‐making by health system managers, policy makers and clinicians." The Cochrane Library (2012).
Ortoleva, Pietro, and Erik Snowberg. "Overconfidence in political behavior." American Economic Review 105.2 (2015): 504-35.
Ortoleva, Pietro, and Erik Snowberg. "Are conservatives overconfident?." European Journal of Political Economy 40 (2015): 333-344.
Salbach, Nancy M., et al. "Facilitated interprofessional implementation of a physical rehabilitation guideline for stroke in inpatient settings: process evaluation of a cluster randomized trial." Implementation Science 12.1 (2017): 100.
Schneider, Eric C., et al. "Does a quality improvement campaign accelerate take-up of new evidence? A ten-state cluster-randomized controlled trial of the IHI’s Project JOINTS." Implementation Science 12.1 (2017): 51.
Staw, Barry M., and Jerry Ross. "Commitment in an experimenting society: A study of the attribution of leadership from administrative scenarios." Journal of Applied Psychology 65.3 (1980): 249.
Tatsioni, Athina, Nikolaos G. Bonitsis, and John P.A. Ioannidis. "Persistence of contradicted claims in the literature." JAMA 298.21 (2007): 2517-2526.
Tetlock, Philip E. "Cognitive style and political belief systems in the British House of Commons." Journal of Personality and Social Psychology 46.2 (1984): 365.
Tetlock, Philip E. "Integrative complexity of American and Soviet foreign policy rhetoric: A time-series analysis." Journal of Personality and Social Psychology 49.6 (1985): 1565.
Tetlock, Philip E., David Armor, and Randall S. Peterson. "The slavery debate in antebellum America: Cognitive style, value conflict, and the limits of compromise." Journal of Personality and Social Psychology 66.1 (1994): 115.
Tetlock, Philip E. "Cognitive biases and organizational correctives: Do both disease and cure depend on the politics of the beholder?." Administrative Science Quarterly 45.2 (2000): 293-326.
Tetlock, Philip E. Expert political judgment: How good is it? How can we know? Princeton University Press, 2005.
Tetlock, Philip E. "Second thoughts about expert political judgment: Reply to the symposium." Critical Review 22.4 (2010): 467-488.
Tetlock, Philip, and Barbara Mellers. "Judging political judgment." Proceedings of the National Academy of Sciences 111.32 (2014): 11574-11575.
Tetlock, Philip E., and Dan Gardner. Superforecasting: The art and science of prediction. Random House, 2016.
Tschoegl, Adrian E., and J. Scott Armstrong. "Review of: Philip E. Tetlock. 2005. Expert Political Judgment: How Good is it? How Can We Know?" (2008).
Weiss, Carol Hirschon, Erin Murphy-Graham, and Sarah Birkeland. "An alternate route to policy influence: How evaluations affect DARE." American Journal of Evaluation 26.1 (2005): 12-30.
Weiss, Carol H. "Knowledge creep and decision accretion." Knowledge 1.3 (1980): 381-404.
Weiss, Carol H. "Research for policy's sake: The enlightenment function of social research." Policy analysis (1977): 531-545.
Weiss, Carol H. "The haphazard connection: social science and public policy." International Journal of Educational Research 23.2 (1995): 137-150.
Weiss, Carol H. "The many meanings of research utilization." Public administration review 39.5 (1979): 426-431.
Wilson, Paul M., et al. "Does access to a demand-led evidence briefing service improve uptake and use of research evidence by health service commissioners? A controlled before and after study." Implementation Science 12.1 (2017): 20.
[anonymous] @ 2018-09-09T20:36 (+1)
Very interesting, thanks so much for posting! Some questions: What seems to be the relation between policymakers' and the general public's worldviews? Does it appear to be a bidirectional relationship, or is one group more important for influencing the other's views? If the general public has influence on policymakers' worldviews, is focusing on changing worldviews in general, as opposed to focusing on policymakers specifically, something that should be considered?
[anonymous] @ 2018-09-05T00:02 (+1)
Addition to the bibliography: Paul Sabatier's work on Policy Learning. https://paulcairney.wordpress.com/2013/10/30/policy-concepts-in-1000-words-the-advocacy-coalition-framework/