
Roundtable: Using Evidence Reviews to Inform Policy: Promise and Cautions
(Tools of Analysis: Methods, Data, Informatics and Research Design)

Thursday, November 12, 2015: 1:45 PM-3:15 PM
Foster I (Hyatt Regency Miami)


Roundtable Organizer:  Annalisa Mastri, Mathematica Policy Research
Moderator:  Annalisa Mastri, Mathematica Policy Research
Speakers:  A. Brooks Bowden, Columbia University; Brian Goesling, Mathematica Policy Research; Sean Tanner, University of California, Berkeley; and Eva Vivalt, New York University

Over the past 15 years, numerous federally funded systematic reviews have been established to provide a trusted source of information about what works in areas ranging from educational interventions to criminal justice to medical practice. The results of these reviews are often published on a public website, available to practitioners seeking to implement a new program or intervention and to policymakers and federal staff seeking to fund effective programs. This roundtable discussion will highlight the promise that systematic reviews hold for informing policy, cautions about using the reviews, and enhancements that can further increase the reviews’ utility. Particular attention will be paid to the difference between low-stakes reviews, which focus on providing information that is accessible to practitioners and policymakers but have no funding specifically tied to their results, and high-stakes reviews, which seek to identify effective programs and use their findings to determine public funding decisions.

Annalisa Mastri, principal investigator of the Office of Planning, Research, and Evaluation’s (OPRE’s) Employment Strategies for Low-Income Adults Evidence Review (ESER), will introduce the concept of a systematic review and moderate the discussion. She will use ESER as an example of a low-stakes systematic review designed specifically for practitioner use, focusing on the specific steps taken to ensure that the results of the review are easy to access and disseminate. Speaker Brian Goesling, principal investigator of OPRE’s Teen Pregnancy Prevention Evidence Review, will then introduce that high-stakes evidence review, discussing how its results have already been used to directly inform policy.

The three other speakers will then introduce themselves and summarize their work. Eva Vivalt, of New York University, founded AidGrade, an organization that systematically collects and synthesizes impact evaluation results on a wide variety of interventions in international development. Her recent research examined and developed a correction for specification searching and publication bias, issues that could undermine the credibility of results from systematic reviews. Sean Tanner, of UC-Berkeley, has expertise in “p-hacking” and its potential consequences for systematic reviews. This practice can allow false positives, even in randomized controlled trials, to masquerade as strong findings when a researcher selectively reports only statistically significant results and leaves numerous “failed” analyses unreported (the illustrative sketch below shows the mechanism). Finally, Brooks Bowden, of the Center for Benefit-Cost Studies of Education, will introduce the center’s work examining the cost-effectiveness of interventions for high school completion and early literacy using the results of the What Works Clearinghouse’s systematic reviews. Cost-effectiveness analysis is a valuable enhancement to systematic evidence reviews because it gives policymakers more actionable data by incorporating cost information, not just effect sizes.

The moderated discussion should be of great interest to policymakers and practitioners, so we intend to maximize audience discussion. Topics will include the factors that evidence review leaders could consider to make their results as useful as possible; what steps, if any, are taken to mitigate p-hacking or publication bias in the reported results of systematic reviews; and how the results of systematic reviews might be made even more useful for informing policy.
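To make the p-hacking mechanism concrete, the following minimal Python simulation (illustrative only, not part of the session materials; the sample size, number of outcomes, and significance threshold are arbitrary assumptions) shows how selectively reporting the best of many outcome tests turns a trial with no true effect into an apparently “significant” finding most of the time:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=0)

def min_pvalue_null_trial(n=100, k=20):
    """Simulate one RCT with k outcomes and NO true treatment effect.

    Returns the smallest p-value across the k outcome tests, i.e. the
    one result a selective reporter would highlight.
    """
    treat = rng.normal(size=(n, k))    # treatment-group outcomes (pure noise)
    control = rng.normal(size=(n, k))  # control-group outcomes (pure noise)
    pvals = [stats.ttest_ind(treat[:, j], control[:, j]).pvalue
             for j in range(k)]
    return min(pvals)

# How often does the *best* of 20 null tests clear p < 0.05?
n_trials = 1_000
false_pos = sum(min_pvalue_null_trial() < 0.05 for _ in range(n_trials))
print(f"{false_pos / n_trials:.0%} of null trials yield a 'significant' finding")
# Expected: about 64%, since 1 - 0.95**20 is roughly 0.64
```

With 20 independent outcomes each tested at the 5 percent level, roughly 1 − 0.95^20 ≈ 64 percent of purely null trials produce at least one nominally significant result, which is why unreported “failed” analyses matter for the credibility of a review’s evidence base.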