
Panel: Design-Based Causal Inference for RCTs: Theory and Software for Promoting Opportunistic Experiments
(Tools of Analysis: Methods, Data, Informatics and Research Design)

Thursday, November 12, 2015: 10:15 AM-11:45 AM
Pearson I (Hyatt Regency Miami)

*Names in bold indicate Presenter

Panel Organizer:  Peter Z. Schochet, Mathematica Policy Research
Panel Chair:  Thomas Wei, U.S. Department of Education
Discussant:  Luke Keele, Pennsylvania State University

Why Use Design-Based Methods for RCTs?
Peter Z. Schochet, Mathematica Policy Research

Estimating and Reporting Impacts Using the RCT-YES Software
Carol Razafindrakoto, Mathematica Policy Research and Alexandra Resch, Mathematica Policy Research

There is increasing interest throughout the federal government in having state and local agencies conduct low-cost opportunistic experiments to test promising interventions and policies in their service areas. The increased availability of administrative data provides a rich data source for such evaluations. This panel will focus on several methodological tools that have been developed to facilitate the conduct of opportunistic experiments. First, the panel will present new software, called RCT-YES, funded by the Institute of Education Sciences (IES) at the U.S. Department of Education (ED), that estimates and reports impacts for RCTs for a wide range of designs used in social policy research. Second, the panel will present new research on the design-based statistical theory that underlies RCT-YES, which differs from the more model-based impact estimation methods typically used in social policy research. Thus, this panel fits directly with the conference theme by presenting new tools and methodology for promoting the use of evidence to test and improve federal programs.

The panel will consist of three papers. The first session will discuss a unified, non-parametric design-based approach to impact estimation using the building blocks of the Neyman-Rubin-Holland causal inference model that underlies experimental designs. This presentation will be based on a peer-reviewed, forthcoming IES methods report that covers most designs used in social policy research (such as blocked and clustered designs). The second session will demonstrate the RCT-YES software, which has been programmed in R, SAS, and Stata, using the software's desktop application input screens. This session will use a real-world dataset to demonstrate the analyses the software can conduct (such as impact, subgroup, and baseline equivalency analyses) and the formatted output tables it generates.
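To make the design-based logic concrete, the following is a minimal sketch of the simplest case covered by the Neyman-Rubin-Holland framework: a non-clustered RCT analyzed with the difference in mean outcomes and Neyman's conservative randomization-based variance. This is an illustration in Python (the panel's software is implemented in R, SAS, and Stata), and the function and variable names are hypothetical, not RCT-YES syntax.

```python
import numpy as np

def neyman_impact(y, t):
    """Design-based impact estimate for a simple RCT.

    y : array of observed outcomes
    t : array of treatment indicators (1 = treatment, 0 = control)

    Returns the difference-in-means impact estimate and Neyman's
    conservative standard error, s1^2/n1 + s0^2/n0, which relies only
    on the randomization mechanism, not on an outcome model.
    """
    y, t = np.asarray(y, dtype=float), np.asarray(t, dtype=int)
    y1, y0 = y[t == 1], y[t == 0]
    impact = y1.mean() - y0.mean()
    var = y1.var(ddof=1) / len(y1) + y0.var(ddof=1) / len(y0)
    return impact, np.sqrt(var)

# Toy data: 100 units, half randomized to treatment,
# with a true treatment effect of 2.
rng = np.random.default_rng(0)
t = rng.permutation([1] * 50 + [0] * 50)
y = 10 + 2 * t + rng.normal(0, 1, size=100)
est, se = neyman_impact(y, t)
```

Blocked and clustered designs extend this same building block: impacts are estimated within blocks (or at the cluster level) and then averaged, with the variance again derived from the randomization alone.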
The final paper will extend the Neyman framework to allow for heterogeneous treatment effects and propose methods for measuring and characterizing such effects, including an R-squared for treatment variation. This formulation offers several advantages, such as estimating the overall average effect with greater precision and testing for heterogeneous effects not explained by covariates, which could signal, for example, a lack of generalizability. Because these approaches are based on the Neyman randomization framework, they could be implemented in RCT-YES in future updates.

The panel will be chaired by the Associate Commissioner at IES, who is the project officer for the development of RCT-YES, the associated statistical theory document, and other IES products that support the conduct of opportunistic experiments. The discussant for the panel is a national expert on causal inference. Altogether, the purpose of this panel is to bring to the forefront new impact estimation methods that are based solely on the randomization mechanism that underlies experimental designs, and to introduce new, easy-to-use software based on this theory that can help government agencies and others conduct rapid-cycle RCTs and build the evidence base for identifying effective policies and interventions.
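The intuition behind testing for effect heterogeneity can be illustrated with one simple diagnostic: if the treatment effect were constant across units, the outcome variances in the two experimental arms would be equal, so a variance ratio far from 1 is one signal of heterogeneous effects. This sketch is only an illustration of that idea, not the paper's proposed R-squared measure or its formal test; all names here are hypothetical.

```python
import numpy as np

def arm_variance_ratio(y, t):
    """Ratio of treatment-arm to control-arm outcome variance.

    Under a constant (homogeneous) treatment effect, adding the same
    shift to every treated unit leaves the variance unchanged, so this
    ratio should be near 1. A ratio well away from 1 is a crude signal
    of treatment-effect heterogeneity.
    """
    y, t = np.asarray(y, dtype=float), np.asarray(t, dtype=int)
    return y[t == 1].var(ddof=1) / y[t == 0].var(ddof=1)

rng = np.random.default_rng(1)
t = rng.permutation([1] * 200 + [0] * 200)

# Heterogeneous effects: each treated unit draws its own effect from
# N(2, 2^2), so treatment adds variance as well as a mean shift.
y = 5 + t * rng.normal(2, 2, size=400) + rng.normal(0, 1, size=400)
ratio = arm_variance_ratio(y, t)  # well above 1 for this data
```

A formal version of this comparison requires randomization-based inference rather than an eyeball check, which is part of what the heterogeneity extension to the Neyman framework provides.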