Panel: Design-Based Causal Inference for RCTs and QEDs: Theoretical and Empirical Advances
(Tools of Analysis: Methods, Data, Informatics and Research Design)

Friday, November 3, 2017: 3:15 PM-4:45 PM
Dusable (Hyatt Regency Chicago)

*Names in bold indicate Presenter

Panel Organizer:  Peter Z. Schochet, Mathematica Policy Research
Panel Chair:  Thomas Wei, U.S. Department of Education
Discussant:  Luke Miratrix, Harvard University


Do HLM, RCSE, and Design-Based Estimators Differ in Practice?
Charles Tilley, Mathematica Policy Research



How Can Design-Based Methods Be Extended to Multi-Armed RCTs?
Peter Z. Schochet, Mathematica Policy Research



Using Design-Based Methods for Evidence Reviews
Lauren Scher, Concentric Research & Evaluation


Design-based theory has recently been developed to estimate impacts for RCTs and basic QEDs across a wide range of designs used in social policy research (Imbens and Rubin, 2015; Schochet, 2015). This unified theory is derived directly from the non-parametric Neyman-Rubin-Holland causal inference model that underlies experimental designs. These design-based methods have important advantages over traditional impact estimation methods used in social policy research, such as hierarchical linear modeling (HLM) and robust cluster standard error (RCSE) methods, and they perform well in simulations (Schochet, 2016).
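As a point of reference, below is a minimal sketch of the design-based (Neyman) estimator for a simple two-arm, non-clustered RCT; the notation is illustrative and not drawn from the cited papers. Each unit i has potential outcomes Y_i(1) and Y_i(0), and the finite-population average treatment effect and its difference-in-means estimator are

\[
  \tau = \frac{1}{n}\sum_{i=1}^{n}\bigl(Y_i(1) - Y_i(0)\bigr),
  \qquad
  \hat{\tau} = \bar{y}_1 - \bar{y}_0,
\]

with the standard conservative variance estimator, justified by randomization alone,

\[
  \widehat{\mathrm{Var}}(\hat{\tau}) = \frac{s_1^2}{n_1} + \frac{s_0^2}{n_0},
\]

where \bar{y}_1 and \bar{y}_0 are the observed treatment- and control-group means, s_1^2 and s_0^2 are the within-group sample variances, and n_1 and n_0 are the group sizes. No outcome model or distributional assumption is required; inference follows from the randomization itself.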

This proposed panel will consist of presentations of four papers that examine theoretical and empirical issues related to design-based estimators, with the aim of informing their use in practice:

  1. What are the theoretical differences between HLM, RCSE, and design-based impact estimators? All three competing methods produce impact estimators that are consistent and normally distributed in large samples. However, the methods typically yield different variance formulas due to differences in their underlying assumptions. This paper clarifies these statistical differences, with a focus on clustered designs. 

  2. Do HLM, RCSE, and design-based estimators differ in practice? This paper builds on the first, theoretical paper by using data from nine education RCTs spanning a range of designs and interventions to compare impact findings across the three competing methods and to reconcile any differences based on the model assumptions underlying each approach. 

  3. How can design-based methods be extended to multi-armed RCTs? This paper presents new methods that extend design-based theory to designs with multiple treatment groups, which are becoming increasingly common in social policy research for testing behavioral and encouragement-based interventions. 

  4. Using design-based methods for evidence reviews. This paper demonstrates how design-based methods can meet evidence review requirements, in particular those of the What Works Clearinghouse (WWC). The paper also discusses how the free, IES-funded RCT-YES software, which estimates and reports impacts using design-based methods, can be used for evidence reviews. 

The panel will be chaired by Dr. Thomas Wei, a project officer at the Institute of Education Sciences who has been heavily involved with the oversight and funding of the papers for this panel. The discussant will be Dr. Luke Miratrix, an assistant professor at the Harvard Graduate School of Education and an expert on design-based methods and causal inference. 

This panel fits the conference theme of obtaining better data for better decisions because it presents new research on methods for analyzing data from RCTs and QEDs to support evidence-based policy. The design-based approach relies on fewer assumptions, allows for heterogeneity of treatment effects, and yields simple variance expressions even for complex designs, which increases transparency. It is therefore important for analysts to understand these methods and how they differ from more traditional model-based methods, both theoretically and empirically.
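To illustrate the point about simple variance expressions for complex designs, the following hedged sketch shows a design-based estimator for a clustered RCT in which treatment is randomly assigned at the cluster level; it is a stylized version that weights clusters equally and omits finite-population corrections and covariate adjustment. With m_1 treatment clusters, m_0 control clusters, and cluster-level mean outcomes \bar{y}_j,

\[
  \hat{\tau}_{\mathrm{cluster}} = \frac{1}{m_1}\sum_{j \in \mathcal{T}} \bar{y}_j \;-\; \frac{1}{m_0}\sum_{j \in \mathcal{C}} \bar{y}_j,
  \qquad
  \widehat{\mathrm{Var}}(\hat{\tau}_{\mathrm{cluster}}) = \frac{s_T^2}{m_1} + \frac{s_C^2}{m_0},
\]

where \mathcal{T} and \mathcal{C} index the treatment and control clusters and s_T^2 and s_C^2 are the sample variances of the cluster-level means within each group. The cluster means serve as the units of analysis, so the variance estimator keeps the same simple difference-in-means form as in the non-clustered case.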