Panel Paper: Using Design-Based Methods for Evidence Reviews

Friday, November 3, 2017
Dusable (Hyatt Regency Chicago)


Lauren Scher, Concentric Research & Evaluation


Design-based methods have recently been developed to estimate impacts for randomized controlled trials (RCTs) and basic quasi-experimental designs (QEDs). These methods are a viable alternative to model-based methods (such as hierarchical linear modeling, or HLM) and have some theoretical advantages. However, the design-based approach will be of practical use to researchers and policymakers only if it can be used for evidence reviews such as those conducted by the What Works Clearinghouse (WWC).

This presentation will summarize results from a paper that discusses issues to consider when structuring datasets and specifying impact models in the new RCT-YES software (www.rct-yes.com) so that design-based results can feed into formal evidence reviews such as those of the WWC. The paper uses RCT-YES because it is a convenient, free software package for estimating and reporting impacts using design-based methods across the full range of designs used in social policy research.

The paper offers suggestions for RCT-YES users to analyze data and report findings in ways that are more likely to meet review standards. It discusses key study design and analysis requirements common across most evidence reviews, focusing on sample attrition and baseline equivalence. It suggests how to structure analytic datasets so that RCT-YES calculates sample attrition in accordance with evidence review requirements, and how to use RCT-YES to output the baseline equivalence information that reviews commonly require when a study employs a QED or is an RCT with high attrition. The paper also discusses specifying particular covariates in the impact model that may be needed to meet evidence standards, along with covariate imputation considerations when using RCT-YES. Finally, it provides suggestions for specifying relevant outcome domains and for using tables generated by RCT-YES to report the statistics needed for evidence reviews.
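The two review requirements discussed above, sample attrition and baseline equivalence, can be illustrated with a generic sketch. This is illustrative Python only, not RCT-YES input, output, or API; the record fields (`treat`, `outcome`) are hypothetical. Reviews such as the WWC typically examine overall and differential attrition rates, and report baseline differences as a standardized mean difference using a pooled standard deviation.

```python
import math

def attrition(records):
    """Overall and differential attrition rates.

    records: list of dicts with 'treat' (1 = treatment, 0 = control)
    and 'outcome' (None when the outcome is missing).
    """
    def rate(group):
        return sum(1 for r in group if r["outcome"] is None) / len(group)
    treat = [r for r in records if r["treat"] == 1]
    control = [r for r in records if r["treat"] == 0]
    overall = rate(records)                      # share of randomized sample lost
    differential = abs(rate(treat) - rate(control))  # gap between arms
    return overall, differential

def baseline_effect_size(treat_vals, control_vals):
    """Standardized baseline mean difference using the pooled SD."""
    def mean(xs):
        return sum(xs) / len(xs)
    def var(xs):
        m = mean(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    nt, nc = len(treat_vals), len(control_vals)
    pooled_sd = math.sqrt(
        ((nt - 1) * var(treat_vals) + (nc - 1) * var(control_vals)) / (nt + nc - 2)
    )
    return (mean(treat_vals) - mean(control_vals)) / pooled_sd
```

A reviewer would compare the overall/differential attrition pair against a review's attrition boundary, and check that the baseline effect size falls within the review's equivalence threshold; the thresholds themselves vary by review and are not shown here.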

The paper finds that while RCT-YES was not designed specifically for evidence reviews, it can produce results that meet their requirements. The strength of RCT-YES lies in its flexibility and its ability to report impact estimates under a variety of assumptions; users can tailor their analyses to ensure they comply with particular evidence standards.