Panel Paper: Learning ‘What Works’ From Multi-Site Experiments By Combining Natural Site-Level Variation With Randomized Individual-Level Variation in Program Features

Saturday, November 9, 2013 : 9:45 AM
Mayfair Court (Westin Georgetown)


Laura Peck, Stephen Bell and Alan Werner, Abt Associates, Inc.
“What Works?” questions are frequently addressed in randomized impact evaluations by relating site-to-site differences in impacts to site-to-site differences in program features. Occasionally, within-site randomization between program features illuminates this question experimentally. The DHHS-funded Health Profession Opportunity Grants (HPOG) Program impact evaluation provides the opportunity to combine both methodologies to get “inside the black box” of what makes career pathways training programs successful. HPOG is a demonstration program providing health-sector training in a career pathways framework to people otherwise disadvantaged in the labor market. In multiple locations nationwide, the evaluation is randomizing eligible program applicants to a treatment group offered access to HPOG training and a control group not offered that opportunity. In addition, for a subset of grantees, three groups of eligible applicants will be formed at random, including a group assigned to an “enhanced” version of the HPOG program with added intervention features. The proposed paper will describe the strengths of the multi-site impact study for tackling this question and introduce new analysis methodologies that capitalize on both natural and randomized variation in program features. It will describe these design innovations, outline the study’s analytic approach to rigorously answering the “What Works?” question, and advance the field of inside-the-black-box impact analysis generally.
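
To make the two-pronged design concrete, the following sketch (in Python, with entirely hypothetical data, site counts, and effect sizes; it is not the study’s actual estimator) illustrates both pieces: regressing experimentally estimated site impacts on naturally varying site-level program features, and using the three-arm randomization to contrast an enhanced program against the standard one directly.

    # Illustrative sketch only -- all names, sample sizes, and effects are hypothetical.
    import numpy as np

    rng = np.random.default_rng(0)

    def site_impact(y_treat, y_control):
        """Experimental impact estimate for one site: difference in mean
        outcomes between randomized treatment and control groups."""
        return y_treat.mean() - y_control.mean()

    # (a) Cross-site analysis: relate site impacts to program features
    #     that vary naturally across grantees.
    n_sites = 20
    features = rng.binomial(1, 0.5, size=(n_sites, 2))   # e.g., intensive advising, stipends
    true_effect = 0.5 + features @ np.array([0.3, 0.1])  # assumed data-generating process
    impacts = np.array([
        site_impact(rng.normal(true_effect[s], 1, 200),  # treatment group outcomes
                    rng.normal(0.0, 1, 200))             # control group outcomes
        for s in range(n_sites)
    ])
    X = np.column_stack([np.ones(n_sites), features])
    coefs, *_ = np.linalg.lstsq(X, impacts, rcond=None)
    print("Cross-site feature coefficients (non-experimental):", coefs[1:])

    # (b) Within-site three-arm contrast: control vs. standard vs. enhanced.
    y_control  = rng.normal(0.0, 1, 200)   # no HPOG offer
    y_standard = rng.normal(0.5, 1, 200)   # standard HPOG program
    y_enhanced = rng.normal(0.8, 1, 200)   # HPOG with added intervention features
    print("Standard vs. control impact:", site_impact(y_standard, y_control))
    print("Enhanced vs. standard (effect of added features):",
          site_impact(y_enhanced, y_standard))

In this sketch, the cross-site coefficients are non-experimental, since grantees choose their own program features; the enhanced-versus-standard contrast is experimental, since assignment to the added features is randomized within site. Combining the two sources of variation is what lets the analysis address the “What Works?” question more rigorously than either alone.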