Panel Paper: Attaining Causal Validity Through Multiple Design Elements: A Retrospective Impact Evaluation of the New Heights Program for Expectant and Parenting Teens in Washington, DC

Thursday, November 2, 2017
McCormick (Hyatt Regency Chicago)

*Names in bold indicate Presenter

John Deke and Susan Zief, Mathematica Policy Research

This presentation will demonstrate how data from multiple administrative sources can be used to retrospectively assess program impacts, providing credible findings to inform program and policy decisions. Although the study is not based on a prospective randomized experiment, we obtain credible impact estimates by leveraging a natural experiment, employing two comparison groups, and using administrative data to construct both proximal and distal outcomes aligned with the program’s logic model.

New Heights

New Heights, a DC Public Schools program, supports parenting students’ educational attainment. The program’s key feature is placing a coordinator in every high school who is responsible for integrating four primary program components into the school day: advocacy, case management, workshops, and incentives.
The evaluation is based on a natural experiment created by an expansion of the program. In 2011, with the assistance of an Office of Adolescent Health (OAH) Pregnancy Assistance Fund grant, the program expanded to nine high schools in Washington, DC. The study combines 8 years of data (4 pre-expansion; 4 post-expansion) from the DC Department of Health (DCDOH), DC Public Schools (DCPS), and DC Department of Human Services (DCDHS). We merged birth records from DCDOH with enrollment data from DCPS to identify parenting females attending expansion schools. DCPS provided outcome data (attendance, credits, and graduation) for both parenting and non-parenting females. DCDHS data identified program participants.
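The record linkage described above can be sketched as a simple merge. The field names and values below are hypothetical placeholders (the actual DCDOH/DCPS files and their linkage keys are not described in this abstract); the sketch only illustrates flagging enrolled females who appear in the birth records as parenting.

```python
import pandas as pd

# Hypothetical birth-record extract (DCDOH): one row per recorded birth.
births = pd.DataFrame({
    "student_id": [101, 102, 103],
    "birth_year": [2012, 2013, 2012],
})

# Hypothetical enrollment extract (DCPS): females enrolled in study schools.
enrollment = pd.DataFrame({
    "student_id": [101, 102, 104, 105],
    "school": ["A", "A", "B", "B"],
    "school_year": [2012, 2013, 2012, 2013],
})

# Flag parenting students: enrolled females who match a birth record.
merged = enrollment.merge(
    births[["student_id"]].drop_duplicates(),
    on="student_id", how="left", indicator=True,
)
merged["parenting"] = merged["_merge"] == "both"
merged = merged.drop(columns="_merge")

print(merged[["student_id", "parenting"]])
```

A left merge with `indicator=True` keeps every enrolled student while marking which ones matched, so non-parenting females remain in the file to serve as the second comparison group.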

Through multiple design elements, we (1) circumvent self-selection bias, (2) minimize confounding bias, and (3) avert false inferences from multiple hypothesis testing. Using two comparison groups and a difference-in-differences analysis, we circumvent self-selection bias by estimating the impact of the offer of New Heights on the outcomes of all parenting females (not just New Heights participants). The treatment group includes all parenting females in study schools post-expansion. The comparison groups are (1) parenting females in study schools pre-expansion and (2) non-parenting females in study schools before and after expansion. First, we calculated the difference in outcomes for parenting females before and after expansion. Second, we calculated the same difference for non-parenting females; this second difference controls for the effects of other contemporaneous changes in DCPS, which minimizes confounding bias. The impact of New Heights is the difference in these two differences. We avert false inferences from multiple hypothesis testing by calculating impacts only on outcomes aligned with the intervention's logic model and by adjusting p-values for multiple comparisons within outcome domains. We verified that the findings are not idiosyncratic to our chosen analytic approach by recalculating impacts using 22 alternative approaches, and we also calculated the impact of the treatment on the treated.
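The two differences described above can be computed from the four cell means of a parenting-by-period table. The data below are purely illustrative placeholders, not study data; the sketch shows only the arithmetic of the difference-in-differences estimator.

```python
import pandas as pd

# Hypothetical student-year records; the outcome could be, e.g., attendance rate.
df = pd.DataFrame({
    "parenting": [1, 1, 1, 1, 0, 0, 0, 0],
    "post":      [0, 0, 1, 1, 0, 0, 1, 1],
    "outcome":   [0.60, 0.64, 0.75, 0.77, 0.85, 0.87, 0.88, 0.90],
})

# Mean outcome in each of the four (parenting, period) cells.
means = df.groupby(["parenting", "post"])["outcome"].mean()

# First difference: change for parenting females, pre- vs. post-expansion.
d_parenting = means[1, 1] - means[1, 0]

# Second difference: change for non-parenting females over the same period,
# which absorbs other contemporaneous changes in the district.
d_nonparenting = means[0, 1] - means[0, 0]

# Difference-in-differences: the estimated impact of the offer of the program.
did = d_parenting - d_nonparenting
print(round(did, 3))
```

In practice the same estimate would come from a regression of the outcome on a parenting indicator, a post-expansion indicator, and their interaction, which also permits covariates and clustered standard errors.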

Influential Findings

The impacts of New Heights are both statistically and substantively significant, and of immediate value to policymakers. We assess substantive significance by comparing the impact of New Heights to the gap in outcomes between parenting and non-parenting females prior to New Heights expansion. New Heights' impacts on attendance (p<0.01), credit accumulation (p<0.01), and graduation (p<0.10) reduced the gap between parenting youth and their non-parenting peers by 28 percent, 99 percent, and 50 percent, respectively.
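The gap-reduction metric above is the impact estimate expressed as a share of the pre-expansion gap. The numbers below are purely illustrative, not the study's estimates; the sketch only shows the arithmetic.

```python
# Hypothetical values for one outcome (e.g., an attendance rate):
impact = 0.07             # illustrative difference-in-differences impact estimate
pre_expansion_gap = 0.25  # illustrative non-parenting minus parenting mean, pre-expansion

# Share of the pre-expansion gap closed by the program.
gap_reduction = impact / pre_expansion_gap
print(f"{gap_reduction:.0%}")
```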