Panel: Building Evidence on What Works in Teen Pregnancy Prevention
(Family and Child Policy)

Thursday, November 2, 2017: 8:30 AM-10:00 AM
McCormick (Hyatt Regency Chicago)

*Names in bold indicate Presenter

Panel Organizer:  Randall Juras, Abt Associates, Inc.
Panel Chair:  Kathleen McCoy, U.S. Department of Health & Human Services
Discussant:  Lisa Trivits, U.S. Department of Health & Human Services

OAH Efforts to Build the Evidence Base on Teen Pregnancy
Amy Farb, U.S. Department of Health & Human Services

Expanding the Evidence Base through Replication: Findings from the Multi-Site Teen Pregnancy Prevention Replication Study
Meredith Kelsey1, Kimberly Francis1 and Jean Layzer2, (1)Abt Associates, Inc., (2)Belmont Research Associates

Meta-Analysis of Federally-Funded Teen Pregnancy Prevention Programs
Randall Juras1, Emily Tanner-Smith2, Meredith Kelsey1, Mark W. Lipsey2 and Jean Layzer3, (1)Abt Associates, Inc., (2)Vanderbilt University, (3)Belmont Research Associates

Beginning in 2010, the U.S. Department of Health and Human Services (HHS) invested considerable resources in expanding the evidence base for teen pregnancy prevention interventions. First, HHS identified what works through a systematic evidence review and devoted the majority of teen pregnancy prevention (TPP) grant funding through the Office of Adolescent Health (OAH) to supporting the implementation of programs with evidence of effectiveness. Second, HHS generated new evidence through high-quality impact evaluations of many of these grant-funded programs.

The studies sponsored by OAH beginning in 2010 (more than 40 rigorous evaluations of evidence-based programs, of program models with innovative and untested additions, and of previously untested approaches) represent a dramatic, and purposeful, expansion of the evidence base. This expansion was motivated in part by the need for independent evaluations of program effectiveness under real-world conditions and with broader populations. The large number of studies also offers a remarkable opportunity to learn what works for whom by examining variation in program effectiveness across studies and linking it to variation in program and participant characteristics.

Presenters in this panel will describe lessons learned from three distinct approaches to exploring cross-study variation: (1) incorporating multiple study sites into an impact analysis; (2) systematically reviewing study findings; and (3) conducting meta-regression using both aggregate study-level data and individual participant data. To provide context for these presentations, a senior evaluation specialist at OAH will discuss OAH's overall approach to ensuring evaluation rigor, consistency in outcome measures, and transparency in releasing evaluation findings, without which these cross-study efforts would not have been possible.
