Panel Paper: Addressing Missing Data of Economic Evaluations in a Partnership-Based Educational Program

Saturday, November 10, 2018
Hoover - Mezz Level (Marriott Wardman Park)

*Names in bold indicate Presenter

Viviana Rodriguez, A. Brooks Bowden, Maya Escueta and Atsuko Muroga, Columbia University

The Minnesota Reading Corps (MRC) program is a statewide initiative that aims to foster emergent reading skills in children to ensure that they become proficient readers by the end of grade 3. MRC and its host organization, ServeMinnesota Action Network (ServeMN), partner with AmeriCorps to bring volunteers into preschool (Pre-K) classrooms to provide evidence-based literacy enrichment and tutoring services. In 2013-2014, NORC, the University of Chicago-based research center, conducted an impact evaluation of the MRC Pre-K program (Diaconis et al., 2015). The evaluation found that the MRC Pre-K program had significant effects ranging from 0.40 to 0.72 SD on IGDI outcome measures of emergent literacy for 4- and 5-year-olds.

Rigorous economic evaluations of educational interventions provide important information about the resources necessary to implement a program. Such evaluations bridge the gap between knowledge of program implementation and program impact by identifying the resources used to generate the outcomes of interest. As such, cost analyses are intended to inform policymakers weighing decisions to replicate or scale up a program, or trade-offs imposed by limited resources. The purpose of this study is to estimate the costs of providing the MRC Pre-K program associated with the impact measured by the 2013-2014 evaluation, and to assess the value that the partnership with AmeriCorps brings into the classroom through MRC.

Cost estimates are derived retrospectively using the ingredients method, a widely accepted approach to examining costs (Levin, McEwan, Belfield, Bowden & Shand, 2018). The method requires detailed data that follow the program's theory of change, including levels of service and sources of variation in impacts. Typically, ingredients data are collected via interviews, surveys, and classroom observations. Program documents, administrative records, and qualitative research (e.g., process or implementation evaluations) provide useful data, but these sources, when used retrospectively, are often incomplete. To address this challenge, we estimate average site costs by employing a Monte Carlo simulation strategy to explore site-level variation in the program wherever data are missing.
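The simulation strategy described above can be sketched in miniature. In this hypothetical example, sites with missing ingredient quantities are assigned draws from a distribution fit to the sites with complete data, and repeated draws yield a distribution of average per-pupil costs. All numbers (tutor hours, hourly cost, pupils per site) are illustrative placeholders, not figures from the study:

```python
import random
import statistics

# Illustrative per-site tutor hours for sites with complete data;
# the remaining sites are treated as missing and simulated below.
observed_tutor_hours = [120, 150, 135, 140]  # hypothetical values
hourly_cost = 18.50                          # hypothetical loaded cost per hour
pupils_per_site = 20                         # hypothetical enrollment

random.seed(42)

def simulate_missing_site():
    """Draw a plausible value for a site with missing data from a
    normal distribution fit to the observed sites."""
    mu = statistics.mean(observed_tutor_hours)
    sigma = statistics.stdev(observed_tutor_hours)
    return random.gauss(mu, sigma)

def average_cost_per_pupil(n_missing_sites=3, n_draws=5_000):
    """Monte Carlo estimate of the average per-pupil ingredient cost
    across observed and simulated sites, with its simulation spread."""
    estimates = []
    for _ in range(n_draws):
        hours = observed_tutor_hours + [
            simulate_missing_site() for _ in range(n_missing_sites)
        ]
        site_costs = [h * hourly_cost / pupils_per_site for h in hours]
        estimates.append(statistics.mean(site_costs))
    return statistics.mean(estimates), statistics.stdev(estimates)

mean_cost, sd_cost = average_cost_per_pupil()
```

The spread of the simulated estimates (`sd_cost`) indicates how sensitive the average cost figure is to the missing sites, which is the kind of uncertainty a retrospective ingredients analysis needs to report.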

The results provide site-level descriptions of resources. On average, the program costs $1,210 per pupil. We couple this information with the previously estimated effectiveness to provide expected estimates of cost-effectiveness as well as recommendations for prospective evaluations. Additionally, analyses of the distribution of costs across those who pay for them suggest that the average cost per student per site for resources provided or financed by the school ranges from $210 to $680, or approximately 25% of the total cost per student. Therefore, the partnership between MRC and AmeriCorps leverages a substantial amount of external resources into schools.
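Coupling the reported cost with the reported effects can be illustrated with a simple cost-effectiveness ratio, cost per pupil divided by effect size, giving dollars per standard deviation of impact. This is only an arithmetic sketch using the figures stated above, not the study's full cost-effectiveness analysis:

```python
# Cost-effectiveness ratio: dollars of program cost per SD of impact.
cost_per_pupil = 1210.0                 # reported average cost per pupil
effects = {"low": 0.40, "high": 0.72}   # reported range of IGDI effect sizes

ce_ratios = {label: cost_per_pupil / es for label, es in effects.items()}
# At the low end of the effect range: 1210 / 0.40 = $3,025 per SD
# At the high end: 1210 / 0.72 ≈ $1,681 per SD
```

A lower ratio means the program buys a standard deviation of impact more cheaply, so the high-effect estimate is the more favorable bound.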