Panel Paper: A Mixed-Methods Approach for Measuring Policy Implementation and Impact Using Grantee Reporting Documents

Friday, November 3, 2017
Dusable (Hyatt Regency Chicago)


Miriam Jacobson1, Andrew MacDonald1, Astrid Hendricks1, Craig Kinnear2 and Anthony Nerino2, (1)ICF International, Inc., (2)Corporation for National and Community Service


Public agencies targeting poverty in the United States frequently require grantees to submit regular progress reports describing how they are implementing their funded projects. These reports typically include narrative descriptions of project implementation, outcomes, and barriers to achieving results. However, the qualitative data contained in these reports are often overlooked or underutilized in evaluations of agencies' organizational policies, especially policies that may affect how funding is allocated.

Common concerns about qualitative grantee report data are that reports may be inconsistently completed or biased, particularly because they are written by the grantees themselves, and that the qualitative data they contain are not conducive to rigorous evaluation. These concerns often mean that grantee reports are overlooked when public agencies undertake evaluations. In fact, there is a wealth of literature on mixed-methods analysis that provides techniques for rigorously analyzing this type of qualitative data and using it as a valuable complement to quantitative data in evaluating organizational policy. Particularly when evaluation time or resources are limited, qualitative reports are a readily available source of data for assessing barriers and facilitators to organizational policy effectiveness, identifying unintended outcomes, and capturing local stakeholder perspectives. This paper will discuss how mixed-methods analysis of grantee documents can provide strong evidence to help inform evaluation and organizational policy decisions.

This paper will highlight rigorous strategies for grantee report analysis used in a study conducted with the Corporation for National and Community Service's (CNCS) VISTA program. CNCS is a federal agency that helps millions of Americans improve the lives of their fellow citizens through service. Its VISTA model uses volunteer service to strengthen the organizational capacity of nonprofit organizations and public agencies fighting poverty in the areas of education, health, and economic development. Like many government funders, CNCS monitors the work of its grantees using various progress-tracking tools, such as performance and activity reports.

The study examined the quality of narrative data in grantee reports, the types of project activities and impacts reported, and the feasibility of an approach for analyzing report data to evaluate the VISTA model. It used a pre-post design to examine differences in report quality and in the types of impacts reported before and after a change in VISTA grantee reporting policy, across a seven-year period. Mixed-methods analyses of narrative data in 90 grantees' reports yielded quantitative and qualitative findings about the reports' content and quality, as well as differences by grantee characteristics.
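The paper does not prescribe a specific analytic workflow, but one common way to combine coded narrative data with a pre-post design is to quantify analyst-assigned codes and test whether their distribution differs across reporting periods. The sketch below is a minimal, hypothetical illustration in Python; the quality categories, ratings, and records are placeholders for exposition only, not the study's actual codes, data, or results.

    # Minimal sketch (hypothetical data): compare coded report-quality ratings
    # before and after a reporting policy change.
    import pandas as pd
    from scipy.stats import chi2_contingency

    # Each row is one grantee report with an analyst-assigned quality code and
    # the reporting period relative to the policy change (placeholder values).
    reports = pd.DataFrame({
        "period": ["pre"] * 6 + ["post"] * 6,
        "quality": ["low", "low", "medium", "medium", "high", "medium",
                    "medium", "high", "high", "high", "medium", "high"],
    })

    # Cross-tabulate quality codes by period and test whether the distribution
    # of ratings differs before vs. after the policy change.
    table = pd.crosstab(reports["period"], reports["quality"])
    chi2, p_value, dof, expected = chi2_contingency(table)

    print(table)
    print(f"chi-square = {chi2:.2f}, p = {p_value:.3f}")

In practice, the quantitative comparison would sit alongside qualitative synthesis of the narrative themes themselves, which is where the mixed-methods value of the report data lies.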

In our paper, we discuss how the study used a multi-stage, collaborative approach to ensure that findings from the analysis were accurate and useful for understanding VISTA's overall impact and for informing policy about VISTA programs and measurement. We will show how the results obtained from these data compare with those of prior studies of VISTA performance that relied on alternative methods, such as surveys. We will also share broader implications, particularly how grantee reporting documents could be used more effectively as part of an organizational evaluation and policy agenda.