Panel Paper: Supporting Rigorous Evaluations: An Evaluator's Perspective

Saturday, November 5, 2016 : 10:55 AM
Columbia 6 (Washington Hilton)


Julie Edmunds, SERVE Center at UNC Greensboro


Objectives:  This paper presents an evaluator’s perspective on participation in the Investing in Innovation (i3) program. It describes how my understanding of the i3 program’s expectations developed and how those expectations shaped my interactions with the project developer. It also describes my experiences with the technical assistance provided as part of the project.

Perspective:  I am the evaluator for five i3 projects, the first of which was funded in 2011. That first project (like three of the subsequent projects) was a Validation grant, a mid-tier grant designed for projects with some existing evidence of effectiveness. I had a long-established working relationship with North Carolina New Schools, the project developer and grant recipient. I had been studying their early college work for many years using a randomized controlled trial, and this research served as part of the basis for the Validation grant application. I had also conducted relatively low-stakes program evaluations for them that were more descriptive in nature.

The i3 project represented a dramatic shift in expectations for the evaluation of federal grants: from more descriptive evaluations focused on implementation to more rigorous examinations of impact. The project directors’ meetings were very helpful in communicating to project staff that the evaluation was a critical component of the project. It was also important that the evaluation design was subject to review by an external organization against an established set of explicit criteria aligned to What Works Clearinghouse standards; this gave us external justification for our methodological demands on the client.

Throughout the project, we received technical assistance from an Abt Associates consultant. In my case, the TA provider and I had already worked together for six years on another project. As a result, the transition was easy, and we considered him from the beginning an extended member of the research team: a very smart person who could help us solve methodological challenges.

Because each of the i3 projects was a whole-school reform effort working in different settings, we faced methodological issues that were not clearly addressed by the What Works Clearinghouse standards or by the Abt review team’s interpretation of them. In these cases, we were able to argue for reconsideration of some of those standards. The final presentation will include examples of these negotiations.

Results:  The i3 expectations and the accompanying technical assistance have had a substantial effect on me and on the members of my research team, serving as strong professional development in high-quality impact evaluation. We have been able to integrate the i3 expectations into our work on other projects and into professional development and courses offered at our host institutions. Some of these expectations (such as the requirement for a clear logic model and clearly articulated fidelity-of-implementation measures) also improved the implementation of the project itself.