Panel Paper: The Effect of Providing Teacher Value-Added Information to Schools

Thursday, November 8, 2018
Jefferson - Mezz Level (Marriott Wardman Park)


Hwanoong Lee, Michigan State University


Recently, teacher value-added scores have become an increasingly common information source for evaluating teachers. Many school districts and states across the country have begun to produce this information and provide it to schools. In this paper, I conduct a two-pronged empirical analysis of the impact of providing teacher value-added information on student achievement by examining policy changes in two urban districts of North Carolina. Because Guilford County Schools (hereafter, Guilford) provides value-added information to all potential employers within the district while Winston-Salem/Forsyth County Schools (hereafter, Winston-Salem) releases this information to the current employer only, comparing these two natural experiments allows us to understand the effect of providing performance information in different settings.

The first empirical analysis evaluates the mean effects of providing value-added information on students' academic achievement. Across a wide range of specifications chosen to minimize endogeneity, I find that adopting value-added information raises student math achievement only in Guilford, where the performance information is provided to all potential employers.

The overall achievement effects that I document for Guilford, however, do not necessarily imply that teachers in this district responded to the new information by increasing their effort, as principals may have used this information strategically to lay off less effective teachers. To determine the factors behind the different results in the two districts, the second component of my analysis examines whether providing value-added information to teachers changes the teachers' impact on student achievement gains. Using a value-added model that includes student-, classroom-, and school-level controls, I track the impact of having a teacher with a one standard deviation higher value-added score on student achievement over time. I find evidence that when teachers were informed of their value-added scores, the performance gap between high- and low-scoring teachers declined in Guilford only.

My results clearly show that low-value-added teachers in Guilford had higher test score gains after the district adopted value-added information, while low-value-added teachers in Winston-Salem did not. I argue that these results should not be interpreted as indicating that low-value-added teachers in both districts responded to their value-added scores but that teachers in Winston-Salem simply did not know how to improve student achievement. One alternative explanation, for example, is that the demographic composition of teachers differs between the two districts. However, I show that the estimated effects of having a teacher with a one standard deviation higher value-added score on student achievement gains in the two districts, conditional on various teacher characteristics, are similar to the baseline estimates.

My preferred interpretation, therefore, is that the individual incentive to respond to value-added information was weak in Winston-Salem: Guilford allowed principals to access all teachers' value-added reports within the district, while Winston-Salem permitted principals to access the reports of teachers in their own schools only. Teachers in Winston-Salem may thus have regarded value-added information as private information and lacked a strong incentive to respond to it. This finding has implications for the effectiveness of providing performance information as a policy tool to improve teacher performance.
