This study assesses the relationship between student test score-based measures of teacher effectiveness (often called “Value Added Models,” or VAMs) and an observation-based measure of teacher effectiveness that is part of the Los Angeles Unified School District’s (LAUSD) Initial Implementation Phase (IIP) of a new multiple-measure teacher evaluation system (MMTES), the Educator Growth and Development Cycle (EGDC). Using value-added measures and classroom observation scores from the approximately 200 teachers who enrolled in the district’s IIP and for whom value-added measures were available, we answer two questions about the relationship between these measures and about the specific instructional practices captured in the classroom observations. Specifically, we ask:
- When implemented as part of a standards-based, multiple-measure teacher evaluation system, are value-added measures of teacher effectiveness associated with observation-based measures of effectiveness?
- Do specific classroom practices measured by a district-generated observation rubric capture differences in teacher effectiveness, as measured by value-added scores?
We find significant, positive relationships between the value-added measure and the observation-based measure of teacher effectiveness, of similar magnitude to those found in studies conducted in research settings. In addition, we show that teachers with higher value-added measures of effectiveness in both math and ELA score particularly well on ratings of multiple teaching practices drawn from the observational measure. However, the relationships between test score-based and observation-based measures of effectiveness differ depending on the observation cycle from which observation scores are taken (many of the new MMTES include two or more observation cycles) and on the number of elements of the observation rubric on which teachers are observed. Given these results, we discuss the decisions that district administrators and school boards must make when implementing an MMTES in a district context rather than in an experimental setting, decisions that have significant implications for teachers' evaluation outcomes. Our results provide new evidence based on actual implementation of an MMTES in LAUSD and may help inform policy as districts across the country work to implement expanded evaluation systems.