Panel Paper: Implementation and Impacts of a Professional Development and Coaching Intervention to Help Teachers Use Data

Thursday, November 7, 2019
Plaza Building: Concourse Level, Governor's Square 15 (Sheraton Denver Downtown)

*Names in bold indicate Presenter

Philip Gleason (1), Sarah Crissey (1), Gregory Chojnacki (1), Marykate Zukiewicz (1), Tim Silva (1), Sarah Costelloe (2) and Fran O'Reilly (3); (1) Mathematica, (2) Abt Associates, Inc., (3) Evidence-Based Education Research & Evaluation


Background
As part of school improvement efforts, educators have increasingly turned to the use of data to improve their instruction. However, evidence on the promise of data-driven instruction (DDI) is mixed: rigorous studies of specific DDI interventions have found positive, null, and negative impacts on teacher practice and student achievement (Carlson, Borman, and Robinson 2011; Cordray et al. 2012; Slavin et al. 2013; Konstantopoulos et al. 2013; Cavaluzzo et al. 2014; Konstantopoulos et al. 2016; West, Morton, and Herlihy 2016). This study used a random assignment design to examine impacts on teacher practice and student achievement from a DDI intervention focused on training teachers and school leaders to use data to improve instruction.

Intervention
The specific DDI intervention examined was implemented by an experienced provider of DDI-related services. The intervention involved a half-time data coach for each school and consultants who worked with the schools and their data coaches. Together, these individuals engaged in professional development and technical assistance for data coaches and school leaders, along with targeted school-level activities to support school leaders and teachers in data use.

Sample
The study recruited 12 districts that already administered summative and interim assessments but were not yet using DDI. The study sample included 102 schools in these 12 districts; 102 principals; 470 full-time 4th and 5th grade math and English/language arts teachers; and 12,535 4th and 5th grade students in study schools as of spring 2016.

Design
The study used a matched-pair, cluster-randomized experimental design, randomly assigning schools within each study district to treatment and control groups. Schools in the treatment group received a half-time data coach and DDI professional development from the intervention provider. Schools in the control group proceeded with business as usual. Each group of schools was followed from December 2014 through June 2016, and outcomes for each group were compared at the end of this period. The study drew on three types of data: (1) data coach interviews and weekly logs for treatment schools; (2) surveys of principals and teachers in all schools; and (3) student-level administrative data from all schools.
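
To make the assignment mechanism concrete, the sketch below shows one way a matched-pair, cluster-randomized assignment of this kind could be carried out in Python. It is a minimal illustration, not the study's actual procedure: the pairing variable (baseline_score), the data layout, and the handling of an unpaired school are all assumptions.

```python
import numpy as np
import pandas as pd

def assign_matched_pairs(schools: pd.DataFrame, seed: int = 2014) -> pd.DataFrame:
    """Pair schools within each district on a baseline measure, then randomly
    assign one school in each pair to the treatment group (illustrative only)."""
    rng = np.random.default_rng(seed)
    assigned = []
    for district, grp in schools.groupby("district_id"):
        # Sort by the (assumed) pairing variable so adjacent schools form matched pairs.
        grp = grp.sort_values("baseline_score").reset_index(drop=True)
        grp["pair_id"] = [f"{district}-{i // 2}" for i in range(len(grp))]
        for _, pair in grp.groupby("pair_id"):
            pair = pair.copy()
            if len(pair) == 2:
                # Randomly order the two schools; the first gets treatment.
                order = rng.permutation(2)
                pair["treatment"] = (order == 0).astype(int)
            else:
                # A district with an odd number of schools leaves one school
                # unpaired; assign it by a fair coin flip (an assumed rule).
                pair["treatment"] = int(rng.integers(0, 2))
            assigned.append(pair)
    return pd.concat(assigned, ignore_index=True)

# Illustrative usage with made-up data:
schools = pd.DataFrame({
    "school_id": range(1, 9),
    "district_id": [1, 1, 1, 1, 2, 2, 2, 2],
    "baseline_score": [610, 580, 640, 595, 600, 620, 615, 590],
})
print(assign_matched_pairs(schools)[["district_id", "pair_id", "school_id", "treatment"]])
```

Randomizing within district pairs in this way keeps treatment and control schools balanced on district and on the pairing variable, which is the rationale for a matched-pair design.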

Analysis
The study included both implementation and impact analyses. Implementation analyses described the activities in treatment schools and differences between treatment and control schools in data-related activities. The impact analysis examined the DDI intervention’s effects on intermediate outcomes and student achievement, in the full sample as well as in subgroups of students and schools. We used a linear regression model to compare outcomes of the treatment and control groups while accounting for the clustered, blocked random assignment design. For student outcomes, the model included covariates to improve the estimates’ precision and account for any baseline differences between students in treatment and control schools.
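
As a rough illustration of the kind of impact model described above (not the study's actual specification), the sketch below regresses a student outcome on a treatment indicator, matched-pair (block) fixed effects, and a baseline covariate, with standard errors clustered at the school level, the unit of random assignment. All variable names are assumptions introduced for the example.

```python
import pandas as pd
import statsmodels.formula.api as smf

def estimate_impact(students: pd.DataFrame):
    """Estimate the treatment effect with block fixed effects, a baseline
    covariate, and school-clustered standard errors (illustrative only)."""
    model = smf.ols(
        "test_score ~ treatment + C(pair_id) + baseline_score",
        data=students,
    )
    return model.fit(
        cov_type="cluster",
        cov_kwds={"groups": students["school_id"]},
    )

# Illustrative usage; column names are assumptions, not the study's variables:
# students = pd.read_csv("student_level_data.csv")
# result = estimate_impact(students)
# print(result.params["treatment"], result.bse["treatment"])
```

Clustering at the school level accounts for the fact that schools, not students, were randomized, while the block fixed effects and baseline covariate serve the precision and balance purposes described in the text.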

Findings
Study findings have been submitted in a final report to the U.S. Department of Education's Institute of Education Sciences. Because the report has not yet been released, we are unable to share the findings at this time. We expect the report to be released well before the deadline for the annual meeting.
