Panel Paper: Online Course-Taking and Student Performance in High Schools

Saturday, November 5, 2016: 2:25 PM
Columbia 6 (Washington Hilton)


Cassandra Hart (1), Brian Jacob (2), Susanna Loeb (3), Daniel Berger (2), and Demetra Kalogrides (3); (1) University of California, Davis; (2) University of Michigan; (3) Stanford University


While the use of online courses has grown considerably among K-12 students, little is known about how those courses affect student learning. Most studies that examine these questions find somewhat negative outcomes for students who take courses virtually; however, these studies tend to be limited to a single type of course (e.g., remedial Algebra; Heppen et al., 2013) or a single type of virtual school (e.g., virtual charter schools; Woodworth et al., 2015). Our study explores how online course-taking affects student performance across a greater range of courses and in a variety of types of virtual schools. We use multiple measures of performance, including contemporaneous course performance, enrollment in follow-on courses, follow-on course grades, and, for some subjects, standardized test scores.

Data and Sample

We use administrative data from the Florida Department of Education. These data include grades for high school students and test scores for students in the grades and subjects covered by state standardized tests. We can also identify the instructional institution that receives credit for offering each course, which allows us to observe when a virtual school provides instruction to a student for a particular class even when that student attends a brick-and-mortar school for most of their classes.

Our main sample will consist of 9th and 10th graders enrolled from 2008-09 through 2011-12 in at least one of three target courses: Algebra I, Biology, and Spanish I. We will also draw on data from earlier years to characterize students' performance on 8th-grade standardized tests, and on data from the 2012-13 school year to characterize students' outcomes in follow-on classes.
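A minimal sketch of this sample restriction, assuming a hypothetical course-level file whose column names (grade_level, school_year, course_name) are illustrative rather than the actual Florida Department of Education layout:

import pandas as pd

# Hypothetical course-level records; column names are illustrative only.
courses = pd.read_csv("course_records.csv")

TARGET_COURSES = {"Algebra I", "Biology", "Spanish I"}
SAMPLE_YEARS = {"2008-09", "2009-10", "2010-11", "2011-12"}

# 9th and 10th graders enrolled in at least one target course, 2008-09 through 2011-12.
main_sample = courses[
    courses["grade_level"].isin([9, 10])
    & courses["school_year"].isin(SAMPLE_YEARS)
    & courses["course_name"].isin(TARGET_COURSES)
]

# Earlier years supply 8th-grade test scores; 2012-13 supplies follow-on outcomes.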

Methods and Measures

We use fixed-effects analyses to compare outcomes for students who take a given course virtually with outcomes for students who take it face-to-face. Our main analyses rely on home-institution-by-year fixed effects, where home institutions are the brick-and-mortar schools at which students take most of their classes. For each class we study, we compare outcome Y for student i in home institution h in year t based on whether the class was taken online.

We control for student characteristics StudentChar (including subsidized lunch receipt, disability status, limited English proficiency status, race, sex, and 8th-grade test scores). Fixed effects π are included for grade (g), year (t), and home institution-by-year; the home-institution-by-year fixed effects absorb all time-invariant and time-varying school-level characteristics. Outcomes will include scores on standardized tests (for Algebra and Biology), contemporaneous course performance (course passing), and student outcomes in follow-on courses.
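A stylized version of this specification, written under the assumption of a linear-in-parameters model (the notation mirrors the variables named above; Online_{iht} is a label introduced here for the virtual-course indicator, and the paper's exact functional form may differ):

Y_{iht} = \beta \, Online_{iht} + StudentChar_{i}' \gamma + \pi_{g} + \pi_{t} + \pi_{ht} + \varepsilon_{iht}

where Online_{iht} equals one if student i took the focal course virtually, \pi_{ht} are the home-institution-by-year fixed effects, and \beta is the parameter of interest; the separate year effects \pi_{t} are absorbed once \pi_{ht} is included.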

We will examine both enrollment in follow-on courses and performance in those courses. For instance, we will estimate the probability that a student who took Spanish I online continues to Spanish II, relative to same-school peers who took Spanish I face-to-face in the same year. Among students who do enroll in Spanish II, we will then examine their performance in that course.
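As an illustration only, this follow-on enrollment comparison could be estimated as a linear probability model with home-institution-by-year fixed effects; the sketch below uses statsmodels and hypothetical file, variable, and column names (spanish1_sample.csv, enrolled_spanish2, online_spanish1, and so on), not the project's actual data or code.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical student-level file: one row per Spanish I taker, with indicators
# for taking Spanish I online and for enrolling in Spanish II the next year.
# Column names are illustrative, not the actual Florida DOE schema.
spanish1_sample = pd.read_csv("spanish1_sample.csv")

# Linear probability model: grade fixed effects, home-institution-by-year fixed
# effects (C(school_by_year)), and the student controls described above.
model = smf.ols(
    "enrolled_spanish2 ~ online_spanish1 + frl + disability + lep"
    " + C(race) + female + test8_math + test8_read"
    " + C(grade) + C(school_by_year)",
    data=spanish1_sample,
)

# Cluster standard errors by home institution.
result = model.fit(cov_type="cluster", cov_kwds={"groups": spanish1_sample["home_school"]})
print(result.params["online_spanish1"])  # difference in Spanish II enrollment rates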

We will explore how results differ by student characteristics, by characteristics of home institutions, and by the mode of instruction for follow-on courses. These results should substantially improve our knowledge of how virtual courses affect learning for high school students.