
Panel Paper: Student Achievement in Online Courses

Friday, November 13, 2015 : 2:30 PM
Tuttle Center (Hyatt Regency Miami)


Cassandra Hart, University of California, Davis; Brian Jacob, University of Michigan; and Susanna Loeb, Stanford University
The use of online courses is growing rapidly among K-12 students. Despite the burgeoning popularity of online options, little is known about how well online courses serve students in the K-12 sector. We explore how participation in online classes is associated with achievement in Florida, which has the largest virtual education sector in the country.

We draw on student-course-level data spanning 2005-06 through 2013-14 provided by the Florida Education Data Warehouse. These data allow us to identify both students in full-time online schools and students who take individual courses through online providers. We observe state standardized test scores, as well as student demographic and school characteristics.

A key concern in assessing the impacts of online classes on student outcomes is that student characteristics may affect both students' choice of course format and their academic performance. We use several strategies to address this concern.

First, we compare outcomes of virtual and face-to-face students using fixed-effects regression with a rich set of observable controls (e.g., time-varying student characteristics including prior test scores; school fixed effects as well as time-varying school characteristics; and course fixed effects) to mitigate bias associated with the self-selection of students into virtual classes. Initial results using these specifications (which closely follow Chingos & Schwerdt, 2014) suggest that among incoming cohorts of 9th graders, students who take virtual courses during their 9th and 10th grade years modestly outperform their peers on 10th grade math and reading tests (by roughly 0.04 to 0.08 standard deviations, depending on the specification). This holds whether we define virtual course-taking as having taken any course virtually, or whether we flag virtual course-taking based only on courses closely aligned with the standardized tests (e.g., for the math (English Language Arts) test, virtual course-taking reflects whether a student took Algebra I (English I) online). Future iterations of this paper will add student fixed effects and student-subject fixed effects to further limit bias in our estimates. A stylized version of the specification is sketched below.
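For concreteness, a stylized version of this fixed-effects specification (the notation and exact set of controls here are illustrative, not the paper's final model) can be written as

    A_{ist} = \beta \, Virtual_{ist} + X_{it}' \gamma + Z_{st}' \delta + \theta_s + \lambda_c + \varepsilon_{ist}

where A_{ist} is the 10th grade test score of student i at school s in year t; Virtual_{ist} indicates virtual course-taking in 9th or 10th grade; X_{it} are time-varying student characteristics (including prior test scores); Z_{st} are time-varying school characteristics; \theta_s and \lambda_c are school and course fixed effects; and \beta, the coefficient of interest, captures the achievement difference associated with virtual course-taking net of these controls.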

Our second estimation strategy leverages a natural experiment that incentivized schools to enroll students in virtual courses, without student input, in order to comply with a 2002 state constitutional amendment limiting class sizes in Florida schools. The amendment mandated maximum numbers of students in "core classes." [1] In the 2010-11 school year, each individual classroom had to meet the required ratios for the first time. However, students enrolled in virtual classes were not counted for the purpose of class-size determinations. We use the combined variation in core-course status and year (pre- vs. post-2010-11) to instrument for online course-taking, and use these instrumented values to predict student achievement on tests and other achievement measures (e.g., course grades and progression), as sketched below.
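A stylized two-stage least squares version of this design (again, the notation is illustrative rather than the paper's final specification) would be

    First stage:  Virtual_{icst} = \pi (Core_c \times Post_t) + X_{it}' \gamma_1 + \theta_s + \lambda_c + \tau_t + \nu_{icst}
    Second stage: Y_{icst} = \beta \widehat{Virtual}_{icst} + X_{it}' \gamma_2 + \theta_s + \lambda_c + \tau_t + \varepsilon_{icst}

where Core_c indicates that course c is a core course subject to the class-size caps, Post_t indicates school years from 2010-11 onward, and Y_{icst} is a test score, course grade, or progression outcome. Because the main effects of Core_c and Post_t are absorbed by the course and year fixed effects, identification comes from schools' differential incentive, after 2010-11, to shift students in core courses into virtual sections.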

These analyses should contribute considerably to our knowledge about the causal effects of online course-taking on achievement.



[1] Core subject areas include courses measured by state examinations at any grade level, or required for graduation or promotion.