Panel Paper: The Effects of a Scheduling Intervention on Student Performance in Online Postsecondary Education

Friday, November 3, 2017
Haymarket (Hyatt Regency Chicago)


Rachel Baker1, Brent Evans2, Qiujie Li3, Bianca Cung3 and Di Xu3, (1)Stanford University, (2)Vanderbilt University, (3)University of California, Irvine


Numerous studies demonstrate that positive time management practices are related to increased academic performance and success in higher education (Hartwig & Dunlosky, 2012; Macan et al., 1990; Van Den Hurk, 2006). Time management is especially critical in asynchronous online coursework (Elvers, Polzella, & Graetz, 2003; Guàrdia, Maina, & Sangrà, 2013; Michinov et al., 2011; Roper, 2007). Students who do not plan in advance when they will watch lecture videos and work on assignments are at elevated risk of letting other priorities crowd out their online coursework, leading to poor performance and reduced persistence.

We test the effects of encouraging better time management by nudging students in online courses to schedule, in advance, when they will complete their coursework. Our treatment is a soft precommitment device that asks students to commit to a day and time when they will watch lectures or complete course assignments. This paper reports on a series of randomized controlled trials of this commitment device in different online course contexts: free Massive Open Online Courses (MOOCs) that do not bear credit and are open to anyone with an internet connection, and credit-bearing courses for students enrolled in institutions of higher education.

By measuring within-course behaviors, such as when students watch lecture videos and complete assignments, we observe how the suggestion to schedule affects not only course performance but also levels of procrastination and cramming. The detailed course interaction data available in online courses enable robust descriptive analysis of how students use their time, which complements the results of the experimental scheduling intervention. We also use students' responses to pre-course surveys to test for heterogeneous treatment effects across students with different self-reported levels of time management skill. Finally, we report on the effects of withdrawing the scheduling treatment to assess whether the intervention improves time management practices that persist after the prompts stop. The long-term goal is an intervention that teaches time management skills students continue to apply without repeated explicit instructions to do so.

Preliminary results from a subset of the RCTs suggest that the scheduling intervention improves early course performance but has negative effects on end-of-course assignment performance and final course grades. We find no evidence that the treatment directly alters procrastination or cramming, two hypothesized mechanisms of its effect.

A better understanding of the efficacy of low-cost, scalable interventions such as our scheduling nudge is crucial for policymakers and practitioners weighing future investments in student success in higher education.