René F. Kizilcec, Justin Reich, Michael Yeomans, Christoph Dann, Emma Brunskill, Glenn Lopez, Selen Turkay, Joseph Jay Williams, Dustin Tingley.
Abstract
Online education is rapidly expanding in response to rising demand for higher and continuing education, but many online students struggle to achieve their educational goals. Several behavioral science interventions have shown promise in raising student persistence and completion rates in a handful of courses, but evidence of their effectiveness across diverse educational contexts is limited. In this study, we test a set of established interventions over 2.5 y, with one-quarter million students, from nearly every country, across 247 online courses offered by Harvard, the Massachusetts Institute of Technology, and Stanford. We hypothesized that the interventions would produce medium-to-large effects as in prior studies, but this is not supported by our results. Instead, using an iterative scientific process of cyclically preregistering new hypotheses in between waves of data collection, we identified individual, contextual, and temporal conditions under which the interventions benefit students. Self-regulation interventions raised student engagement in the first few weeks but not final completion rates. Value-relevance interventions raised completion rates in developing countries to close the global achievement gap, but only in courses with a global gap. We found minimal evidence that state-of-the-art machine learning methods can forecast the occurrence of a global gap or learn effective individualized intervention policies. Scaling behavioral science interventions across various online learning contexts can reduce their average effectiveness by an order of magnitude. However, iterative scientific investigations can uncover what works where for whom.
Keywords: behavioral interventions; online learning; scale
Year: 2020 PMID: 32541050 PMCID: PMC7334459 DOI: 10.1073/pnas.1921417117
Source DB: PubMed Journal: Proc Natl Acad Sci U S A ISSN: 0027-8424 Impact factor: 11.205
Fig. 1. Average student activity (count of course platform events) in the first 3 wk after exposure to each intervention. Points show covariate-adjusted means on a logarithmic scale (to match the log-transformed outcome in the regression model) with cluster-robust SE bars.
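The Fig. 1 caption implies a regression of a log-transformed activity count on treatment assignment, with covariate adjustment and standard errors clustered by some grouping unit. Below is a minimal sketch of that kind of estimator: plain OLS with a CR0 cluster-robust sandwich covariance, run on simulated data. The variable names, the course-level clustering, and the simulated effect are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def ols_cluster_se(X, y, clusters):
    """OLS coefficients with CR0 cluster-robust standard errors."""
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = XtX_inv @ (X.T @ y)
    resid = y - X @ beta
    # "Meat" of the sandwich: sum of within-cluster score outer products.
    k = X.shape[1]
    meat = np.zeros((k, k))
    for g in np.unique(clusters):
        idx = clusters == g
        score = X[idx].T @ resid[idx]
        meat += np.outer(score, score)
    cov = XtX_inv @ meat @ XtX_inv  # sandwich covariance
    return beta, np.sqrt(np.diag(cov))

# Hypothetical data: treatment effect on a log-transformed activity
# count, with students clustered by course (all values simulated).
rng = np.random.default_rng(0)
n = 400
course = rng.integers(0, 20, size=n)        # cluster id
treat = rng.integers(0, 2, size=n)          # 0/1 assignment
activity = rng.poisson(5 + 2 * treat)       # raw platform event counts
y = np.log1p(activity)                      # log-transformed outcome
X = np.column_stack([np.ones(n), treat])    # intercept + treatment
beta, se = ols_cluster_se(X, y, course)
```

Here `beta[1]` is the treatment effect on the log scale and `se[1]` its cluster-robust standard error; a full replication would also include the paper's covariates in `X`.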
Comparison of intervention results from prior research and this research for comparable interventions and subgroups of students
| Intervention | Subpopulation | Prior result | Present result |
| --- | --- | --- | --- |
| Plan-making (long-term) | Committed English-fluent students | β = 3.9 pp | Year 1: β = 0.19 pp; Year 2: β = −0.23 pp |
| Value-relevance | Students in less-developed countries in courses with a global gap | Study 1: β = 3.4 course activities; Study 2: β = 24 pp | Year 1: β = 2.79 pp; Year 2: β = 2.74 pp |
| Mental contrasting with implementation intentions | Students in individualistic countries | Study 1: β = 1.8 pp; Study 2: β = 3.9 pp | Year 2: β = 0.25 pp |
Note that there are several differences between the prior and present research in terms of the implementation of intervention instructions and sample exclusion criteria. Effects denote percentage point (pp) increases in course completion except where noted.
Fig. 2. Average course completion rate in all waves in the value-relevance intervention and control condition by student context (more vs. less developed country) and course context (with vs. without global achievement gap). Bars show covariate-adjusted means with cluster-robust SE bars.
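Fig. 2 compares completion rates across a 2 × 2 design: student context (more vs. less developed country) crossed with course context (with vs. without a global gap), within each treatment condition. The unadjusted version of that comparison can be sketched as below on simulated data; the +3 pp effect size, the 10% baseline rate, and all variable names are illustrative assumptions, not the paper's numbers.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
treated = rng.integers(0, 2, size=n)      # value-relevance vs. control
less_dev = rng.integers(0, 2, size=n)     # less-developed-country student
gap_course = rng.integers(0, 2, size=n)   # course with a global gap
# Simulate a treatment effect that exists only for less-developed-country
# students in courses with a global achievement gap (+3 pp over a 10% base).
p_complete = 0.10 + 0.03 * (treated * less_dev * gap_course)
completed = rng.random(n) < p_complete

def completion_effect(mask):
    """Treated-minus-control completion rate (in pp) within a subgroup."""
    t = completed[mask & (treated == 1)].mean()
    c = completed[mask & (treated == 0)].mean()
    return 100 * (t - c)

dev = less_dev == 1
gap = gap_course == 1
print(completion_effect(dev & gap))    # cell with the simulated effect
print(completion_effect(~dev & gap))   # cell without it
```

Each printed value is one treatment-vs.-control bar gap in a Fig. 2-style panel; the paper's version additionally adjusts for covariates and clusters standard errors.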