Synchronized eye movements predict test scores in online video education.

Jens Madsen, Sara U Júlio, Pawel J Gucik, Richard Steinberg, Lucas C Parra.

Abstract

Experienced teachers pay close attention to their students, adjusting their teaching when students seem lost. This dynamic interaction is missing in online education. We hypothesized that attentive students follow videos similarly with their eyes. Thus, attention to instructional videos could be assessed remotely by tracking eye movements. Here we show that intersubject correlation of eye movements during video presentation is substantially higher for attentive students and that synchronized eye movements are predictive of individual test scores on the material presented in the video. These findings replicate for videos in a variety of production styles, for incidental and intentional learning and for recall and comprehension questions alike. We reproduce the result using standard web cameras to capture eye movements in a classroom setting and with over 1,000 participants at home without the need to transmit user data. Our results suggest that online education could be made adaptive to a student's level of attention in real time.
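The measure described above, intersubject correlation (ISC) of eye movements, can be sketched as a leave-one-out correlation: each viewer's horizontal and vertical gaze traces are correlated with the mean traces of all other viewers, and the two correlations are averaged. This is a minimal illustrative sketch, not the authors' actual analysis pipeline; the data layout (`n_subjects × n_samples × 2`) and the `gaze_isc` function are hypothetical.

```python
import numpy as np

def gaze_isc(gaze, subject):
    """Intersubject correlation of one viewer's gaze with the group.

    gaze: array of shape (n_subjects, n_samples, 2) holding x/y gaze
          position over time for each viewer (hypothetical layout).
    Returns the Pearson correlation of the subject's x and y traces
    with the leave-one-out mean trace of the other viewers, averaged
    over the two dimensions.
    """
    # Mean gaze trace of all *other* subjects (leave-one-out group signal)
    others = np.delete(gaze, subject, axis=0).mean(axis=0)
    corrs = []
    for dim in range(2):  # correlate x and y traces separately
        a = gaze[subject, :, dim]
        b = others[:, dim]
        corrs.append(np.corrcoef(a, b)[0, 1])
    return float(np.mean(corrs))
```

On this reading, an attentive viewer who follows the video like the group yields an ISC near 1, while a viewer whose gaze is unrelated to the group yields an ISC near 0, which is the contrast the study uses to predict test scores.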

Keywords:  eye tracking; intersubject correlation; online education

Year:  2021        PMID: 33509928      PMCID: PMC7865179          DOI: 10.1073/pnas.2016980118

Source DB:  PubMed          Journal:  Proc Natl Acad Sci U S A        ISSN: 0027-8424            Impact factor:   11.205


References: 28 in total (first 10 listed)

1.  When static media promote active learning: annotated illustrations versus narrated animations in multimedia instruction.

Authors:  Richard E Mayer; Mary Hegarty; Sarah Mayer; Julie Campbell
Journal:  J Exp Psychol Appl       Date:  2005-12

2.  A hierarchy of temporal receptive windows in human cortex.

Authors:  Uri Hasson; Eunice Yang; Ignacio Vallines; David J Heeger; Nava Rubin
Journal:  J Neurosci       Date:  2008-03-05       Impact factor: 6.167

3.  Eye movements and attention in reading, scene perception, and visual search.

Authors:  Keith Rayner
Journal:  Q J Exp Psychol (Hove)       Date:  2009-05-14       Impact factor: 2.143

4.  Temporal eye movement strategies during naturalistic viewing.

Authors:  Helena X Wang; Jeremy Freeman; Elisha P Merriam; Uri Hasson; David J Heeger
Journal:  J Vis       Date:  2012-01-19       Impact factor: 2.240

5.  Brain-to-Brain Synchrony and Learning Outcomes Vary by Student-Teacher Dynamics: Evidence from a Real-world Classroom Electroencephalography Study.

Authors:  Dana Bevilacqua; Ido Davidesco; Lu Wan; Kim Chaloner; Jess Rowland; Mingzhou Ding; David Poeppel; Suzanne Dikker
Journal:  J Cogn Neurosci       Date:  2018-04-30       Impact factor: 3.225

6.  The role of visual attention in saccadic eye movements.

Authors:  J E Hoffman; B Subramaniam
Journal:  Percept Psychophys       Date:  1995-08

7.  A theory of reading: from eye fixations to comprehension.

Authors:  M A Just; P A Carpenter
Journal:  Psychol Rev       Date:  1980-07       Impact factor: 8.934

8.  Attentional synchrony and the influence of viewing task on gaze behavior in static and dynamic scenes.

Authors:  Tim J Smith; Parag K Mital
Journal:  J Vis       Date:  2013-07-17       Impact factor: 2.240

9.  EEG in the classroom: Synchronised neural recordings during video presentation.

Authors:  Andreas Trier Poulsen; Simon Kamronn; Jacek Dmochowski; Lucas C Parra; Lars Kai Hansen
Journal:  Sci Rep       Date:  2017-03-07       Impact factor: 4.379

10.  Audience preferences are predicted by temporal reliability of neural processing.

Authors:  Jacek P Dmochowski; Matthew A Bezdek; Brian P Abelson; John S Johnson; Eric H Schumacher; Lucas C Parra
Journal:  Nat Commun       Date:  2014-07-29       Impact factor: 14.919

Cited by: 1 in total

1.  Glimpse: A Gaze-Based Measure of Temporal Salience.

Authors:  V Javier Traver; Judith Zorío; Luis A Leiva
Journal:  Sensors (Basel)       Date:  2021-04-29       Impact factor: 3.576

