
Classifying mental states from eye movements during scene viewing.

Omid Kardan, Marc G Berman, Grigori Yourganov, Joseph Schmidt, John M Henderson.

Abstract

How eye movements reflect underlying cognitive processes during scene viewing has been a topic of considerable theoretical interest. In this study, we used eye-movement features and their distributions over time to successfully classify mental states as indexed by the behavioral task performed by participants. We recorded eye movements from 72 participants performing 3 scene-viewing tasks: visual search, scene memorization, and aesthetic preference. To classify these tasks, we used statistical features (mean, standard deviation, and skewness) of fixation durations and saccade amplitudes, as well as the total number of fixations. The same set of visual stimuli was used in all tasks to exclude the possibility that different salient scene features influenced eye movements across tasks. All of the tested classification algorithms were successful in predicting the task within a single participant. The linear discriminant algorithm was also successful in predicting the task for each participant when the training data came from other participants, suggesting some generalizability across participants. The number of fixations contributed most to task classification; however, the remaining features and, in particular, their covariance provided important task-specific information. These results provide evidence on how participants perform different visual tasks. In the visual search task, for example, participants exhibited more variance and skewness in fixation durations and saccade amplitudes, but also showed heightened correlation between fixation durations and the variance in fixation durations. In summary, these results point to the possibility that eye-movement features and their distributional properties can be used to classify mental states both within and across individuals. (c) 2015 APA, all rights reserved.
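The feature set the abstract describes (mean, standard deviation, and skewness of fixation durations and saccade amplitudes, plus the total number of fixations) can be sketched as a small feature-extraction step. The sketch below is illustrative only: the function name `scan_features` and the input format (one list of fixation durations and one of saccade amplitudes per trial) are assumptions, not the authors' code, and a classifier such as linear discriminant analysis would then be trained on these per-trial feature vectors.

```python
import statistics

def scan_features(fixation_durations, saccade_amplitudes):
    """Compute the seven features named in the abstract for one trial.

    Hypothetical helper: inputs are per-trial lists of fixation
    durations (e.g. ms) and saccade amplitudes (e.g. degrees).
    """
    def skewness(xs):
        # Population skewness: mean of cubed z-scores.
        m = statistics.mean(xs)
        s = statistics.pstdev(xs)
        if s == 0:
            return 0.0
        return sum(((x - m) / s) ** 3 for x in xs) / len(xs)

    feats = {}
    for name, xs in (("fix_dur", fixation_durations),
                     ("sacc_amp", saccade_amplitudes)):
        feats[f"{name}_mean"] = statistics.mean(xs)
        feats[f"{name}_sd"] = statistics.pstdev(xs)
        feats[f"{name}_skew"] = skewness(xs)
    # Total number of fixations, the strongest single feature per the abstract.
    feats["n_fixations"] = len(fixation_durations)
    return feats
```

A trial with a few long fixations among many short ones would show positive `fix_dur_skew`, the kind of distributional property the study found informative beyond the means alone.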


Year:  2015        PMID: 26348069     DOI: 10.1037/a0039673

Source DB:  PubMed          Journal:  J Exp Psychol Hum Percept Perform        ISSN: 0096-1523            Impact factor:   3.332


Related articles: 15 in total

1.  Decoding working memory content from attentional biases.

Authors:  Emma Wu Dowd; John M Pearson; Tobias Egner
Journal:  Psychon Bull Rev       Date:  2017-08

2.  A Generative Model of Cognitive State from Task and Eye Movements.

Authors:  W Joseph MacInnes; Amelia R Hunt; Alasdair D F Clarke; Michael D Dodd
Journal:  Cognit Comput       Date:  2018-05-09       Impact factor: 5.418

3.  The spatial distribution of attention predicts familiarity strength during encoding and retrieval.

Authors:  Michelle M Ramey; John M Henderson; Andrew P Yonelinas
Journal:  J Exp Psychol Gen       Date:  2020-04-06

4.  Modeling Eye Movements During Decision Making: A Review.

Authors:  Michel Wedel; Rik Pieters; Ralf van der Lans
Journal:  Psychometrika       Date:  2022-07-19       Impact factor: 2.290

5.  Predicting eye-movement characteristics across multiple tasks from working memory and executive control.

Authors:  Steven G Luke; Emily S Darowski; Shawn D Gale
Journal:  Mem Cognit       Date:  2018-07

6.  Scanpath modeling and classification with hidden Markov models.

Authors:  Antoine Coutrot; Janet H Hsiao; Antoni B Chan
Journal:  Behav Res Methods       Date:  2018-02

7.  Why do we retrace our visual steps? Semantic and episodic memory in gaze reinstatement.

Authors:  Michelle M Ramey; Andrew P Yonelinas; John M Henderson
Journal:  Learn Mem       Date:  2020-06-15       Impact factor: 2.460

8.  [Review] Eye Movements in Medical Image Perception: A Selective Review of Past, Present and Future.

Authors:  Chia-Chien Wu; Jeremy M Wolfe
Journal:  Vision (Basel)       Date:  2019-06-20

9.  Cultural and Developmental Influences on Overt Visual Attention to Videos.

Authors:  Omid Kardan; Laura Shneidman; Sheila Krogh-Jespersen; Suzanne Gaskins; Marc G Berman; Amanda Woodward
Journal:  Sci Rep       Date:  2017-09-12       Impact factor: 4.379

10.  Semantic content outweighs low-level saliency in determining children's and adults' fixation of movies.

Authors:  Andrew T Rider; Antoine Coutrot; Elizabeth Pellicano; Steven C Dakin; Isabelle Mareschal
Journal:  J Exp Child Psychol       Date:  2017-09-30
