
What can saliency models predict about eye movements? Spatial and sequential aspects of fixations during encoding and recognition.

Tom Foulsham, Geoffrey Underwood.

Abstract

Saliency map models account for a small but significant amount of the variance in where people fixate, but evaluating these models with natural stimuli has led to mixed results. In the present study, the eye movements of participants were recorded while they viewed color photographs of natural scenes in preparation for a memory test (encoding) and when recognizing them later. These eye movements were then compared to the predictions of a well-defined saliency map model (L. Itti & C. Koch, 2000), in terms of both individual fixation locations and fixation sequences (scanpaths). The saliency model is a significantly better predictor of fixation location than random models that take into account the bias toward central fixations, and this is the case at both encoding and recognition. However, the similarity between scanpaths made at multiple viewings of the same stimulus suggests that repetitive scanpaths also contribute to where people look. Top-down recapitulation of scanpaths is a key prediction of scanpath theory (D. Noton & L. Stark, 1971), but it might also be explained by bottom-up guidance. The present data suggest that saliency cannot account for scanpaths and that incorporating these sequences could improve model predictions.
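The scanpath comparison described in the abstract is commonly quantified with a string-edit (Levenshtein) distance: each fixation is coded as a letter for the scene region it lands in, and two viewings of the same stimulus are compared as strings. The sketch below illustrates that general technique; the region coding and example scanpaths are hypothetical and not taken from the paper, which should be consulted for the exact metric used.

```python
# Illustrative sketch: scanpath similarity via normalized Levenshtein
# edit distance. Region labels and example scanpaths are hypothetical.

def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance between two region-label strings."""
    m, n = len(a), len(b)
    prev = list(range(n + 1))
    for i in range(1, m + 1):
        curr = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            curr[j] = min(prev[j] + 1,         # deletion
                          curr[j - 1] + 1,     # insertion
                          prev[j - 1] + cost)  # substitution
        prev = curr
    return prev[n]

def scanpath_similarity(a: str, b: str) -> float:
    """1.0 = identical fixation sequences, 0.0 = maximally different."""
    longest = max(len(a), len(b))
    return 1.0 - edit_distance(a, b) / longest if longest else 1.0

# Each letter encodes the scene region containing one fixation.
encoding_scan = "ABCAD"      # scanpath at encoding (hypothetical)
recognition_scan = "ABCD"    # scanpath at later recognition (hypothetical)
print(scanpath_similarity(encoding_scan, recognition_scan))  # → 0.8
```

A high similarity between encoding and recognition scanpaths for the same image, relative to scanpaths across different images, is the kind of evidence the abstract cites for repetitive scanpaths.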


Year:  2008        PMID: 18318632     DOI: 10.1167/8.2.6

Source DB:  PubMed          Journal:  J Vis        ISSN: 1534-7362            Impact factor:   2.240


Related articles: 60 in total

1.  Parietal cortex integrates contextual and saliency signals during the encoding of natural scenes in working memory.

Authors:  Valerio Santangelo; Simona Arianna Di Francesco; Serena Mastroberardino; Emiliano Macaluso
Journal:  Hum Brain Mapp       Date:  2015-09-03       Impact factor: 5.038

2.  When do I quit? The search termination problem in visual search.

Authors:  Jeremy M Wolfe
Journal:  Nebr Symp Motiv       Date:  2012

3.  [Review] Guidance of visual search by memory and knowledge.

Authors:  Andrew Hollingworth
Journal:  Nebr Symp Motiv       Date:  2012

4.  Influence of scene structure and content on visual search strategies.

Authors:  Tatiana A Amor; Mirko Luković; Hans J Herrmann; José S Andrade
Journal:  J R Soc Interface       Date:  2017-07       Impact factor: 4.118

5.  What do saliency models predict?

Authors:  Kathryn Koehler; Fei Guo; Sheng Zhang; Miguel P Eckstein
Journal:  J Vis       Date:  2014-03-11       Impact factor: 2.240

6.  Gaze behaviour during space perception and spatial decision making.

Authors:  Jan M Wiener; Christoph Hölscher; Simon Büchner; Lars Konieczny
Journal:  Psychol Res       Date:  2011-12-03

7.  Modeling Search for People in 900 Scenes: A combined source model of eye guidance.

Authors:  Krista A Ehinger; Barbara Hidalgo-Sotelo; Antonio Torralba; Aude Oliva
Journal:  Vis cogn       Date:  2009-08-01

8.  The role of peripheral vision in saccade planning: learning from people with tunnel vision.

Authors:  Gang Luo; Fernando Vargas-Martin; Eli Peli
Journal:  J Vis       Date:  2008-12-22       Impact factor: 2.240

9.  Everyone knows what is interesting: salient locations which should be fixated.

Authors:  Christopher Michael Masciocchi; Stefan Mihalas; Derrick Parkhurst; Ernst Niebur
Journal:  J Vis       Date:  2009-10-27       Impact factor: 2.240

10.  Individual differences in scanpaths correspond with serotonin transporter genotype and behavioral phenotype in rhesus monkeys (Macaca mulatta).

Authors:  Robert R Gibboni; Prisca E Zimmerman; Katalin M Gothard
Journal:  Front Behav Neurosci       Date:  2009-11-16       Impact factor: 3.558


Beijing Coyote Bioscience Co., Ltd. © 2022-2023.