
Objects predict fixations better than early saliency.

Wolfgang Einhäuser, Merrielle Spain, Pietro Perona.

Abstract

Humans move their eyes while looking at scenes and pictures. Eye movements correlate with shifts in attention and are thought to be a consequence of optimal resource allocation for high-level tasks such as visual recognition. Models of attention, such as "saliency maps," are often built on the assumption that "early" features (color, contrast, orientation, motion, and so forth) drive attention directly. We explore an alternative hypothesis: Observers attend to "interesting" objects. To test this hypothesis, we measure the eye position of human observers while they inspect photographs of common natural scenes. Our observers perform different tasks: artistic evaluation, analysis of content, and search. Immediately after each presentation, our observers are asked to name objects they saw. Weighted with recall frequency, these objects predict fixations in individual images better than early saliency, irrespective of task. Also, saliency combined with object positions predicts which objects are frequently named. This suggests that early saliency has only an indirect effect on attention, acting through recognized objects. Consequently, rather than treating attention as a mere preprocessing step for object recognition, models of both need to be integrated.


Year:  2008        PMID: 19146319     DOI: 10.1167/8.14.18

Source DB:  PubMed          Journal:  J Vis        ISSN: 1534-7362            Impact factor:   2.240


Citing articles: 77 in total

1.  The attraction of visual attention to texts in real-world scenes.

Authors:  Hsueh-Cheng Wang; Marc Pomplun
Journal:  J Vis       Date:  2012-06-19       Impact factor: 2.240

2.  Contingent capture in cueing: the role of color search templates and cue-target color relations.

Authors:  Ulrich Ansorge; Stefanie I Becker
Journal:  Psychol Res       Date:  2013-06-27

3.  Measuring and Predicting Object Importance.

Authors:  Merrielle Spain; Pietro Perona
Journal:  Int J Comput Vis       Date:  2010-08-27       Impact factor: 7.410

4. [Review] Eye movements: the past 25 years.

Authors:  Eileen Kowler
Journal:  Vision Res       Date:  2011-01-13       Impact factor: 1.886

5.  When do I quit? The search termination problem in visual search.

Authors:  Jeremy M Wolfe
Journal:  Nebr Symp Motiv       Date:  2012

6. [Review] Guidance of visual search by memory and knowledge.

Authors:  Andrew Hollingworth
Journal:  Nebr Symp Motiv       Date:  2012

7.  Task-Irrelevant Visual Forms Facilitate Covert and Overt Spatial Selection.

Authors:  Amarender R Bogadhi; Antimo Buonocore; Ziad M Hafed
Journal:  J Neurosci       Date:  2020-10-30       Impact factor: 6.167

8.  Curveball: A tool for rapid measurement of contrast sensitivity based on smooth eye movements.

Authors:  Scott W J Mooney; N Jeremy Hill; Melis S Tuzun; Nazia M Alam; Jason B Carmel; Glen T Prusky
Journal:  J Vis       Date:  2018-11-01       Impact factor: 2.240

9.  Eye movement prediction and variability on natural video data sets.

Authors:  Michael Dorr; Eleonora Vig; Erhardt Barth
Journal:  Vis Cogn       Date:  2012-03-26

10.  Influence of low-level stimulus features, task dependent factors, and spatial biases on overt visual attention.

Authors:  Sepp Kollmorgen; Nora Nortmann; Sylvia Schröder; Peter König
Journal:  PLoS Comput Biol       Date:  2010-05-20       Impact factor: 4.475

