
Where to look next? Eye movements reduce local uncertainty.

Laura Walker Renninger, Preeti Verghese, James Coughlan.

Abstract

How do we decide where to look next? During natural, active vision, we move our eyes to gather task-relevant information from the visual scene. Information theory provides an elegant framework for investigating how visual stimulus information combines with prior knowledge and task goals to plan an eye movement. We measured eye movements as observers performed a shape-learning and -matching task, for which the task-relevant information was tightly controlled. Using computational models, we probe the underlying strategies used by observers when planning their next eye movement. One strategy is to move the eyes to locations that maximize the total information gained about the shape, which is equivalent to reducing global uncertainty. Observers' behavior may appear highly similar to this strategy, but a rigorous analysis of sequential fixation placement reveals that observers may instead be using a local rule: fixate only the most informative locations, that is, reduce local uncertainty.
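The two strategies contrasted in the abstract can be illustrated with a minimal sketch. This is not the authors' model; it assumes, purely for illustration, that the observer's belief about a shape boundary is a vector of Bernoulli probabilities (one per candidate boundary location), and that a fixation reveals a neighborhood of a chosen radius. The global strategy picks the fixation whose neighborhood contains the most total entropy; the local strategy simply fixates the single most uncertain location.

```python
import numpy as np

def entropy(p):
    # Bernoulli entropy in bits; clip avoids log(0).
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

def next_fixation_global(belief, radius=1):
    """Global rule: maximize total entropy (information to gain)
    within the neighborhood revealed by the fixation."""
    H = entropy(belief)
    n = len(H)
    gains = [H[max(0, i - radius): i + radius + 1].sum() for i in range(n)]
    return int(np.argmax(gains))

def next_fixation_local(belief):
    """Local rule: fixate the single most uncertain location."""
    return int(np.argmax(entropy(belief)))

# Example belief over five boundary locations: 0.5 = maximally uncertain.
belief = np.array([0.5, 0.9, 0.5, 0.5, 0.99])
print(next_fixation_local(belief))            # most uncertain single point
print(next_fixation_global(belief, radius=1)) # richest neighborhood
```

The two rules can disagree, which is what makes the sequential-fixation analysis in the paper diagnostic: in the example above, the local rule picks the first maximally uncertain point, while the global rule prefers a fixation whose neighborhood covers several uncertain points at once.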


Year:  2007        PMID: 17461684     DOI: 10.1167/7.3.6

Source DB:  PubMed          Journal:  J Vis        ISSN: 1534-7362            Impact factor:   2.240


Related articles:  51 in total

1.  Dynamic integration of information about salience and value for saccadic eye movements.

Authors:  Alexander C Schütz; Julia Trommershäuser; Karl R Gegenfurtner
Journal:  Proc Natl Acad Sci U S A       Date:  2012-04-23       Impact factor: 11.205

2.  Choice of saccade endpoint under risk.

Authors:  John F Ackermann; Michael S Landy
Journal:  J Vis       Date:  2013-09-10       Impact factor: 2.240

3.  Image statistics of the environment surrounding freely behaving hoverflies.

Authors:  Olga Dyakova; Martin M Müller; Martin Egelhaaf; Karin Nordström
Journal:  J Comp Physiol A Neuroethol Sens Neural Behav Physiol       Date:  2019-04-01       Impact factor: 1.836

4.  Action-effect associations revealed by eye movements.

Authors:  Arvid Herwig; Gernot Horstmann
Journal:  Psychon Bull Rev       Date:  2011-06

Review 5.  Eye movements: the past 25 years.

Authors:  Eileen Kowler
Journal:  Vision Res       Date:  2011-01-13       Impact factor: 1.886

6.  Modeling peripheral visual acuity enables discovery of gaze strategies at multiple time scales during natural scene search.

Authors:  Pavan Ramkumar; Hugo Fernandes; Konrad Kording; Mark Segraves
Journal:  J Vis       Date:  2015-03-26       Impact factor: 2.240

7.  Modeling Search for People in 900 Scenes: A combined source model of eye guidance.

Authors:  Krista A Ehinger; Barbara Hidalgo-Sotelo; Antonio Torralba; Aude Oliva
Journal:  Vis cogn       Date:  2009-08-01

Review 8.  How do radiologists use the human search engine?

Authors:  Jeremy M Wolfe; Karla K Evans; Trafton Drew; Avigael Aizenman; Emilie Josephs
Journal:  Radiat Prot Dosimetry       Date:  2015-12-08       Impact factor: 0.972

9.  Evolution and optimality of similar neural mechanisms for perception and action during search.

Authors:  Sheng Zhang; Miguel P Eckstein
Journal:  PLoS Comput Biol       Date:  2010-09-09       Impact factor: 4.475

10.  Gambling in the visual periphery: a conjoint-measurement analysis of human ability to judge visual uncertainty.

Authors:  Hang Zhang; Camille Morvan; Laurence T Maloney
Journal:  PLoS Comput Biol       Date:  2010-12-02       Impact factor: 4.475

