
Learning rational temporal eye movement strategies.

David Hoppe, Constantin A Rothkopf

Abstract

During active behavior humans redirect their gaze several times every second within the visual environment. Where we look within static images is highly efficient, as quantified by computational models of human gaze shifts in visual search and face recognition tasks. However, when we shift gaze is mostly unknown despite its fundamental importance for survival in a dynamic world. It has been suggested that during naturalistic visuomotor behavior gaze deployment is coordinated with task-relevant events, often predictive of future events, and studies in sportsmen suggest that timing of eye movements is learned. Here we establish that humans efficiently learn to adjust the timing of eye movements in response to environmental regularities when monitoring locations in the visual scene to detect probabilistically occurring events. To detect the events humans adopt strategies that can be understood through a computational model that includes perceptual and acting uncertainties, a minimal processing time, and, crucially, the intrinsic costs of gaze behavior. Thus, subjects traded off event detection rate with behavioral costs of carrying out eye movements. Remarkably, based on this rational bounded actor model the time course of learning the gaze strategies is fully explained by an optimal Bayesian learner with humans' characteristic uncertainty in time estimation, the well-known scalar law of biological timing. Taken together, these findings establish that the human visual system is highly efficient in learning temporal regularities in the environment and that it can use these regularities to control the timing of eye movements to detect behaviorally relevant events.
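The abstract's learning result — an optimal Bayesian learner whose timing noise follows the scalar law (observation noise proportional to the timed interval) — can be illustrated with a minimal sketch. The Weber fraction, true event latency, hypothesis grid, and trial count below are illustrative assumptions for exposition, not parameters from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

WEBER = 0.15   # assumed Weber fraction (scalar timing: noise SD grows with interval)
T_TRUE = 1.2   # assumed true event latency in seconds (illustrative)

# Discretized hypothesis space over possible event latencies.
candidates = np.linspace(0.2, 3.0, 281)
log_post = np.zeros_like(candidates)   # flat prior in log space

for trial in range(50):
    # Noisy timed observation: scalar law => SD proportional to the true interval.
    obs = rng.normal(T_TRUE, WEBER * T_TRUE)
    # Gaussian log-likelihood of obs under each candidate latency, scalar noise.
    sd = WEBER * candidates
    log_post += -0.5 * ((obs - candidates) / sd) ** 2 - np.log(sd)
    log_post -= log_post.max()         # rescale for numerical stability

post = np.exp(log_post)
post /= post.sum()
estimate = candidates[np.argmax(post)]  # MAP estimate converges toward T_TRUE
```

Because the noise SD scales with the interval, longer candidate latencies are penalized both through the squared error and the `-log(sd)` normalization term, which is what gives scalar timing its characteristic asymmetry in the posterior.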

Keywords:  computational modeling; decision making; eye movements; learning; visual attention

Year:  2016        PMID: 27382164      PMCID: PMC4961172          DOI: 10.1073/pnas.1601305113

Source DB:  PubMed          Journal:  Proc Natl Acad Sci U S A        ISSN: 0027-8424            Impact factor:   11.205


References:  41 in total

1.  Dynamic integration of information about salience and value for saccadic eye movements.

Authors:  Alexander C Schütz; Julia Trommershäuser; Karl R Gegenfurtner
Journal:  Proc Natl Acad Sci U S A       Date:  2012-04-23       Impact factor: 11.205

2.  Task and context determine where you look.

Authors:  Constantin A Rothkopf; Dana H Ballard; Mary M Hayhoe
Journal:  J Vis       Date:  2007-12-19       Impact factor: 2.240

3.  What determines saccade timing in sequences of coordinated eye and hand movements?

Authors:  Roger W Remington; Shu-Chieh Wu; Harold Pashler
Journal:  Psychon Bull Rev       Date:  2011-06

4.  Optimal reward harvesting in complex perceptual environments.

Authors:  Vidhya Navalpakkam; Christof Koch; Antonio Rangel; Pietro Perona
Journal:  Proc Natl Acad Sci U S A       Date:  2010-03-01       Impact factor: 11.205

5.  Saccades to future ball location reveal memory-based prediction in a virtual-reality interception task.

Authors:  Gabriel Diaz; Joseph Cooper; Constantin Rothkopf; Mary Hayhoe
Journal:  J Vis       Date:  2013-01-16       Impact factor: 2.240

6.  Looking just below the eyes is optimal across face recognition tasks.

Authors:  Matthew F Peterson; Miguel P Eckstein
Journal:  Proc Natl Acad Sci U S A       Date:  2012-11-12       Impact factor: 11.205

7.  Active sensing in the categorization of visual patterns.

Authors:  Scott Cheng-Hsin Yang; Máté Lengyel; Daniel M Wolpert
Journal:  Elife       Date:  2016-02-10       Impact factor: 8.140

8.  Learning where to look for a hidden target.

Authors:  Leanne Chukoskie; Joseph Snider; Michael C Mozer; Richard J Krauzlis; Terrence J Sejnowski
Journal:  Proc Natl Acad Sci U S A       Date:  2013-06-10       Impact factor: 11.205

9.  Human visual search does not maximize the post-saccadic probability of identifying targets.

Authors:  Camille Morvan; Laurence T Maloney
Journal:  PLoS Comput Biol       Date:  2012-02-02       Impact factor: 4.475

10.  Perceptual decision making in less than 30 milliseconds.

Authors:  Terrence R Stanford; Swetha Shankar; Dino P Massoglia; M Gabriela Costello; Emilio Salinas
Journal:  Nat Neurosci       Date:  2010-01-24       Impact factor: 24.884

Cited by:  12 in total

1. [Review]  Control of gaze in natural environments: effects of rewards and costs, uncertainty and memory in target selection.

Authors:  Mary M Hayhoe; Jonathan Samir Matthis
Journal:  Interface Focus       Date:  2018-06-15       Impact factor: 3.906

2.  Recentering bias for temporal saccades only: Evidence from binocular recordings of eye movements.

Authors:  Jérôme Tagu; Karine Doré-Mazars; Judith Vergne; Christelle Lemoine-Lardennois; Dorine Vergilino-Perez
Journal:  J Vis       Date:  2018-01-01       Impact factor: 2.240

3.  Humans quickly learn to blink strategically in response to environmental task demands.

Authors:  David Hoppe; Stefan Helfmann; Constantin A Rothkopf
Journal:  Proc Natl Acad Sci U S A       Date:  2018-02-14       Impact factor: 11.205

4.  Optimal policy for attention-modulated decisions explains human fixation behavior.

Authors:  Anthony I Jang; Ravi Sharma; Jan Drugowitsch
Journal:  Elife       Date:  2021-03-26       Impact factor: 8.140

5.  Gaze Behavior in a Natural Environment with a Task-Relevant Distractor: How the Presence of a Goalkeeper Distracts the Penalty Taker.

Authors:  Johannes Kurz; Mathias Hegele; Jörn Munzert
Journal:  Front Psychol       Date:  2018-01-26

6.  Intuitive physical reasoning about objects' masses transfers to a visuomotor decision task consistent with Newtonian physics.

Authors:  Nils Neupärtl; Fabian Tatai; Constantin A Rothkopf
Journal:  PLoS Comput Biol       Date:  2020-10-19       Impact factor: 4.475

7.  Children flexibly seek visual information to support signed and spoken language comprehension.

Authors:  Kyle MacDonald; Virginia A Marchman; Anne Fernald; Michael C Frank
Journal:  J Exp Psychol Gen       Date:  2019-11-21

8.  Davida Teller Award Lecture 2017: What can be learned from natural behavior?

Authors:  Mary M Hayhoe
Journal:  J Vis       Date:  2018-04-01       Impact factor: 2.240

9.  Multi-step planning of eye movements in visual search.

Authors:  David Hoppe; Constantin A Rothkopf
Journal:  Sci Rep       Date:  2019-01-15       Impact factor: 4.379

10.  Multiple processes independently predict motor learning.

Authors:  Christopher M Perry; Tarkeshwar Singh; Kayla G Springer; Adam T Harrison; Alexander C McLain; Troy M Herter
Journal:  J Neuroeng Rehabil       Date:  2020-11-17       Impact factor: 4.262
