
Fixation-dependent memory for natural scenes: an experimental test of scanpath theory.

Tom Foulsham, Alan Kingstone.

Abstract

Many modern theories propose that perceptual information is represented by the sensorimotor activity elicited by the original stimulus. Scanpath theory (Noton & Stark, 1971) predicts that reinstating a sequence of eye fixations will help an observer recognize a previously seen image. However, the only studies to investigate this are correlational ones based on calculating scanpath similarity. We therefore describe a series of 5 experiments that constrain the fixations during encoding or recognition of images in order to manipulate scanpath similarity. Participants encoded a set of images and later had to recognize those that they had seen. They spontaneously selected regions that they had fixated during encoding (Experiment 1), and this was a predictor of recognition accuracy. Yoking the parts of the image available at recognition to the encoded scanpath led to better memory performance than randomly selected image regions (Experiment 2), and this could not be explained by the spatial distribution of locations (Experiment 3). However, there was no recognition advantage for re-viewing one's own fixations versus someone else's (Experiment 4) or for retaining their serial order (Experiment 5). Therefore, although it is beneficial to look at encoded regions, there is no evidence that scanpaths are stored or that scanpath recapitulation is functional in scene memory. This paradigm provides a controlled way of studying the integration of scene content, spatial structure, and oculomotor signals, with consequences for the perception, representation, and retrieval of visual information. © 2013 APA, all rights reserved.


Year:  2012        PMID: 22506754     DOI: 10.1037/a0028227

Source DB:  PubMed          Journal:  J Exp Psychol Gen        ISSN: 0022-1015


Related articles (19 in total)

1.  Using space to represent categories: insights from gaze position.

Authors:  Corinna S Martarelli; Sandra Chiquet; Bruno Laeng; Fred W Mast
Journal:  Psychol Res       Date:  2016-06-15

2.  The spatial distribution of attention predicts familiarity strength during encoding and retrieval.

Authors:  Michelle M Ramey; John M Henderson; Andrew P Yonelinas
Journal:  J Exp Psychol Gen       Date:  2020-04-06

3.  Listen up, eye movements play a role in verbal memory retrieval.

Authors:  Agnes Scholz; Katja Mehlhorn; Josef F Krems
Journal:  Psychol Res       Date:  2014-12-20

4.  Schema-related eye movements support episodic simulation.

Authors:  Jordana S Wynn; Ruben D I Van Genugten; Signy Sheldon; Daniel L Schacter
Journal:  Conscious Cogn       Date:  2022-02-28

5.  Eye-movement replay supports episodic remembering.

Authors:  Roger Johansson; Marcus Nyström; Richard Dewhurst; Mikael Johansson
Journal:  Proc Biol Sci       Date:  2022-06-15       Impact factor: 5.530

6.  Temporal context guides visual exploration during scene recognition.

Authors:  James E Kragel; Joel L Voss
Journal:  J Exp Psychol Gen       Date:  2020-09-24

7.  Eye movements provide an index of veridical memory for temporal order.

Authors:  Thanujeni Pathman; Simona Ghetti
Journal:  PLoS One       Date:  2015-05-20       Impact factor: 3.240

8.  Drawing from memory: hand-eye coordination at multiple scales.

Authors:  Stephanie Huette; Christopher T Kello; Theo Rhodes; Michael J Spivey
Journal:  PLoS One       Date:  2013-03-15       Impact factor: 3.240

9.  Tracking down the path of memory: eye scanpaths facilitate retrieval of visuospatial information.

Authors:  Agata Bochynska; Bruno Laeng
Journal:  Cogn Process       Date:  2015-09

10.  Using Highlighting to Train Attentional Expertise.

Authors:  Brett Roads; Michael C Mozer; Thomas A Busey
Journal:  PLoS One       Date:  2016-01-08       Impact factor: 3.240

