
You think you know where you looked? You better look again.

Melissa L-H Võ, Avigael M Aizenman, Jeremy M Wolfe.

Abstract

People are surprisingly bad at knowing where they have looked in a scene. We tested participants' ability to recall their own eye movements in 2 experiments using natural or artificial scenes. In each experiment, participants performed a change-detection (Exp. 1) or search (Exp. 2) task. On 25% of trials, after 3 seconds of viewing the scene, participants were asked to indicate where they thought they had just fixated. They responded by making mouse clicks on 12 locations in the unchanged scene. After 135 trials, observers saw 10 new scenes and were asked to place 12 clicks where they thought someone else would have looked. Although observers located their own fixations more successfully than a random model, their performance was no better than when they were guessing someone else's fixations. Performance with artificial scenes was worse, though judging one's own fixations was slightly superior. Even after repeating the fixation-location task on 30 scenes immediately after scene viewing, performance was far from the prediction of an ideal observer. Memory for our own fixation locations appears to add next to nothing beyond what common sense tells us about the likely fixations of others. These results have important implications for socially important visual search tasks. For example, a radiologist might think he has looked at "everything" in an image, but eye-tracking data suggest that this is not so. Such shortcomings might be avoided by providing observers with better insight into where they have looked.


Year:  2016        PMID: 27668308      PMCID: PMC5079107          DOI: 10.1037/xhp0000264

Source DB:  PubMed          Journal:  J Exp Psychol Hum Percept Perform        ISSN: 0096-1523            Impact factor:   3.332


References (7 in total)

1.  Ocular proprioception and efference copy in registering visual direction.

Authors:  B Bridgeman; L Stark
Journal:  Vision Res       Date:  1991       Impact factor: 1.886

2.  The Psychophysics Toolbox.

Authors:  D H Brainard
Journal:  Spat Vis       Date:  1997

3.  The VideoToolbox software for visual psychophysics: transforming numbers into movies.

Authors:  D G Pelli
Journal:  Spat Vis       Date:  1997

4.  Scanners and drillers: characterizing expert visual search through volumetric images.

Authors:  Trafton Drew; Melissa Le-Hoa Vo; Alex Olwal; Francine Jacobson; Steven E Seltzer; Jeremy M Wolfe
Journal:  J Vis       Date:  2013-08-06       Impact factor: 2.240

5.  Visual scanning, pattern recognition and decision-making in pulmonary nodule detection.

Authors:  H L Kundel; C F Nodine; D Carmody
Journal:  Invest Radiol       Date:  1978 May-Jun       Impact factor: 6.016

6.  Subjective report of eye fixations during serial search.

Authors:  Sébastien Marti; Laurie Bayet; Stanislas Dehaene
Journal:  Conscious Cogn       Date:  2014-12-08

7.  Where have eye been? Observers can recognise their own fixations.

Authors:  Tom Foulsham; Alan Kingstone
Journal:  Perception       Date:  2013       Impact factor: 1.490

Cited by (21 in total)

1.  Comparing search patterns in digital breast tomosynthesis and full-field digital mammography: an eye tracking study.

Authors:  Avi Aizenman; Trafton Drew; Krista A Ehinger; Dianne Georgian-Smith; Jeremy M Wolfe
Journal:  J Med Imaging (Bellingham)       Date:  2017-10-27

2.  [Review] Normal blindness: when we Look But Fail To See.

Authors:  Jeremy M Wolfe; Anna Kosovicheva; Benjamin Wolfe
Journal:  Trends Cogn Sci       Date:  2022-07-21       Impact factor: 24.482

3.  This is a test: Oculomotor capture when the experiment keeps score.

Authors:  Brian A Anderson; Lana Mrkonja
Journal:  Atten Percept Psychophys       Date:  2022-08-02       Impact factor: 2.157

4.  What eye tracking can tell us about how radiologists use automated breast ultrasound.

Authors:  Jeremy M Wolfe; Wanyi Lyu; Jeffrey Dong; Chia-Chien Wu
Journal:  J Med Imaging (Bellingham)       Date:  2022-07-26

5.  Characteristics of expert search behavior in volumetric medical image interpretation.

Authors:  Lauren H Williams; Ann J Carrigan; Megan Mills; William F Auffermann; Anina N Rich; Trafton Drew
Journal:  J Med Imaging (Bellingham)       Date:  2021-07-14

6.  What Am I Looking at? Interpreting Dynamic and Static Gaze Displays.

Authors:  Margot van Wermeskerken; Damien Litchfield; Tamara van Gog
Journal:  Cogn Sci       Date:  2017-03-13

7.  What We Do and Do Not Know about Teaching Medical Image Interpretation.

Authors:  Ellen M Kok; Koos van Geel; Jeroen J G van Merriënboer; Simon G F Robben
Journal:  Front Psychol       Date:  2017-03-03

8.  Using aversive conditioning with near-real-time feedback to shape eye movements during naturalistic viewing.

Authors:  Brian A Anderson
Journal:  Behav Res Methods       Date:  2020-09-11

9.  Even if I showed you where you looked, remembering where you just looked is hard.

Authors:  Ellen M Kok; Avi M Aizenman; Melissa L-H Võ; Jeremy M Wolfe
Journal:  J Vis       Date:  2017-10-01       Impact factor: 2.240

10.  Simple eye-movement feedback during visual search is not helpful.

Authors:  Trafton Drew; Lauren H Williams
Journal:  Cogn Res Princ Implic       Date:  2017-11-22
