
Recognition of incidentally learned visual search arrays is supported by fixational eye movements.

Efsun Annac, Mathias Pointner, Patrick H Khader, Hermann J Müller, Xuelian Zang, Thomas Geyer.

Abstract

Repeated encounters with abstract target-distractor letter arrangements lead to improved visual search for such displays. This contextual-cueing effect is attributed to incidental learning of display configurations. Whether observers can consciously access the memory underlying the cueing effect remains controversial. The current study uses a novel recognition task and eye tracking to address this question. Experiment 1 investigated observers' ability to recognize, or "generate," the display quadrant of the target in a previously searched array when the target was replaced by a distractor element, as well as where observers' eye fixations would fall while they freely viewed the recognition display, examining the link between the fixation pattern and explicit recognition judgments. Experiment 2 tested whether eye fixations play a critical role in explicit retrieval from context memory. Experiment 3 asked whether fixations of the target region are critical for context-based facilitation of search reaction times to manifest. The results revealed longer fixational dwell times in the target quadrant for learned relative to foil displays. Further, explicit recognition was enhanced, and above chance level, when observers were made to fixate the target quadrant, compared with when they were prevented from doing so. However, the manifestation of contextual cueing of visual search did not itself require fixations of the target quadrant. Moreover, contextual cueing of search reaction times was significantly correlated with both fixational dwell times and observers' explicit generation performance. The results argue in favor of contextual cueing of visual search being the result of a single, explicit memory system, which may nevertheless be supported by separable automatic versus controlled retrieval processes.
Fixational eye movements, that is, the directed overt allocation of visual attention, provide an interface between these processes in contextual cueing. (PsycINFO Database Record (c) 2019 APA, all rights reserved).


Year:  2019        PMID: 30883169     DOI: 10.1037/xlm0000702

Source DB:  PubMed          Journal:  J Exp Psychol Learn Mem Cogn        ISSN: 0278-7393            Impact factor:   3.051


  6 in total

1.  Contextual cueing in co-active visual search: Joint action allows acquisition of task-irrelevant context.

Authors:  Xuelian Zang; Artyom Zinchenko; Jiao Wu; Xiuna Zhu; Fang Fang; Zhuanghua Shi
Journal:  Atten Percept Psychophys       Date:  2022-04-18       Impact factor: 2.199

2.  Crossmodal learning of target-context associations: When would tactile context predict visual search?

Authors:  Siyi Chen; Zhuanghua Shi; Xuelian Zang; Xiuna Zhu; Leonardo Assumpção; Hermann J Müller; Thomas Geyer
Journal:  Atten Percept Psychophys       Date:  2020-05       Impact factor: 2.199

3.  Influences of luminance contrast and ambient lighting on visual context learning and retrieval.

Authors:  Xuelian Zang; Lingyun Huang; Xiuna Zhu; Hermann J Müller; Zhuanghua Shi
Journal:  Atten Percept Psychophys       Date:  2020-11       Impact factor: 2.199

4.  Contextual Cueing Accelerated and Enhanced by Monetary Reward: Evidence From Event-Related Brain Potentials.

Authors:  Guang Zhao; Qian Zhuang; Jie Ma; Shen Tu; Shiyi Li
Journal:  Front Hum Neurosci       Date:  2021-04-15       Impact factor: 3.169

5.  Real-world object categories and scene contexts conjointly structure statistical learning for the guidance of visual search.

Authors:  Ariel M Kershner; Andrew Hollingworth
Journal:  Atten Percept Psychophys       Date:  2022-04-14       Impact factor: 2.157

6.  Task-based memory systems in contextual-cueing of visual search and explicit recognition.

Authors:  Thomas Geyer; Pardis Rostami; Lisa Sogerer; Bernhard Schlagbauer; Hermann J Müller
Journal:  Sci Rep       Date:  2020-10-05       Impact factor: 4.379

