
Mechanisms of contextual cueing: A tutorial review.

Caitlin A Sisk, Roger W Remington, Yuhong V Jiang

Abstract

Repeated contexts yield faster response times in visual search than novel contexts. This effect is known as contextual cueing. Despite extensive study over the past two decades, there remains a spirited debate over whether repeated displays expedite search before the target is found (early locus) or facilitate the response after the target is found (late locus). Here, we provide a tutorial review of contextual cueing, with a focus on assessing the locus of the effect. We evaluate the evidence from psychophysics, EEG, and eye tracking. Existing studies support an early locus of contextual cueing, consistent with attentional guidance accounts. Evidence for a late locus exists, though it is less conclusive. The existing literature also highlights a distinction between habit-guided attention learned through experience and changes in spatial priority driven by task goals and stimulus salience.

Keywords:  Attention; Interactions with memory; Visual search; Attention: selective

Year:  2019        PMID: 31410759     DOI: 10.3758/s13414-019-01832-2

Source DB:  PubMed          Journal:  Atten Percept Psychophys        ISSN: 1943-3921            Impact factor:   2.199


  10 in total

1.  Categorical cuing: Object categories structure the acquisition of statistical regularities to guide visual search.

Authors:  Brett Bahle; Ariel M Kershner; Andrew Hollingworth
Journal:  J Exp Psychol Gen       Date:  2021-04-08

2.  Guided Search 6.0: An updated model of visual search. (Review)

Authors:  Jeremy M Wolfe
Journal:  Psychon Bull Rev       Date:  2021-02-05

3.  Crossmodal learning of target-context associations: When would tactile context predict visual search?

Authors:  Siyi Chen; Zhuanghua Shi; Xuelian Zang; Xiuna Zhu; Leonardo Assumpção; Hermann J Müller; Thomas Geyer
Journal:  Atten Percept Psychophys       Date:  2020-05       Impact factor: 2.199

4.  Visual object recognition is facilitated by temporal community structure.

Authors:  Ehsan Kakaei; Stepan Aleshin; Jochen Braun
Journal:  Learn Mem       Date:  2021-04-15       Impact factor: 2.460

5.  Location probability learning in 3-dimensional virtual search environments.

Authors:  Caitlin A Sisk; Victoria Interrante; Yuhong V Jiang
Journal:  Cogn Res Princ Implic       Date:  2021-03-24

6.  Multisensory visuo-tactile context learning enhances the guidance of unisensory visual search.

Authors:  Siyi Chen; Zhuanghua Shi; Hermann J Müller; Thomas Geyer
Journal:  Sci Rep       Date:  2021-05-03       Impact factor: 4.379

7.  Real-world object categories and scene contexts conjointly structure statistical learning for the guidance of visual search.

Authors:  Ariel M Kershner; Andrew Hollingworth
Journal:  Atten Percept Psychophys       Date:  2022-04-14       Impact factor: 2.157

8.  Task-Irrelevant Context Learned Under Rapid Display Presentation: Selective Attention in Associative Blocking.

Authors:  Xuelian Zang; Leonardo Assumpção; Jiao Wu; Xiaowei Xie; Artyom Zinchenko
Journal:  Front Psychol       Date:  2021-05-21

9.  Stimulus-driven updating of long-term context memories in visual search.

Authors:  Markus Conci; Martina Zellin
Journal:  Psychol Res       Date:  2021-01-26

10.  Effects of repeated testing in a pen-and-paper test of selective attention (FAIR-2).

Authors:  Bianca Wühr; Peter Wühr
Journal:  Psychol Res       Date:  2021-02-11
