
Real-world visual search is dominated by top-down guidance.

Xin Chen, Gregory J. Zelinsky

Abstract

How do bottom-up and top-down guidance signals combine to guide search behavior? Observers searched for a target either with or without a preview (top-down manipulation) or a color singleton (bottom-up manipulation) among the display objects. With a preview, reaction times were faster and more initial eye movements were guided to the target; the singleton failed to attract initial saccades under these conditions. Only in the absence of a preview did subjects preferentially fixate the color singleton. We conclude that the search for realistic objects is guided primarily by top-down control. Implications for saliency map models of visual search are discussed.


Year:  2006        PMID: 17005231     DOI: 10.1016/j.visres.2006.08.008

Source DB:  PubMed          Journal:  Vision Res        ISSN: 0042-6989            Impact factor:   1.886


Related articles: 43 in total (10 shown)

1.  Search performance with discrete-cell stimulus arrays: filtered naturalistic images and probabilistic markers.

Authors:  Alan R Pinkus; Miriam J Poteet; Allan J Pantle
Journal:  Psychol Res       Date:  2012-04-03

2. [Review] A theory of eye movements during target acquisition.

Authors:  Gregory J Zelinsky
Journal:  Psychol Rev       Date:  2008-10       Impact factor: 8.934

3.  Modelling eye movements in a categorical search task.

Authors:  Gregory J Zelinsky; Hossein Adeli; Yifan Peng; Dimitris Samaras
Journal:  Philos Trans R Soc Lond B Biol Sci       Date:  2013-09-09       Impact factor: 6.237

4.  Cat and mouse search: the influence of scene and object analysis on eye movements when targets change locations during search.

Authors:  Anne P Hillstrom; Joice D Segabinazi; Hayward J Godwin; Simon P Liversedge; Valerie Benson
Journal:  Philos Trans R Soc Lond B Biol Sci       Date:  2017-01-02       Impact factor: 6.237

5.  Predictive activity in macaque frontal eye field neurons during natural scene searching.

Authors:  Adam N Phillips; Mark A Segraves
Journal:  J Neurophysiol       Date:  2009-12-16       Impact factor: 2.714

6.  Collaboration improves unspeeded search in the absence of precise target information.

Authors:  Alison Enright; Nathan Leggett; Jason S McCarley
Journal:  Atten Percept Psychophys       Date:  2020-10       Impact factor: 2.199

7.  A novel computational model to probe visual search deficits during motor performance.

Authors:  Tarkeshwar Singh; Julius Fridriksson; Christopher M Perry; Sarah C Tryon; Angela Ross; Stacy Fritz; Troy M Herter
Journal:  J Neurophysiol       Date:  2016-10-12       Impact factor: 2.714

8.  Modeling Search for People in 900 Scenes: A combined source model of eye guidance.

Authors:  Krista A Ehinger; Barbara Hidalgo-Sotelo; Antonio Torralba; Aude Oliva
Journal:  Vis cogn       Date:  2009-08-01

9. [Review] Using multidimensional scaling to quantify similarity in visual search and beyond.

Authors:  Michael C Hout; Hayward J Godwin; Gemma Fitzsimmons; Arryn Robbins; Tamaryn Menneer; Stephen D Goldinger
Journal:  Atten Percept Psychophys       Date:  2016-01       Impact factor: 2.199

10.  Influence of low-level stimulus features, task dependent factors, and spatial biases on overt visual attention.

Authors:  Sepp Kollmorgen; Nora Nortmann; Sylvia Schröder; Peter König
Journal:  PLoS Comput Biol       Date:  2010-05-20       Impact factor: 4.475


Beijing Coyote Bioscience Co., Ltd. © 2022-2023.