
Visual Search Asymmetry: Deep Nets and Humans Share Similar Inherent Biases.

Shashi Kant Gupta, Mengmi Zhang, Chia-Chien Wu, Jeremy M. Wolfe, Gabriel Kreiman.

Abstract

Visual search is a ubiquitous and often challenging daily task, exemplified by looking for the car keys at home or a friend in a crowd. An intriguing property of some classical search tasks is an asymmetry such that finding a target A among distractors B can be easier than finding B among A. To elucidate the mechanisms responsible for asymmetry in visual search, we propose a computational model that takes a target and a search image as inputs and produces a sequence of eye movements until the target is found. The model integrates eccentricity-dependent visual recognition with target-dependent top-down cues. We compared the model against human behavior in six paradigmatic search tasks that show asymmetry in humans. Without prior exposure to the stimuli or task-specific training, the model provides a plausible mechanism for search asymmetry. We hypothesized that the polarity of search asymmetry arises from experience with the natural environment. We tested this hypothesis by training the model on augmented versions of ImageNet where the biases of natural images were either removed or reversed. The polarity of search asymmetry disappeared or was altered depending on the training protocol. This study highlights how classical perceptual properties can emerge in neural network models, without the need for task-specific training, but rather as a consequence of the statistical properties of the developmental diet fed to the model. All source code and data are publicly available at https://github.com/kreimanlab/VisualSearchAsymmetry.
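The search loop the abstract describes — a priority map built from target-modulated features, degraded by eccentricity-dependent acuity, driving fixations until the target is found — can be sketched in miniature. This is an illustrative toy under assumed simplifications (scalar features per location, additive noise growing with eccentricity, winner-take-all fixation selection with inhibition of return), not the authors' released code; see the linked repository for the actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

def priority_map(image, target, fixation, acuity=0.2):
    """Top-down priority: similarity of each location's feature to the
    target feature, corrupted by noise that grows with eccentricity so
    peripheral locations are judged less reliably (assumed toy stand-in
    for the model's eccentricity-dependent recognition)."""
    h, w = image.shape
    ys, xs = np.mgrid[0:h, 0:w]
    ecc = np.hypot(ys - fixation[0], xs - fixation[1]) / max(h, w)
    similarity = -np.abs(image - target)
    return similarity + rng.normal(0.0, acuity, (h, w)) * ecc

def search(image, target, target_loc, max_fixations=50):
    """Winner-take-all fixation loop with inhibition of return; returns
    the scanpath up to the fixation that lands on the target."""
    fixation = (image.shape[0] // 2, image.shape[1] // 2)  # start centrally
    inhibited = np.zeros(image.shape, dtype=bool)
    scanpath = [fixation]
    for _ in range(max_fixations):
        if tuple(fixation) == tuple(target_loc):
            return scanpath
        inhibited[fixation] = True           # inhibition of return
        p = priority_map(image, target, fixation)
        p[inhibited] = -np.inf               # never revisit a location
        fixation = np.unravel_index(np.argmax(p), p.shape)
        scanpath.append(fixation)
    return scanpath

# Toy display: one 0.9-valued target among 0.1-valued distractors.
image = np.full((8, 8), 0.1)
image[5, 6] = 0.9
path = search(image, target=0.9, target_loc=(5, 6))
print(len(path))
```

Asymmetry would appear in a fuller version of this sketch if the feature representation (here a bare scalar) were learned from natural images, so that similarity is easier to judge in one target/distractor direction than the other.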


Year:  2021        PMID: 36062138      PMCID: PMC9436507     

Source DB:  PubMed          Journal:  Adv Neural Inf Process Syst        ISSN: 1049-5258


References: 40 in total

Review 1.  Mechanisms of visual attention in the human cortex.

Authors:  S Kastner; L G Ungerleider
Journal:  Annu Rev Neurosci       Date:  2000       Impact factor: 12.449

2.  Attention activates winner-take-all competition among visual filters.

Authors:  D K Lee; L Itti; C Koch; J Braun
Journal:  Nat Neurosci       Date:  1999-04       Impact factor: 24.884

3.  Responses of neurons in macaque area V4 during memory-guided visual search.

Authors:  L Chelazzi; E K Miller; J Duncan; R Desimone
Journal:  Cereb Cortex       Date:  2001-08       Impact factor: 5.357

4.  Search asymmetries? What search asymmetries?

Authors:  R Rosenholtz
Journal:  Percept Psychophys       Date:  2001-04

Review 5.  Attentional modulation of visual processing.

Authors:  John H Reynolds; Leonardo Chelazzi
Journal:  Annu Rev Neurosci       Date:  2004       Impact factor: 12.449

Review 6.  The neural basis of visual attention.

Authors:  James W Bisley
Journal:  J Physiol       Date:  2010-08-31       Impact factor: 5.182

Review 7.  A quantitative theory of immediate visual recognition.

Authors:  Thomas Serre; Gabriel Kreiman; Minjoon Kouh; Charles Cadieu; Ulf Knoblich; Tomaso Poggio
Journal:  Prog Brain Res       Date:  2007       Impact factor: 2.453

8.  A Source for Feature-Based Attention in the Prefrontal Cortex.

Authors:  Narcisse P Bichot; Matthew T Heard; Ellen M DeGennaro; Robert Desimone
Journal:  Neuron       Date:  2015-11-08       Impact factor: 17.173

9.  Finding any Waldo with zero-shot invariant and efficient visual search.

Authors:  Mengmi Zhang; Jiashi Feng; Keng Teck Ma; Joo Hwee Lim; Qi Zhao; Gabriel Kreiman
Journal:  Nat Commun       Date:  2018-09-13       Impact factor: 14.919

10.  Number detectors spontaneously emerge in a deep neural network designed for visual object recognition.

Authors:  Khaled Nasr; Pooja Viswanathan; Andreas Nieder
Journal:  Sci Adv       Date:  2019-05-08       Impact factor: 14.136

