
Preferential responses to occluded objects in the human visual cortex.

Jay Hegdé, Fang Fang, Scott O Murray, Daniel Kersten.

Abstract

How do we see an object when it is partially obstructed from view? The neural mechanisms of this intriguing process are unclear, in part because studies of visual object perception have heretofore largely used stimuli of individual objects, such as faces or common inanimate objects, each presented alone. But in natural images, visual objects are typically occluded by other objects. Computational studies indicate that the perception of an occluded object requires processes that are substantially different from those for an unoccluded object in plain view. We studied the neural substrates of the perception of occluded objects using functional magnetic resonance imaging (fMRI) of human subjects viewing stimuli that were designed to elicit or not elicit the percept of an occluded object but were physically very similar. We hypothesized that regions selective for occluded objects, if they exist, would be differentially active during the two conditions. We found two regions, one in the ventral object processing pathway and another in the dorsal object processing pathway, that were significantly responsive to occluded objects. More importantly, both regions were significantly more responsive to occluded objects than to unoccluded objects, and this enhanced response was not attributable to low-level differences in the stimuli, amodal completion per se, or the behavioral task. Our results identify regions in the visual cortex that are preferentially responsive to occluded objects relative to other stimuli tested and indicate that these regions are likely to play an important role in the perception of occluded objects.


Year: 2008        PMID: 18484855        DOI: 10.1167/8.4.16

Source DB: PubMed        Journal: J Vis        ISSN: 1534-7362        Impact factor: 2.240


Similar articles (14 in total)

1.  Female túngara frogs do not experience the continuity illusion.

Authors:  Alexander T Baugh; Michael J Ryan; Ximena E Bernal; A Stanley Rand; Mark A Bee
Journal:  Behav Neurosci       Date:  2015-12-21       Impact factor: 1.912

2.  A link between visual disambiguation and visual memory.

Authors:  Jay Hegdé; Daniel Kersten
Journal:  J Neurosci       Date:  2010-11-10       Impact factor: 6.167

3.  Amodal completion and relationalism.

Authors:  Bence Nanay
Journal:  Philos Stud       Date:  2022-04-28

4.  Object recognition in clutter: cortical responses depend on the type of learning.

Authors:  Jay Hegdé; Serena K Thompson; Mark Brady; Daniel Kersten
Journal:  Front Hum Neurosci       Date:  2012-06-19       Impact factor: 3.169

5.  A new taxonomy for perceptual filling-in. [Review]

Authors:  Rimona S Weil; Geraint Rees
Journal:  Brain Res Rev       Date:  2010-11-05

6.  Facial Expression Aftereffect Revealed by Adaption to Emotion-Invisible Dynamic Bubbled Faces.

Authors:  Chengwen Luo; Qingyun Wang; Philippe G Schyns; Frederick A A Kingdom; Hong Xu
Journal:  PLoS One       Date:  2015-12-30       Impact factor: 3.240

7.  Pre-Cueing Effects: Attention or Mental Imagery?

Authors:  Peter Fazekas; Bence Nanay
Journal:  Front Psychol       Date:  2017-03-06

8.  The representation of object distance: evidence from neuroimaging and neuropsychology.

Authors:  Marian E Berryhill; Ingrid R Olson
Journal:  Front Hum Neurosci       Date:  2009-11-11       Impact factor: 3.169

9.  The Time Course of Perceptual Closure of Incomplete Visual Objects: An Event-Related Potential Study.

Authors:  Chenyang Liu; Sha Sha; Xiujun Zhang; Zhiming Bian; Lin Lu; Bin Hao; Lina Li; Hongge Luo; Xiaotian Wang; Changming Wang; Chao Chen
Journal:  Comput Intell Neurosci       Date:  2020-10-06

10.  A critical role of holistic processing in face gender perception.

Authors:  Takemasa Yokoyama; Yasuki Noguchi; Ryosuke Tachibana; Shigeru Mukaida; Shinichi Kita
Journal:  Front Hum Neurosci       Date:  2014-06-26       Impact factor: 3.169

