
Encoding of three-dimensional structure-from-motion by primate area MT neurons.

D C Bradley, G C Chang, R A Andersen.

Abstract

We see the world as three-dimensional, but because the retinal image is flat, we must derive the third dimension, depth, from two-dimensional cues. Image movement provides one of the most potent cues for depth. For example, the shadow of a contorted wire appears flat when the wire is stationary, but rotating the wire causes motion in the shadow, which suddenly appears three-dimensional. The neural mechanism of this effect, known as 'structure-from-motion', has not been discovered. Here we study cortical area MT, a primate region that is involved in visual motion perception. Two rhesus monkeys were trained to fixate their gaze while viewing two-dimensional projections of transparent, revolving cylinders. These stimuli appear to be three-dimensional, but the surface order perceived (front as opposed to back) tends to reverse spontaneously. These reversals occur because the stimulus does not specify which surface is in front or at the back. Monkeys reported which surface order they perceived after viewing the stimulus. In many of the neurons tested, there was a reproducible change in activity that coincided with reversals of the perceived surface order, even though the stimulus remained identical. This suggests that area MT has a basic role in structure-from-motion perception.


Year:  1998        PMID: 9565031     DOI: 10.1038/33688

Source DB:  PubMed          Journal:  Nature        ISSN: 0028-0836            Impact factor:   49.962


Citing articles: 69 in total

1.  Perceptually bistable three-dimensional figures evoke high choice probabilities in cortical area MT.

Authors:  J V Dodd; K Krug; B G Cumming; A J Parker
Journal:  J Neurosci       Date:  2001-07-01       Impact factor: 6.167

2.  Contribution of middle temporal area to coarse depth discrimination: comparison of neuronal and psychophysical sensitivity.

Authors:  Takanori Uka; Gregory C DeAngelis
Journal:  J Neurosci       Date:  2003-04-15       Impact factor: 6.167

3.  Early behavior of optokinetic responses elicited by transparent motion stimuli during depth-based attention.

Authors:  Masaki Maruyama; Tetsuo Kobayashi; Takusige Katsura; Shinya Kuriki
Journal:  Exp Brain Res       Date:  2003-06-13       Impact factor: 1.972

4.  Neuronal activity and its links with the perception of multi-stable figures.

Authors:  Andrew J Parker; Kristine Krug; Bruce G Cumming
Journal:  Philos Trans R Soc Lond B Biol Sci       Date:  2002-08-29       Impact factor: 6.237

Review 5.  A common neuronal code for perceptual processes in visual cortex? Comparing choice and attentional correlates in V5/MT.

Authors:  Kristine Krug
Journal:  Philos Trans R Soc Lond B Biol Sci       Date:  2004-06-29       Impact factor: 6.237

Review 6.  Early computational processing in binocular vision and depth perception.

Authors:  Jenny Read
Journal:  Prog Biophys Mol Biol       Date:  2005-01       Impact factor: 3.667

7.  Representation of 3-D surface orientation by velocity and disparity gradient cues in area MT.

Authors:  Takahisa M Sanada; Jerry D Nguyenkim; Gregory C DeAngelis
Journal:  J Neurophysiol       Date:  2012-01-04       Impact factor: 2.714

Review 8.  United we sense, divided we fail: context-driven perception of ambiguous visual stimuli.

Authors:  P C Klink; R J A van Wezel; R van Ee
Journal:  Philos Trans R Soc Lond B Biol Sci       Date:  2012-04-05       Impact factor: 6.237

9.  Neural modulation by binocular disparity greatest in human dorsal visual stream.

Authors:  Loredana Minini; Andrew J Parker; Holly Bridge
Journal:  J Neurophysiol       Date:  2010-05-05       Impact factor: 2.714

10.  The selectivity of neurons in the macaque fundus of the superior temporal area for three-dimensional structure from motion.

Authors:  Santosh G Mysore; Rufin Vogels; Steven E Raiguel; James T Todd; Guy A Orban
Journal:  J Neurosci       Date:  2010-11-17       Impact factor: 6.167

