
Computational Mechanisms for Perceptual Stability using Disparity and Motion Parallax.

Oliver W Layton; Brett R Fajen

Abstract

Walking and other forms of self-motion create global motion patterns across our eyes. With the resulting stream of visual signals, how do we perceive ourselves as moving through a stable world? Although the neural mechanisms are largely unknown, human studies (Warren and Rushton, 2009) provide strong evidence that the visual system is capable of parsing the global motion into two components: one due to self-motion and the other due to independently moving objects. In the present study, we use computational modeling to investigate potential neural mechanisms for stabilizing visual perception during self-motion that build on the neurophysiology of the middle temporal (MT) and medial superior temporal (MST) areas. One such mechanism leverages the direction, speed, and disparity tuning of cells in dorsal MST (MSTd) to estimate the combined motion parallax and disparity signals attributed to the observer's self-motion. Feedback from the most active MSTd cell subpopulations suppresses motion signals in MT that locally match the preference of the MSTd cell in both parallax and disparity. This mechanism, combined with local surround inhibition in MT, allows the model to estimate self-motion while maintaining a sparse motion representation that is compatible with perceptual stability. A key consequence is that after signals compatible with the observer's self-motion are suppressed, the direction of independently moving objects is represented in a world-relative rather than observer-relative reference frame. Our analysis explicates how temporal dynamics and joint motion parallax-disparity tuning resolve the world-relative motion of moving objects and establish perceptual stability. Together, these mechanisms capture findings on the perception of object motion during self-motion.

Significance Statement

The image integrated by our eyes as we move through our environment undergoes constant flux as trees, buildings, and other surroundings stream by us. If our view can change so radically from one moment to the next, how do we perceive a stable world? Although progress has been made in understanding how this works, little is known about the underlying brain mechanisms. We propose a computational solution whereby multiple brain areas communicate to suppress the motion attributed to our movement relative to the stationary world, which is often responsible for a large proportion of the flux across the visual field. We simulated the proposed neural mechanisms and tested model estimates using data from human perceptual studies.
Copyright © 2020 the authors.
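The core mechanism summarized in the abstract (MSTd template matching over MT responses, followed by feedback that suppresses MT signals consistent with the estimated self-motion) can be illustrated with a minimal sketch. This is not the authors' implementation: the two-template set, rectified cosine tuning, and all names and parameters below are hypothetical simplifications, and the disparity and speed dimensions are omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

n_loc = 50
dirs = np.linspace(0, 2 * np.pi, 8, endpoint=False)   # MT direction preferences

# Two candidate self-motion templates: the flow direction expected at each
# location under two hypothetical headings.
templates = np.stack([rng.choice(dirs, n_loc) for _ in range(2)])

# Scene flow: self-motion consistent with template 0, plus one independently
# moving object at location 10 heading opposite to the background flow there.
flow = templates[0].copy()
flow[10] = templates[0][10] + np.pi

# MT population response: half-wave rectified cosine tuning to local direction.
mt = np.cos(flow[:, None] - dirs[None, :]).clip(0)

# MSTd activation: how well each template accounts for the MT responses.
mstd = np.array([
    np.cos(t[:, None] - dirs[None, :]).clip(0).ravel() @ mt.ravel()
    for t in templates
])

# Feedback: the winning MSTd template suppresses matching MT signals.
win = templates[np.argmax(mstd)]
residual = (mt - np.cos(win[:, None] - dirs[None, :]).clip(0)).clip(0)

# The residual representation is sparse: only the object location survives.
object_locs = np.where(residual.sum(axis=1) > 1e-6)[0]
print(object_locs)   # → [10]
```

After suppression, the only remaining MT activity marks the independently moving object, consistent with the sparse, stability-compatible representation the abstract describes; the full model additionally exploits joint parallax-disparity tuning and temporal dynamics, which this sketch leaves out.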


Keywords:  MSTd; MT; motion; object motion; optic flow; self-motion

Year:  2019        PMID: 31699889      PMCID: PMC6989005          DOI: 10.1523/JNEUROSCI.0036-19.2019

Source DB:  PubMed          Journal:  J Neurosci        ISSN: 0270-6474            Impact factor:   6.167


References:  58 in total

1.  Coding of horizontal disparity and velocity by MT neurons in the alert macaque.

Authors:  Gregory C DeAngelis; Takanori Uka
Journal:  J Neurophysiol       Date:  2003-02       Impact factor: 2.714

2.  A motion pooling model of visually guided navigation explains human behavior in the presence of independently moving objects.

Authors:  Oliver W Layton; Ennio Mingolla; N Andrew Browning
Journal:  J Vis       Date:  2012-01-24       Impact factor: 2.240

3.  Segregation of global and local motion processing in primate middle temporal visual area.

Authors:  R T Born; R B Tootell
Journal:  Nature       Date:  1992-06-11       Impact factor: 49.962

4.  Diverse suppressive influences in area MT and selectivity to complex motion features.

Authors:  Yuwei Cui; Liu D Liu; Farhan A Khawaja; Christopher C Pack; Daniel A Butts
Journal:  J Neurosci       Date:  2013-10-16       Impact factor: 6.167

5.  [Review] The free-energy principle: a unified brain theory?

Authors:  Karl Friston
Journal:  Nat Rev Neurosci       Date:  2010-01-13       Impact factor: 34.870

6.  Selectivity of macaque MT/V5 neurons for surface orientation in depth specified by motion.

Authors:  D K Xiao; V L Marcar; S E Raiguel; G A Orban
Journal:  Eur J Neurosci       Date:  1997-05       Impact factor: 3.386

7.  Center-surround interactions in the middle temporal visual area of the owl monkey.

Authors:  R T Born
Journal:  J Neurophysiol       Date:  2000-11       Impact factor: 2.714

8.  Cortical dynamics of navigation and steering in natural scenes: Motion-based object segmentation, heading, and obstacle avoidance.

Authors:  N Andrew Browning; Stephen Grossberg; Ennio Mingolla
Journal:  Neural Netw       Date:  2009-05-23

9.  Subtractive, divisive and non-monotonic gain control in feedforward nets linearized by noise and delays.

Authors:  Jorge F Mejias; Alexandre Payeur; Erik Selin; Leonard Maler; André Longtin
Journal:  Front Comput Neurosci       Date:  2014-02-25       Impact factor: 2.380

10.  Competitive Dynamics in MSTd: A Mechanism for Robust Heading Perception Based on Optic Flow.

Authors:  Oliver W Layton; Brett R Fajen
Journal:  PLoS Comput Biol       Date:  2016-06-24       Impact factor: 4.475

Cited by:  3 in total

1.  A neural mechanism for detecting object motion during self-motion.

Authors:  HyungGoo R Kim; Dora E Angelaki; Gregory C DeAngelis
Journal:  Elife       Date:  2022-06-01       Impact factor: 8.713

2.  ARTFLOW: A Fast, Biologically Inspired Neural Network that Learns Optic Flow Templates for Self-Motion Estimation.

Authors:  Oliver W Layton
Journal:  Sensors (Basel)       Date:  2021-12-08       Impact factor: 3.576

3.  Distributed encoding of curvilinear self-motion across spiral optic flow patterns.

Authors:  Oliver W Layton; Brett R Fajen
Journal:  Sci Rep       Date:  2022-08-04       Impact factor: 4.996

