Learned rather than online relative weighting of visual-proprioceptive sensory cues.

Laura Mikula, Valérie Gaveau, Laure Pisella, Aarlenne Z Khan, Gunnar Blohm

Abstract

When reaching to an object, information about the target location as well as the initial hand position is required to program the motor plan for the arm. The initial hand position can be determined by proprioceptive information as well as visual information, if available. Bayes-optimal integration posits that we utilize all information available, with greater weighting on the sense that is more reliable, thus generally weighting visual information more than the usually less reliable proprioceptive information. The criterion by which information is weighted has not been explicitly investigated; it has been assumed that the weights are based on task- and effector-dependent sensory reliability, requiring an explicit neuronal representation of variability. However, the weights could also be determined implicitly through learned modality-specific integration weights rather than effector-dependent reliability. While the former hypothesis predicts different proprioceptive weights for left and right hands, e.g., due to different reliabilities of dominant vs. nondominant hand proprioception, we would expect the same integration weights if the latter hypothesis were true. We found that the proprioceptive weights for the left and right hands were extremely consistent regardless of differences in sensory variability for the two hands as measured in two separate complementary tasks. Thus we propose that proprioceptive weights during reaching are learned across both hands, with high interindividual range but independent of each hand's specific proprioceptive variability.

NEW & NOTEWORTHY

How visual and proprioceptive information about the hand is integrated to plan a reaching movement is still debated. The goal of this study was to clarify how the weights assigned to vision and proprioception during multisensory integration are determined. We found evidence that the integration weights are modality specific rather than based on the sensory reliabilities of the effectors.
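The reliability-based hypothesis that the study tests against corresponds to standard maximum-likelihood cue combination: each cue is weighted by its inverse variance, so a hand with noisier proprioception would receive a lower proprioceptive weight. A minimal sketch of that computation, assuming independent Gaussian noise on each cue (the function name and example variances are illustrative, not values from the study):

```python
def integrate_cues(x_vis, var_vis, x_prop, var_prop):
    """Combine visual and proprioceptive position estimates.

    Under reliability-based (Bayes-optimal) integration, each cue is
    weighted by its inverse variance, so the proprioceptive weight
    would differ between hands whose proprioceptive variances differ.
    """
    w_vis = (1.0 / var_vis) / (1.0 / var_vis + 1.0 / var_prop)
    w_prop = 1.0 - w_vis
    estimate = w_vis * x_vis + w_prop * x_prop
    # The combined estimate is less variable than either cue alone.
    var_combined = 1.0 / (1.0 / var_vis + 1.0 / var_prop)
    return estimate, w_prop, var_combined

# Example: vision more reliable (variance 1.0) than proprioception (4.0),
# so vision dominates (w_prop = 0.2) and the fused variance drops to 0.8.
est, w_prop, var_c = integrate_cues(x_vis=0.0, var_vis=1.0,
                                    x_prop=2.0, var_prop=4.0)
print(est, w_prop, var_c)  # 0.4 0.2 0.8
```

The study's alternative, learned modality-specific weighting, would instead fix `w_prop` at a constant per-individual value regardless of `var_prop` for the particular hand.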

Keywords:  multisensory integration; proprioception; reaching; vision

Year:  2018        PMID: 29465322      PMCID: PMC6008093          DOI: 10.1152/jn.00338.2017

Source DB:  PubMed          Journal:  J Neurophysiol        ISSN: 0022-3077            Impact factor:   2.714


References: 56 in total

1.  Ocular perturbations and retinal/extraretinal information: the coordination of saccadic and manual movements.

Authors:  G Binsted; D Elliott
Journal:  Exp Brain Res       Date:  1999-07       Impact factor: 1.972

2.  Bayesian integration in sensorimotor learning.

Authors:  Konrad P Körding; Daniel M Wolpert
Journal:  Nature       Date:  2004-01-15       Impact factor: 49.962

3.  Merging the senses into a robust percept. (Review)

Authors:  Marc O Ernst; Heinrich H Bülthoff
Journal:  Trends Cogn Sci       Date:  2004-04       Impact factor: 20.229

4.  Causal inference in perception. (Review)

Authors:  Ladan Shams; Ulrik R Beierholm
Journal:  Trends Cogn Sci       Date:  2010-08-11       Impact factor: 20.229

5.  The precision of proprioceptive position sense.

Authors:  R J van Beers; A C Sittig; J J Denier van der Gon
Journal:  Exp Brain Res       Date:  1998-10       Impact factor: 1.972

6.  Integration of visual and haptic informations in the perception of the vertical in young and old healthy adults and right brain-damaged patients.

Authors:  B Braem; J Honoré; M Rousseaux; A Saj; Y Coello
Journal:  Neurophysiol Clin       Date:  2013-11-06       Impact factor: 3.734

7.  Bimanual proprioception: are two hands better than one?

Authors:  Jeremy D Wong; Elizabeth T Wilson; Dinant A Kistemaker; Paul L Gribble
Journal:  J Neurophysiol       Date:  2013-12-31       Impact factor: 2.714

8.  Overconfidence in an objective anticipatory motor task.

Authors:  Pascal Mamassian
Journal:  Psychol Sci       Date:  2008-06

9.  Multi-sensory weights depend on contextual noise in reference frame transformations.

Authors:  Jessica Katherine Burns; Gunnar Blohm
Journal:  Front Hum Neurosci       Date:  2010-12-07       Impact factor: 3.169

10.  Testing whether humans have an accurate model of their own motor uncertainty in a speeded reaching task.

Authors:  Hang Zhang; Nathaniel D Daw; Laurence T Maloney
Journal:  PLoS Comput Biol       Date:  2013-05-23       Impact factor: 4.475

Cited by: 5 in total

1.  Neck muscle spindle noise biases reaches in a multisensory integration task.

Authors:  Parisa Abedi Khoozani; Gunnar Blohm
Journal:  J Neurophysiol       Date:  2018-05-09       Impact factor: 2.714

2.  Using the past to estimate sensory uncertainty.

Authors:  Ulrik Beierholm; Tim Rohe; Ambra Ferrari; Oliver Stegle; Uta Noppeney
Journal:  Elife       Date:  2020-12-15       Impact factor: 8.140

3.  Integration of allocentric and egocentric visual information in a convolutional/multilayer perceptron network model of goal-directed gaze shifts.

Authors:  Parisa Abedi Khoozani; Vishal Bharmauria; Adrian Schütz; Richard P Wildes; J Douglas Crawford
Journal:  Cereb Cortex Commun       Date:  2022-07-08

4.  Motor learning without moving: Proprioceptive and predictive hand localization after passive visuoproprioceptive discrepancy training.

Authors:  Ahmed A Mostafa; Bernard Marius 't Hart; Denise Y P Henriques
Journal:  PLoS One       Date:  2019-08-29       Impact factor: 3.240

5.  The effect of age on visuomotor learning processes.

Authors:  Chad Michael Vachon; Shanaathanan Modchalingam; Bernard Marius 't Hart; Denise Y P Henriques
Journal:  PLoS One       Date:  2020-09-14       Impact factor: 3.240
