Multisensory spatial representations in eye-centered coordinates for reaching.

Alexandre Pouget, Jean Christophe Ducom, Jeffrey Torri, Daphne Bavelier.

Abstract

Humans can reach for objects with their hands whether the objects are seen, heard or touched. Thus, the position of objects is recoded in a joint-centered frame of reference regardless of the sensory modality involved. Our study indicates that this frame of reference is not the only one shared across sensory modalities. The location of reaching targets is also encoded in eye-centered coordinates, whether the targets are visual, auditory, proprioceptive or imaginary. Furthermore, the remembered eye-centered location is updated after each eye and head movement. This is quite surprising since, in principle, a reaching motor command can be computed from any non-visual modality without ever recovering the eye-centered location of the stimulus. This finding may reflect the predominant role of vision in human spatial perception.
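The claim that non-visual reaching targets are nonetheless stored and remapped in eye-centered coordinates can be pictured with a toy vector model. The sketch below is purely illustrative and not taken from the paper: it assumes a simple additive relation between head-centered target position, gaze direction, and the eye-centered representation, and shows how a remembered eye-centered location would have to shift after a saccade even though the external target never moves.

```python
import numpy as np

# Illustrative sketch only (not the authors' model): a simplified 2-D vector
# account of storing a target in eye-centered coordinates and remapping it
# after an eye movement. The additive gaze model and all names are
# assumptions made for illustration.

def to_eye_centered(target_head, gaze_head):
    """Express a head-centered target location relative to the current gaze."""
    return target_head - gaze_head

def update_after_saccade(target_eye, gaze_shift):
    """Remap a remembered eye-centered location after a gaze shift.

    If the eyes move by `gaze_shift` (in head-centered coordinates), the stored
    eye-centered location must shift by the opposite amount for the memory to
    keep pointing at the same external target.
    """
    return target_eye - gaze_shift

# Example: an auditory target 10 deg right and 5 deg up of the head midline,
# while initially fixating 2 deg to the right.
target_head = np.array([10.0, 5.0])
gaze_head = np.array([2.0, 0.0])

target_eye = to_eye_centered(target_head, gaze_head)            # [8., 5.]

# After a saccade 6 deg further to the right, the remembered eye-centered
# location is remapped, although the external target has not moved.
target_eye_updated = update_after_saccade(target_eye, np.array([6.0, 0.0]))
print(target_eye_updated)                                       # [2., 5.]
```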


Year:  2002        PMID: 11814488     DOI: 10.1016/s0010-0277(01)00163-9

Source DB:  PubMed          Journal:  Cognition        ISSN: 0010-0277


Related records: 46 in total

1.  Geometric computations underlying eye-hand coordination: orientations of the two eyes and the head.

Authors:  D Y P Henriques; W P Medendorp; C C A M Gielen; J D Crawford
Journal:  Exp Brain Res       Date:  2003-06-26       Impact factor: 1.972

2.  Interaction between gaze and visual and proprioceptive position judgements.

Authors:  Katja Fiehler; Frank Rösler; Denise Y P Henriques
Journal:  Exp Brain Res       Date:  2010-04-29       Impact factor: 1.972

3.  Touch used to guide action is partially coded in a visual reference frame.

Authors:  Vanessa Harrar; Laurence R Harris
Journal:  Exp Brain Res       Date:  2010-04-29       Impact factor: 1.972

4.  Effects of hand termination and accuracy constraint on eye-hand coordination during sequential two-segment movements.

Authors:  Miya K Rand; George E Stelmach
Journal:  Exp Brain Res       Date:  2010-10-22       Impact factor: 1.972

5.  Crossing the hands is more confusing for females than males.

Authors:  Michelle L Cadieux; Michael Barnett-Cowan; David I Shore
Journal:  Exp Brain Res       Date:  2010-06-24       Impact factor: 1.972

6.  Integration of anatomical and external response mappings explains crossing effects in tactile localization: A probabilistic modeling approach.

Authors:  Stephanie Badde; Tobias Heed; Brigitte Röder
Journal:  Psychon Bull Rev       Date:  2016-04

7.  Visual motion due to eye movements helps guide the hand.

Authors:  David Whitney; Melvyn A Goodale
Journal:  Exp Brain Res       Date:  2005-01-15       Impact factor: 1.972

8.  An object-centred reference frame for control of grasping: effects of grasping a distractor object on visuomotor control.

Authors:  Sandhiran Patchay; Patrick Haggard; Umberto Castiello
Journal:  Exp Brain Res       Date:  2005-11-23       Impact factor: 1.972

9.  Why does the brain predict sensory consequences of oculomotor commands? Optimal integration of the predicted and the actual sensory feedback.

Authors:  Siavash Vaziri; Jörn Diedrichsen; Reza Shadmehr
Journal:  J Neurosci       Date:  2006-04-19       Impact factor: 6.167

10.  Development of kinesthetic-motor and auditory-motor representations in school-aged children.

Authors:  Florian A Kagerer; Jane E Clark
Journal:  Exp Brain Res       Date:  2015-04-26       Impact factor: 1.972

