
Time course of allocentric decay, egocentric decay, and allocentric-to-egocentric conversion in memory-guided reach.

Ying Chen, Patrick Byrne, J Douglas Crawford.

Abstract

Allocentric cues can be used to encode locations in visuospatial memory, but it is not known how and when these representations are converted into egocentric commands for behaviour. Here, we tested the influence of different memory intervals on reach performance toward targets defined in either egocentric or allocentric coordinates, and then compared this to performance in a task where subjects were implicitly free to choose when to convert from allocentric to egocentric representations. Reach and eye positions were measured using Optotrak and EyeLink systems, respectively, in fourteen subjects. Our results confirm that egocentric representations degrade over a delay of several seconds, whereas allocentric representations remained relatively stable over the same time scale. Moreover, when subjects were free to choose, they converted allocentric representations into egocentric representations as soon as possible, despite the apparent cost in reach precision in our experimental paradigm. This suggests that humans convert allocentric representations into egocentric commands at the first opportunity, perhaps to minimize motor noise and optimize movement timing in real-world conditions.
Copyright © 2010 Elsevier Ltd. All rights reserved.


Year: 2010     PMID: 21056048     DOI: 10.1016/j.neuropsychologia.2010.10.031

Source DB: PubMed     Journal: Neuropsychologia     ISSN: 0028-3932     Impact factor: 3.139


Related articles: 17 in total

1.  Fundamental limits on persistent activity in networks of noisy neurons.

Authors:  Yoram Burak; Ila R Fiete
Journal:  Proc Natl Acad Sci U S A       Date:  2012-10-09       Impact factor: 11.205

2.  Different "routes" to a cognitive map: dissociable forms of spatial knowledge derived from route and cartographic map learning.

Authors:  Hui Zhang; Ksenia Zherdeva; Arne D Ekstrom
Journal:  Mem Cognit       Date:  2014-10

3.  Evidence for distinct brain networks in the control of rule-based motor behavior.

Authors:  Joshua A Granek; Lauren E Sergio
Journal:  J Neurophysiol       Date:  2015-07-01       Impact factor: 2.714

4.  Integration of allocentric and egocentric visual information in a convolutional/multilayer perceptron network model of goal-directed gaze shifts.

Authors:  Parisa Abedi Khoozani; Vishal Bharmauria; Adrian Schütz; Richard P Wildes; J Douglas Crawford
Journal:  Cereb Cortex Commun       Date:  2022-07-08

5.  Frames of reference and categorical/coordinate spatial relations in a "what was where" task.

Authors:  Francesco Ruotolo; Tina Iachini; Gennaro Ruggiero; Ineke J M van der Ham; Albert Postma
Journal:  Exp Brain Res       Date:  2016-05-14       Impact factor: 1.972

6.  MAGELLAN: a cognitive map-based model of human wayfinding.

Authors:  Jeremy R Manning; Timothy F Lew; Ningcheng Li; Robert Sekuler; Michael J Kahana
Journal:  J Exp Psychol Gen       Date:  2014-02-03

7.  No effect of delay on the spatial representation of serial reach targets.

Authors:  Immo Schütz; Denise Y P Henriques; Katja Fiehler
Journal:  Exp Brain Res       Date:  2015-01-20       Impact factor: 1.972

8.  Hierarchical Integration of Communicative and Spatial Perspective-Taking Demands in Sensorimotor Control of Referential Pointing.

Authors:  Rui Liu (刘睿); Sara Bögels; Geoffrey Bird; W Pieter Medendorp; Ivan Toni
Journal:  Cogn Sci       Date:  2022-01

9.  Visual targets aren't irreversibly converted to motor coordinates: eye-centered updating of visuospatial memory in online reach control.

Authors:  Aidan A Thompson; Patrick A Byrne; Denise Y P Henriques
Journal:  PLoS One       Date:  2014-03-18       Impact factor: 3.240

10.  Spatial task context makes short-latency reaches prone to induced Roelofs illusion.

Authors:  Bahareh Taghizadeh; Alexander Gail
Journal:  Front Hum Neurosci       Date:  2014-08-29       Impact factor: 3.169


Beijing Coyote Bioscience Co., Ltd. © 2022-2023.