Leah M M McGuire, Philip N Sabes.
Abstract
The planning and control of sensory-guided movements requires the integration of multiple sensory streams. Although the information conveyed by different sensory modalities is often overlapping, the shared information is represented differently across modalities during the early stages of cortical processing. We ask how these diverse sensory signals are represented in multimodal sensorimotor areas of cortex in macaque monkeys. Although a common modality-independent representation might facilitate downstream readout, previous studies have found that modality-specific representations in multimodal cortex reflect upstream spatial representations. For example, visual signals have a more eye-centered representation. We recorded neural activity from two parietal areas involved in reach planning, area 5 and the medial intraparietal area (MIP), as animals reached to visual, combined visual and proprioceptive, and proprioceptive targets while fixing their gaze on another location. In contrast to other multimodal cortical areas, the same spatial representations are used to represent visual and proprioceptive signals in both area 5 and MIP. However, these representations are heterogeneous. Although we observed a posterior-to-anterior gradient in population responses in parietal cortex, from more eye-centered to more hand- or body-centered representations, we do not observe the simple and discrete reference frame representations suggested by studies that focused on identifying the "best-match" reference frame for a given cortical area. In summary, we find modality-independent representations of spatial information in parietal cortex, although these representations are complex and heterogeneous.
Year: 2011 PMID: 21543595 PMCID: PMC3100795 DOI: 10.1523/JNEUROSCI.2921-10.2011
Source DB: PubMed Journal: J Neurosci ISSN: 0270-6474 Impact factor: 6.167