Behavioral and Neural Representations of Spatial Directions across Words, Schemas, and Images.

Steven M Weisberg, Steven A Marchette, Anjan Chatterjee.

Abstract

Modern spatial navigation requires fluency with multiple representational formats, including visual scenes, signs, and words. These formats convey different information. Visual scenes are rich and specific but contain extraneous details. Arrows, as an example of signs, are schematic representations in which extraneous details are eliminated but analog spatial properties are preserved. Words eliminate all spatial information and convey spatial directions in a purely abstract form. How does the human brain compute spatial directions within and across these formats? To investigate this question, we conducted two experiments on men and women: a preregistered behavioral study and a neuroimaging study using multivoxel pattern analysis of fMRI data to uncover similarities and differences among representational formats. Participants in the behavioral study viewed spatial directions presented as images, schemas, or words (e.g., "left") and indicated on each trial whether the spatial direction was the same as or different from the one viewed previously. They responded more quickly to schemas and words than to images, even though the visual complexity of the stimuli was matched. Participants in the fMRI study performed the same task but responded only to occasional catch trials. Spatial directions presented as images were decodable bilaterally in the intraparietal sulcus, but directions presented as schemas or words were not. Spatial directions were also decodable between all three formats. These results suggest that the intraparietal sulcus plays a role in calculating spatial directions in visual scenes, but that this neural circuitry may be bypassed when spatial directions are presented as schemas or words.

SIGNIFICANCE STATEMENT: Human navigators encounter spatial directions in various formats: words ("turn left"), schematic signs (an arrow showing a left turn), and visual scenes (a road turning left). The brain must transform these spatial directions into a plan for action. Here, we investigate similarities and differences between the neural representations of these formats. We found that the bilateral intraparietal sulci represent spatial directions in visual scenes and across the three formats. We also found that participants responded most quickly to schemas, then words, then images, suggesting that spatial directions in abstract formats are easier to interpret than those in concrete formats. These results support a model of spatial direction interpretation in which spatial directions are either computed for real-world action or computed for efficient visual comparison.
Copyright © 2018 the authors 0270-6474/18/384996-12$15.00/0.
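The cross-format decoding result described in the abstract (a classifier trained on response patterns from one format transfers to another format) can be illustrated with a minimal, hypothetical Python sketch. All data below are simulated, and the ROI size, number of directions, and classifier (scikit-learn's LinearSVC) are illustrative assumptions, not the authors' actual pipeline:

    import numpy as np
    from sklearn.svm import LinearSVC

    rng = np.random.default_rng(0)
    n_trials, n_voxels = 60, 200               # assumed trial count and ROI size (e.g., IPS)
    directions = rng.integers(0, 4, n_trials)  # assume four spatial directions (0=left, 1=right, ...)

    # Simulate a weak direction-specific pattern shared across formats, plus trial noise.
    shared_signal = rng.normal(size=(4, n_voxels))
    def simulate_format(noise=2.0):
        return shared_signal[directions] + rng.normal(scale=noise, size=(n_trials, n_voxels))

    patterns_images = simulate_format()  # voxel patterns for image trials
    patterns_words = simulate_format()   # voxel patterns for word trials

    # Cross-format decoding: fit on image trials, test on word trials.
    clf = LinearSVC(max_iter=10000).fit(patterns_images, directions)
    print(f"cross-format accuracy: {clf.score(patterns_words, directions):.2f} (chance = 0.25)")

Above-chance accuracy in a transfer test of this kind is what licenses the claim that a region carries a format-general code for spatial direction, as opposed to separate format-specific codes.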

Keywords:  MVPA; fMRI; intraparietal sulcus; navigation; representational similarity analysis; spatial directions

Year:  2018        PMID: 29720551      PMCID: PMC5966795          DOI: 10.1523/JNEUROSCI.3250-17.2018

Source DB:  PubMed          Journal:  J Neurosci        ISSN: 0270-6474            Impact factor:   6.167


References (32 in total; first 10 shown):

1.  Viewpoint-specific scene representations in human parahippocampal cortex.

Authors:  Russell Epstein; Kim S Graham; Paul E Downing
Journal:  Neuron       Date:  2003-03-06       Impact factor: 17.173

2.  Improved optimization for the robust and accurate linear registration and motion correction of brain images.

Authors:  Mark Jenkinson; Peter Bannister; Michael Brady; Stephen Smith
Journal:  Neuroimage       Date:  2002-10       Impact factor: 6.556

3.  Coding of navigational affordances in the human visual system.

Authors:  Michael F Bonner; Russell A Epstein
Journal:  Proc Natl Acad Sci U S A       Date:  2017-04-17       Impact factor: 11.205

4.  Parietal cortex codes for egocentric space beyond the field of view.

Authors:  Andreas Schindler; Andreas Bartels
Journal:  Curr Biol       Date:  2012-12-20       Impact factor: 10.834

5.  Probabilistic Maps of Visual Topography in Human Cortex.

Authors:  Liang Wang; Ryan E B Mruczek; Michael J Arcaro; Sabine Kastner
Journal:  Cereb Cortex       Date:  2014-12-01       Impact factor: 5.357

6.  Distributed and overlapping representations of faces and objects in ventral temporal cortex.

Authors:  J V Haxby; M I Gobbini; M L Furey; A Ishai; J L Schouten; P Pietrini
Journal:  Science       Date:  2001-09-28       Impact factor: 47.728

7.  Neural bases of action abstraction.

Authors:  Lorna C Quandt; Yune-Sang Lee; Anjan Chatterjee
Journal:  Biol Psychol       Date:  2017-09-28       Impact factor: 3.251

8.  FSL. (Review)

Authors:  Mark Jenkinson; Christian F Beckmann; Timothy E J Behrens; Mark W Woolrich; Stephen M Smith
Journal:  Neuroimage       Date:  2011-09-16       Impact factor: 6.556

9.  How vision and movement combine in the hippocampal place code.

Authors:  Guifen Chen; John A King; Neil Burgess; John O'Keefe
Journal:  Proc Natl Acad Sci U S A       Date:  2012-12-19       Impact factor: 11.205

10.  Language, perception, and the schematic representation of spatial relations.

Authors:  Prin Amorapanth; Alexander Kranjec; Bianca Bromberger; Matthew Lehet; Page Widick; Adam J Woods; Daniel Y Kimberg; Anjan Chatterjee
Journal:  Brain Lang       Date:  2011-11-08       Impact factor: 2.781

Cited by (1 in total):

1.  Spatial direction comprehension in images, arrows, and words in two patients with posterior cortical atrophy.

Authors:  Steven M Weisberg; Anjan Chatterjee
Journal:  Neuropsychologia       Date:  2020-12-03       Impact factor: 3.139

