
Selective Enhancement of Object Representations through Multisensory Integration.

David A Tovar, Micah M Murray, Mark T Wallace.

Abstract

Objects are the fundamental building blocks of how we create a representation of the external world. One major distinction among objects is between those that are animate versus those that are inanimate. In addition, many objects are specified by more than a single sense, yet the nature by which multisensory objects are represented by the brain remains poorly understood. Using representational similarity analysis of male and female human EEG signals, we show enhanced encoding of audiovisual objects when compared with their corresponding visual and auditory objects. Surprisingly, we discovered that the often-found processing advantages for animate objects were not evident under multisensory conditions. This was due to a greater neural enhancement of inanimate objects, which are more weakly encoded under unisensory conditions. Further analysis showed that the selective enhancement of inanimate audiovisual objects corresponded with an increase in shared representations across brain areas, suggesting that the enhancement was mediated by multisensory integration. Moreover, a distance-to-bound analysis provided critical links between neural findings and behavior. Improvements in neural decoding at the individual exemplar level for audiovisual inanimate objects predicted reaction time differences between multisensory and unisensory presentations during a Go/No-Go animate categorization task. Links between neural activity and behavioral measures were most evident at intervals of 100-200 ms and 350-500 ms after stimulus presentation, corresponding to time periods associated with sensory evidence accumulation and decision-making, respectively. Collectively, these findings provide key insights into a fundamental process the brain uses to maximize the information it captures across sensory systems to perform object recognition.

Significance Statement: Our world is filled with ever-changing sensory information that we are able to seamlessly transform into a coherent and meaningful perceptual experience. We accomplish this feat by combining different stimulus features into objects. However, despite the fact that these features span multiple senses, little is known about how the brain combines the various forms of sensory information into object representations. Here, we used EEG and machine learning to study how the brain processes auditory, visual, and audiovisual objects. Surprisingly, we found that nonliving (i.e., inanimate) objects, which are more difficult to process with one sense alone, benefited the most from engaging multiple senses.
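The core analysis named in the abstract, representational similarity analysis (RSA), can be illustrated with a minimal sketch. The Python example below uses synthetic random data; the object count, channel count, and variable names are illustrative assumptions, not the study's actual parameters or pipeline. It builds a representational dissimilarity matrix (RDM) per condition from pattern-wise correlation distances, then rank-correlates the two RDMs' upper triangles, which is the standard way RSA compares representational geometries across conditions.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

# Hypothetical data: trial-averaged EEG patterns at one time point,
# shape (n_objects, n_channels). A real analysis would use measured
# evoked responses for each object exemplar.
n_objects, n_channels = 12, 64
patterns_av = rng.normal(size=(n_objects, n_channels))  # audiovisual condition
patterns_v = rng.normal(size=(n_objects, n_channels))   # visual-only condition

# Representational dissimilarity matrix (RDM): pairwise correlation
# distance between the object-evoked patterns of one condition.
rdm_av = squareform(pdist(patterns_av, metric="correlation"))
rdm_v = squareform(pdist(patterns_v, metric="correlation"))

# Compare the two representational geometries by rank-correlating
# the upper triangles of the RDMs (diagonal excluded).
iu = np.triu_indices(n_objects, k=1)
rho, p = spearmanr(rdm_av[iu], rdm_v[iu])
print(f"RDM similarity (Spearman rho): {rho:.3f}")
```

In practice the RDMs would be computed at each time point of the EEG epoch, yielding a time course of representational similarity like the 100-200 ms and 350-500 ms windows highlighted above.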
Copyright © 2020 the authors.

Keywords:  EEG; decoding; multisensory integration; object recognition; representational similarity analysis

Year:  2020        PMID: 32499378      PMCID: PMC7363464          DOI: 10.1523/JNEUROSCI.2139-19.2020

Source DB:  PubMed          Journal:  J Neurosci        ISSN: 0270-6474            Impact factor:   6.167


References: 48 in total

1.  Category-specific naming errors in normal subjects: the influence of evolution and experience.

Authors:  K R Laws
Journal:  Brain Lang       Date:  2000-10-15       Impact factor: 2.381

2.  The fur of the crocodile and the mooing sheep: A study of a patient with a category-specific impairment for biological things.

Authors:  Regine Kolinsky; Patrick Fery; Diana Messina; Isabelle Peretz; Sylvie Evinck; Paulo Ventura; Jose Morais
Journal:  Cogn Neuropsychol       Date:  2002-06-01       Impact factor: 2.468

3.  The dog's meow: asymmetrical interaction in cross-modal object recognition.

Authors:  Shlomit Yuval-Greenberg; Leon Y Deouell
Journal:  Exp Brain Res       Date:  2008-12-06       Impact factor: 1.972

4.  The fusiform face area: a module in human extrastriate cortex specialized for face perception.

Authors:  N Kanwisher; J McDermott; M M Chun
Journal:  J Neurosci       Date:  1997-06-01       Impact factor: 6.167

Review 5.  Decoding Dynamic Brain Patterns from Evoked Responses: A Tutorial on Multivariate Pattern Analysis Applied to Time Series Neuroimaging Data.

Authors:  Tijl Grootswagers; Susan G Wardle; Thomas A Carlson
Journal:  J Cogn Neurosci       Date:  2016-10-25       Impact factor: 3.225

Review 6.  Music agnosia and auditory agnosia. Dissociations in stroke patients.

Authors:  Luigi A Vignolo
Journal:  Ann N Y Acad Sci       Date:  2003-11       Impact factor: 5.691

7.  Auditory agnosia.

Authors:  L A Vignolo
Journal:  Philos Trans R Soc Lond B Biol Sci       Date:  1982-06-25       Impact factor: 6.237

8.  Categories of knowledge. Further fractionations and an attempted integration.

Authors:  E K Warrington; R A McCarthy
Journal:  Brain       Date:  1987-10       Impact factor: 13.501

9.  CoSMoMVPA: Multi-Modal Multivariate Pattern Analysis of Neuroimaging Data in Matlab/GNU Octave.

Authors:  Nikolaas N Oosterhof; Andrew C Connolly; James V Haxby
Journal:  Front Neuroinform       Date:  2016-07-22       Impact factor: 4.081

10.  Effective connectivity during animacy perception--dynamic causal modelling of Human Connectome Project data.

Authors:  Hauke Hillebrandt; Karl J Friston; Sarah-Jayne Blakemore
Journal:  Sci Rep       Date:  2014-09-01       Impact factor: 4.379

  2 in total

1.  Stimulus Feature-Specific Information Flow Along the Columnar Cortical Microcircuit Revealed by Multivariate Laminar Spiking Analysis.

Authors:  David A Tovar; Jacob A Westerberg; Michele A Cox; Kacie Dougherty; Thomas A Carlson; Mark T Wallace; Alexander Maier
Journal:  Front Syst Neurosci       Date:  2020-11-30

Review 2.  Visual Influences on Auditory Behavioral, Neural, and Perceptual Processes: A Review.

Authors:  Collins Opoku-Baah; Adriana M Schoenhaut; Sarah G Vassall; David A Tovar; Ramnarayan Ramachandran; Mark T Wallace
Journal:  J Assoc Res Otolaryngol       Date:  2021-05-20
