
Inverse effectiveness and multisensory interactions in visual event-related potentials with audiovisual speech.

Ryan A Stevenson, Maxim Bushmakin, Sunah Kim, Mark T Wallace, Aina Puce, Thomas W James.

Abstract

In recent years, it has become evident that neural responses previously considered to be unisensory can be modulated by sensory input from other modalities. In this regard, visual neural activity elicited to viewing a face is strongly influenced by concurrent incoming auditory information, particularly speech. Here, we applied an additive-factors paradigm aimed at quantifying the impact that auditory speech has on visual event-related potentials (ERPs) elicited to visual speech. These multisensory interactions were measured across parametrically varied stimulus salience, quantified in terms of signal to noise, to provide novel insights into the neural mechanisms of audiovisual speech perception. First, we measured a monotonic increase of the amplitude of the visual P1-N1-P2 ERP complex during a spoken-word recognition task with increases in stimulus salience. ERP component amplitudes varied directly with stimulus salience for visual, audiovisual, and summed unisensory recordings. Second, we measured changes in multisensory gain across salience levels. During audiovisual speech, the P1 and P1-N1 components exhibited less multisensory gain relative to the summed unisensory components with reduced salience, while N1-P2 amplitude exhibited greater multisensory gain as salience was reduced, consistent with the principle of inverse effectiveness. The amplitude interactions were correlated with behavioral measures of multisensory gain across salience levels as measured by response times, suggesting that change in multisensory gain associated with unisensory salience modulations reflects an increased efficiency of visual speech processing.


Year: 2012    PMID: 22367585    PMCID: PMC3789520    DOI: 10.1007/s10548-012-0220-7

Source DB: PubMed    Journal: Brain Topogr    ISSN: 0896-0267    Impact factor: 3.020


References: 98 in total

1.  Evidence from functional magnetic resonance imaging of crossmodal binding in the human heteromodal cortex.

Authors:  G A Calvert; R Campbell; M J Brammer
Journal:  Curr Biol       Date:  2000-06-01       Impact factor: 10.834

2.  Bimodal speech: early suppressive visual effects in human auditory cortex.

Authors:  Julien Besle; Alexandra Fort; Claude Delpuech; Marie-Hélène Giard
Journal:  Eur J Neurosci       Date:  2004-10       Impact factor: 3.386

3.  When audition alters vision: an event-related potential study of the cross-modal interactions between faces and voices.

Authors:  F Joassin; P Maurage; R Bruyer; M Crommelinck; S Campanella
Journal:  Neurosci Lett       Date:  2004-10-14       Impact factor: 3.046

4.  Repetition-induced changes in BOLD response reflect accumulation of neural activity.

Authors:  Thomas W James; Isabel Gauthier
Journal:  Hum Brain Mapp       Date:  2006-01       Impact factor: 5.038

5.  Neural responses elicited to face motion and vocalization pairings.

Authors:  Aina Puce; James A Epling; James C Thompson; Olivia K Carrick
Journal:  Neuropsychologia       Date:  2007-01-07       Impact factor: 3.139

6.  Detection of audio-visual integration sites in humans by application of electrophysiological criteria to the BOLD effect.

Authors:  G A Calvert; P C Hansen; S D Iversen; M J Brammer
Journal:  Neuroimage       Date:  2001-08       Impact factor: 6.556

7.  Preperceptual and stimulus-selective enhancement of low-level human visual cortex excitability by sounds.

Authors:  Vincenzo Romei; Micah M Murray; Céline Cappe; Gregor Thut
Journal:  Curr Biol       Date:  2009-10-15       Impact factor: 10.834

8.  Top-down and bottom-up modulation in processing bimodal face/voice stimuli.

Authors:  Marianne Latinus; Rufin VanRullen; Margot J Taylor
Journal:  BMC Neurosci       Date:  2010-03-11       Impact factor: 3.288

9. (Review) Multisensory connections of monkey auditory cerebral cortex.

Authors:  John F Smiley; Arnaud Falchier
Journal:  Hear Res       Date:  2009-07-18       Impact factor: 3.208

10.  Sound alters activity in human V1 in association with illusory visual perception.

Authors:  S Watkins; L Shams; S Tanaka; J-D Haynes; G Rees
Journal:  Neuroimage       Date:  2006-03-23       Impact factor: 6.556

Cited by: 22 in total

1.  EEG gamma-band activity during audiovisual speech comprehension in different noise environments.

Authors:  Yanfei Lin; Baolin Liu; Zhiwen Liu; Xiaorong Gao
Journal:  Cogn Neurodyn       Date:  2015-02-22       Impact factor: 5.082

2.  Multisensory speech perception in autism spectrum disorder: From phoneme to whole-word perception.

Authors:  Ryan A Stevenson; Sarah H Baum; Magali Segers; Susanne Ferber; Morgan D Barense; Mark T Wallace
Journal:  Autism Res       Date:  2017-03-24       Impact factor: 5.216

3.  Perceptual and categorical decision making: goal-relevant representation of two domains at different levels of abstraction.

Authors:  Swetha Shankar; Andrew S Kayser
Journal:  J Neurophysiol       Date:  2017-03-01       Impact factor: 2.714

4.  Links between temporal acuity and multisensory integration across life span.

Authors:  Ryan A Stevenson; Sarah H Baum; Juliane Krueger; Paul A Newhouse; Mark T Wallace
Journal:  J Exp Psychol Hum Percept Perform       Date:  2017-04-27       Impact factor: 3.332

5.  Stimulus intensity modulates multisensory temporal processing.

Authors:  Juliane Krueger Fister; Ryan A Stevenson; Aaron R Nidiffer; Zachary P Barnett; Mark T Wallace
Journal:  Neuropsychologia       Date:  2016-02-23       Impact factor: 3.139

6. (Review) The construct of the multisensory temporal binding window and its dysregulation in developmental disabilities.

Authors:  Mark T Wallace; Ryan A Stevenson
Journal:  Neuropsychologia       Date:  2014-08-13       Impact factor: 3.139

7.  Response Errors in Females' and Males' Sentence Lipreading Necessitate Structurally Different Models for Predicting Lipreading Accuracy.

Authors:  Lynne E Bernstein
Journal:  Lang Learn       Date:  2018-02-26

8.  Neural networks supporting audiovisual integration for speech: A large-scale lesion study.

Authors:  Gregory Hickok; Corianne Rogalsky; William Matchin; Alexandra Basilakos; Julia Cai; Sara Pillay; Michelle Ferrill; Soren Mickelsen; Steven W Anderson; Tracy Love; Jeffrey Binder; Julius Fridriksson
Journal:  Cortex       Date:  2018-04-10       Impact factor: 4.027

9.  Deficits in audiovisual speech perception in normal aging emerge at the level of whole-word recognition.

Authors:  Ryan A Stevenson; Caitlin E Nelms; Sarah H Baum; Lilia Zurkovsky; Morgan D Barense; Paul A Newhouse; Mark T Wallace
Journal:  Neurobiol Aging       Date:  2014-08-07       Impact factor: 4.673

10.  Interactions between space and effectiveness in human multisensory performance.

Authors:  Aaron R Nidiffer; Ryan A Stevenson; Juliane Krueger Fister; Zachary P Barnett; Mark T Wallace
Journal:  Neuropsychologia       Date:  2016-01-27       Impact factor: 3.139

