
Electrophysiological evidence for speech-specific audiovisual integration.

Martijn Baart, Jeroen J Stekelenburg, Jean Vroomen.

Abstract

Lip-read speech is integrated with heard speech at various neural levels. Here, we investigated the extent to which lip-read-induced modulations of the auditory N1 and P2 (measured with EEG) are indicative of speech-specific audiovisual integration, and we explored to what extent the ERPs were modulated by phonetic audiovisual congruency. In order to disentangle speech-specific (phonetic) integration from non-speech integration, we used sine-wave speech (SWS) that was perceived as speech by half of the participants (they were in speech mode), while the other half was in non-speech mode. Results showed that the N1 obtained with audiovisual stimuli peaked earlier than the N1 evoked by auditory-only stimuli. This lip-read-induced speeding up of the N1 occurred for listeners in speech and non-speech mode. In contrast, if listeners were in speech mode, lip-read speech also modulated the auditory P2, but not if listeners were in non-speech mode, thus revealing speech-specific audiovisual binding. Comparing ERPs for phonetically congruent audiovisual stimuli with ERPs for incongruent stimuli revealed an effect of phonetic stimulus congruency that started at ~200 ms after (in)congruence became apparent. Critically, akin to the P2 suppression, congruency effects were only observed if listeners were in speech mode, and not if they were in non-speech mode. Using identical stimuli, we thus confirm that audiovisual binding involves (partially) different neural mechanisms for sound processing in speech and non-speech mode.
© 2013 Published by Elsevier Ltd.

Keywords:  Audiovisual integration; Audiovisual speech; N1; P2; Sine-wave speech

Year:  2013        PMID: 24291340     DOI: 10.1016/j.neuropsychologia.2013.11.011

Source DB:  PubMed          Journal:  Neuropsychologia        ISSN: 0028-3932            Impact factor:   3.139


Related articles: 33 in total

1.  Developmental Shifts in Detection and Attention for Auditory, Visual, and Audiovisual Speech.

Authors:  Susan Jerger; Markus F Damian; Cassandra Karl; Hervé Abdi
Journal:  J Speech Lang Hear Res       Date:  2018-12-10       Impact factor: 2.297

2.  Phonetic matching of auditory and visual speech develops during childhood: evidence from sine-wave speech.

Authors:  Martijn Baart; Heather Bortfeld; Jean Vroomen
Journal:  J Exp Child Psychol       Date:  2014-09-23

3.  The role of emotion in dynamic audiovisual integration of faces and voices.

Authors:  Jenny Kokinous; Sonja A Kotz; Alessandro Tavano; Erich Schröger
Journal:  Soc Cogn Affect Neurosci       Date:  2014-08-20       Impact factor: 3.436

4.  The COGs (context, object, and goals) in multisensory processing. (Review)

Authors:  Sanne ten Oever; Vincenzo Romei; Nienke van Atteveldt; Salvador Soto-Faraco; Micah M Murray; Pawel J Matusz
Journal:  Exp Brain Res       Date:  2016-03-01       Impact factor: 1.972

5.  Eye Can Hear Clearly Now: Inverse Effectiveness in Natural Audiovisual Speech Processing Relies on Long-Term Crossmodal Temporal Integration.

Authors:  Michael J Crosse; Giovanni M Di Liberto; Edmund C Lalor
Journal:  J Neurosci       Date:  2016-09-21       Impact factor: 6.167

6.  Neurophysiology underlying influence of stimulus reliability on audiovisual integration.

Authors:  Hannah Shatzer; Stanley Shen; Jess R Kerlin; Mark A Pitt; Antoine J Shahin
Journal:  Eur J Neurosci       Date:  2018-02-09       Impact factor: 3.386

7.  Electrophysiological evidence for a self-processing advantage during audiovisual speech integration.

Authors:  Avril Treille; Coriandre Vilain; Sonia Kandel; Marc Sato
Journal:  Exp Brain Res       Date:  2017-07-04       Impact factor: 1.972

8.  The impact of when, what and how predictions on auditory speech perception.

Authors:  Serge Pinto; Pascale Tremblay; Anahita Basirat; Marc Sato
Journal:  Exp Brain Res       Date:  2019-10-01       Impact factor: 1.972

9.  Fixating the eyes of a speaker provides sufficient visual information to modulate early auditory processing.

Authors:  Elina Kaplan; Alexandra Jesse
Journal:  Biol Psychol       Date:  2019-07-16       Impact factor: 3.251

10.  Electrophysiological correlates of individual differences in perception of audiovisual temporal asynchrony.

Authors:  Natalya Kaganovich; Jennifer Schumaker
Journal:  Neuropsychologia       Date:  2016-04-16       Impact factor: 3.139

