
Visemic processing in audiovisual discrimination of natural speech: a simultaneous fMRI-EEG study.

Cyril Dubois, Hélène Otzenberger, Daniel Gounot, Rudolph Sock, Marie-Noëlle Metz-Lutz.

Abstract

In a noisy environment, visual perception of articulatory movements improves natural speech intelligibility. Parallel to phonemic processing based on the auditory signal, visemic processing constitutes a counterpart based on "visemes", the distinctive visual units of speech. Aiming to investigate the neural substrates of visemic processing in a disturbed environment, we carried out a simultaneous fMRI-EEG experiment based on discriminating syllabic minimal pairs involving three phonological contrasts, each bearing on a single phonetic feature characterised by different degrees of visual distinctiveness. The contrasts involved either labialisation of the vowels, or place of articulation or voicing of the consonants. Audiovisual consonant-vowel syllable pairs were presented either with a static facial configuration or with a dynamic display of articulatory movements related to speech production. In the sound-disturbed MRI environment, the significant improvement in syllabic discrimination achieved in the dynamic audiovisual modality, compared to the static audiovisual modality, was associated with activation of the occipito-temporal cortex (MT+/V5) bilaterally and of the left premotor cortex. While the former was activated in response to facial movements independently of their relation to speech, the latter was specifically activated by phonological discrimination. During fMRI, significant evoked potential responses to syllabic discrimination were recorded around 150 and 250 ms following the onset of the second stimulus of the pairs; their amplitude was greater in the dynamic than in the static audiovisual modality. Our results provide arguments for the involvement of the speech motor cortex in phonological discrimination, and suggest a multimodal representation of speech units.
Copyright © 2012 Elsevier Ltd. All rights reserved.

Year:  2012        PMID: 22387605     DOI: 10.1016/j.neuropsychologia.2012.02.016

Source DB:  PubMed          Journal:  Neuropsychologia        ISSN: 0028-3932            Impact factor:   3.139


Related articles (4 in total)

1.  Differences in interregional brain connectivity in children with unilateral hearing loss.

Authors:  Matthew E Jung; Miranda Colletta; Rebecca Coalson; Bradley L Schlaggar; Judith E C Lieu
Journal:  Laryngoscope       Date:  2017-04-20       Impact factor: 3.325

2.  Multisensory and modality specific processing of visual speech in different regions of the premotor cortex.

Authors:  Daniel E Callan; Jeffery A Jones; Akiko Callan
Journal:  Front Psychol       Date:  2014-05-05

3.  Multisensory integration of dynamic emotional faces and voices: method for simultaneous EEG-fMRI measurements.

Authors:  Patrick D Schelenz; Martin Klasen; Barbara Reese; Christina Regenbogen; Dhana Wolf; Yutaka Kato; Klaus Mathiak
Journal:  Front Hum Neurosci       Date:  2013-11-14       Impact factor: 3.169

4.  Real-life speech production and perception have a shared premotor-cortical substrate.

Authors:  Olga Glanz Iljina; Johanna Derix; Rajbir Kaur; Andreas Schulze-Bonhage; Peter Auer; Ad Aertsen; Tonio Ball
Journal:  Sci Rep       Date:  2018-06-11       Impact factor: 4.379

