
Effects of phonetic context on audio-visual intelligibility of French.

C Benoît, T Mohamadi, S Kandel.

Abstract

Bimodal perception leads to better speech understanding than auditory perception alone. We evaluated the overall benefit of lip-reading on natural utterances of French produced by a single speaker. Eighteen French subjects with normal hearing and vision were administered a closed-set identification test of VCVCV nonsense words composed of three vowels [i, a, y] and six consonants [b, v, z, ʒ, ʁ, l]. Stimuli were presented under both auditory and audio-visual conditions, with white noise added at various signal-to-noise ratios. Identification scores were higher in the bimodal condition than in the auditory-alone condition, especially where acoustic information was reduced. The auditory and audio-visual intelligibility of the three vowels [i, a, y], averaged over the six consonantal contexts, was also evaluated. Two different intelligibility hierarchies were found: auditorily, [a] was most intelligible, followed by [i] and then [y], whereas visually [y] was most intelligible, followed by [a] and then [i]. We also quantified the contextual effects of the three vowels on the auditory and audio-visual intelligibility of the consonants. Both the auditory and the audio-visual intelligibility of the surrounding consonants were highest in the [a] context, followed by the [i] context and lastly the [y] context.


Year:  1994        PMID: 7823561     DOI: 10.1044/jshr.3705.1195

Source DB:  PubMed          Journal:  J Speech Hear Res        ISSN: 0022-4685


Related Articles (12 in total)

1.  Perceptual fusion and stimulus coincidence in the cross-modal integration of speech.

Authors:  Lee M Miller; Mark D'Esposito
Journal:  J Neurosci       Date:  2005-06-22       Impact factor: 6.167

2.  Cross-language perception of Cantonese vowels spoken by native and non-native speakers.

Authors:  Connie K So; Virginie Attina
Journal:  J Psycholinguist Res       Date:  2014-10

3.  [Review] On the recognition of emotional vocal expressions: motivations for a holistic approach.

Authors:  Anna Esposito; Antonietta M Esposito
Journal:  Cogn Process       Date:  2012-08-08

4.  Silent articulation modulates auditory and audiovisual speech perception.

Authors:  Marc Sato; Emilie Troille; Lucie Ménard; Marie-Agnès Cathiard; Vincent Gracco
Journal:  Exp Brain Res       Date:  2013-04-17       Impact factor: 1.972

5.  Electrophysiological evidence for a self-processing advantage during audiovisual speech integration.

Authors:  Avril Treille; Coriandre Vilain; Sonia Kandel; Marc Sato
Journal:  Exp Brain Res       Date:  2017-07-04       Impact factor: 1.972

6.  Phonetic category recalibration: What are the categories?

Authors:  Eva Reinisch; David R Wozny; Holger Mitterer; Lori L Holt
Journal:  J Phon       Date:  2014-07-01

7.  A multisensory cortical network for understanding speech in noise.

Authors:  Christopher W Bishop; Lee M Miller
Journal:  J Cogn Neurosci       Date:  2009-09       Impact factor: 3.225

8.  The sound of your lips: electrophysiological cross-modal interactions during hand-to-face and face-to-face speech perception.

Authors:  Avril Treille; Coriandre Vilain; Marc Sato
Journal:  Front Psychol       Date:  2014-05-13

9.  On the Links Among Face Processing, Language Processing, and Narrowing During Development.

Authors:  Olivier Pascalis; Hélène Loevenbruck; Paul C Quinn; Sonia Kandel; James W Tanaka; Kang Lee
Journal:  Child Dev Perspect       Date:  2014-06

10.  A possible neurophysiological correlate of audiovisual binding and unbinding in speech perception.

Authors:  Attigodu C Ganesh; Frédéric Berthommier; Coriandre Vilain; Marc Sato; Jean-Luc Schwartz
Journal:  Front Psychol       Date:  2014-11-26
