
Seeing to hear better: evidence for early audio-visual interactions in speech identification.

Jean-Luc Schwartz, Frédéric Berthommier, Christophe Savariaux.

Abstract

Lip reading is the ability to partially understand speech by looking at the speaker's lips. It improves the intelligibility of speech in noise when audio-visual perception is compared with audio-only perception. A recent set of experiments showed that seeing the speaker's lips also enhances sensitivity to acoustic information, decreasing the auditory detection threshold of speech embedded in noise [J. Acoust. Soc. Am. 109 (2001) 2272; J. Acoust. Soc. Am. 108 (2000) 1197]. However, detection is different from comprehension, and it remains to be seen whether improved sensitivity also results in an intelligibility gain in audio-visual speech perception. In this work, we use an original paradigm to show that seeing the speaker's lips enables the listener to hear better and hence to understand better. The audio-visual stimuli used here could not be differentiated by lip reading per se, since they contained exactly the same lip gesture matched with different compatible speech sounds. Nevertheless, the noise-masked stimuli were more intelligible in the audio-visual condition than in the audio-only condition, due to the contribution of visual information to the extraction of acoustic cues. Replacing the lip gesture by a non-speech visual input with exactly the same time course, providing the same temporal cues for extraction, removed the intelligibility benefit. This early contribution to audio-visual speech identification is discussed in relation to recent neurophysiological data on audio-visual perception.


Year:  2004        PMID: 15147940     DOI: 10.1016/j.cognition.2004.01.006

Source DB:  PubMed          Journal:  Cognition        ISSN: 0010-0277


Related articles (71 in total)

1.  Developmental Shifts in Detection and Attention for Auditory, Visual, and Audiovisual Speech.

Authors:  Susan Jerger; Markus F Damian; Cassandra Karl; Hervé Abdi
Journal:  J Speech Lang Hear Res       Date:  2018-12-10       Impact factor: 2.297

2. (Review) Perceptuo-motor interactions in the perceptual organization of speech: evidence from the verbal transformation effect.

Authors:  Anahita Basirat; Jean-Luc Schwartz; Marc Sato
Journal:  Philos Trans R Soc Lond B Biol Sci       Date:  2012-04-05       Impact factor: 6.237

3.  Multistage audiovisual integration of speech: dissociating identification and detection.

Authors:  Kasper Eskelund; Jyrki Tuomainen; Tobias S Andersen
Journal:  Exp Brain Res       Date:  2010-12-25       Impact factor: 1.972

4. (Review) Cued speech for enhancing speech perception and first language development of children with cochlear implants.

Authors:  Jacqueline Leybaert; Carol J LaSasso
Journal:  Trends Amplif       Date:  2010-06

5.  Evidence that cochlear-implanted deaf patients are better multisensory integrators.

Authors:  J Rouger; S Lagleyre; B Fraysse; S Deneve; O Deguine; P Barone
Journal:  Proc Natl Acad Sci U S A       Date:  2007-04-02       Impact factor: 11.205

6.  Dynamic faces speed up the onset of auditory cortical spiking responses during vocal detection.

Authors:  Chandramouli Chandrasekaran; Luis Lemus; Asif A Ghazanfar
Journal:  Proc Natl Acad Sci U S A       Date:  2013-11-11       Impact factor: 11.205

7.  Interactions between the superior temporal sulcus and auditory cortex mediate dynamic face/voice integration in rhesus monkeys.

Authors:  Asif A Ghazanfar; Chandramouli Chandrasekaran; Nikos K Logothetis
Journal:  J Neurosci       Date:  2008-04-23       Impact factor: 6.167

8.  A visual or tactile signal makes auditory speech detection more efficient by reducing uncertainty.

Authors:  Bosco S Tjan; Ewen Chao; Lynne E Bernstein
Journal:  Eur J Neurosci       Date:  2014-01-09       Impact factor: 3.386

9.  Cross-modal prediction in speech depends on prior linguistic experience.

Authors:  Carolina Sánchez-García; James T Enns; Salvador Soto-Faraco
Journal:  Exp Brain Res       Date:  2013-02-06       Impact factor: 1.972

10.  Neural development of networks for audiovisual speech comprehension.

Authors:  Anthony Steven Dick; Ana Solodkin; Steven L Small
Journal:  Brain Lang       Date:  2009-09-24       Impact factor: 2.381

