
Hearing lips in a second language: visual articulatory information enables the perception of second language sounds.

Jordi Navarra, Salvador Soto-Faraco

Abstract

We investigated the effects of visual speech information (articulatory gestures) on the perception of second language (L2) sounds. Previous studies have demonstrated that listeners often fail to hear the difference between certain non-native phonemic contrasts, as in the case of Spanish native speakers and the Catalan sounds /ɛ/ and /e/. Here, we tested whether adding visual information about the articulatory gestures (i.e., lip movements) could enhance this perceptual ability. We found that, for auditory-only presentations, Spanish-dominant bilinguals failed to show sensitivity to the /ɛ/-/e/ contrast, whereas Catalan-dominant bilinguals did. However, when the same speech events were presented audiovisually, Spanish-dominant bilinguals (as well as Catalan-dominant bilinguals) were sensitive to the phonemic contrast. Finally, when the stimuli were presented only visually (in the absence of sound), neither group showed clear signs of discrimination. Our results suggest that visual speech gestures enhance second language perception at the level of phonological processing, in particular by way of multisensory integration.


Year:  2005        PMID: 16362332     DOI: 10.1007/s00426-005-0031-5

Source DB:  PubMed          Journal:  Psychol Res        ISSN: 0340-0727


References: 37 in total

1.  Online processing of native and non-native phonemic contrasts in early bilinguals.

Authors:  N Sebastián-Gallés; S Soto-Faraco
Journal:  Cognition       Date:  1999-09-30

2.  Stimulus-based lexical distinctiveness as a general word-recognition mechanism.

Authors:  Sven L Mattys; Lynne E Bernstein; Edward T Auer
Journal:  Percept Psychophys       Date:  2002-05

3.  DMDX: a windows display program with millisecond accuracy.

Authors:  Kenneth I Forster; Jonathan C Forster
Journal:  Behav Res Methods Instrum Comput       Date:  2003-02

4.  Polysensory interactions along lateral temporal regions evoked by audiovisual speech.

Authors:  Tarra M Wright; Kevin A Pelphrey; Truett Allison; Martin J McKeown; Gregory McCarthy
Journal:  Cereb Cortex       Date:  2003-10       Impact factor: 5.357

5.  Audio-visual interactions with intact clearly audible speech.

Authors:  Chris Davis; Jeesun Kim
Journal:  Q J Exp Psychol A       Date:  2004-08

6. (Review) Merging the senses into a robust percept.

Authors:  Marc O Ernst; Heinrich H Bülthoff
Journal:  Trends Cogn Sci       Date:  2004-04       Impact factor: 20.229

7.  The perception of second language sounds in early bilinguals: new evidence from an implicit measure.

Authors:  Jordi Navarra; Núria Sebastián-Gallés; Salvador Soto-Faraco
Journal:  J Exp Psychol Hum Percept Perform       Date:  2005-10       Impact factor: 3.332

8.  Visual speech speeds up the neural processing of auditory speech.

Authors:  Virginie van Wassenhove; Ken W Grant; David Poeppel
Journal:  Proc Natl Acad Sci U S A       Date:  2005-01-12       Impact factor: 11.205

9.  Feature integration across perception and action: event files affect response choice.

Authors:  Bernhard Hommel
Journal:  Psychol Res       Date:  2005-12-08

10.  The bimodal perception of speech in infancy.

Authors:  P K Kuhl; A N Meltzoff
Journal:  Science       Date:  1982-12-10       Impact factor: 47.728

Cited by: 27 in total

1.  Bimodal bilinguals co-activate both languages during spoken comprehension.

Authors:  Anthony Shook; Viorica Marian
Journal:  Cognition       Date:  2012-07-07

2.  Cross-language perception of Cantonese vowels spoken by native and non-native speakers.

Authors:  Connie K So; Virginie Attina
Journal:  J Psycholinguist Res       Date:  2014-10

3.  Language identification from visual-only speech signals.

Authors:  Rebecca E Ronquest; Susannah V Levi; David B Pisoni
Journal:  Atten Percept Psychophys       Date:  2010-08       Impact factor: 2.199

4.  Electrophysiological evidence for a self-processing advantage during audiovisual speech integration.

Authors:  Avril Treille; Coriandre Vilain; Sonia Kandel; Marc Sato
Journal:  Exp Brain Res       Date:  2017-07-04       Impact factor: 1.972

5.  Psychobiological Responses Reveal Audiovisual Noise Differentially Challenges Speech Recognition.

Authors:  Gavin M Bidelman; Bonnie Brown; Kelsey Mankel; Caitlin Nelms Price
Journal:  Ear Hear       Date:  2020 Mar/Apr       Impact factor: 3.570

6.  Processing of audiovisually congruent and incongruent speech in school-age children with a history of specific language impairment: a behavioral and event-related potentials study.

Authors:  Natalya Kaganovich; Jennifer Schumaker; Danielle Macias; Dana Gustafson
Journal:  Dev Sci       Date:  2014-11-29

7.  Cross-modal prediction in speech depends on prior linguistic experience.

Authors:  Carolina Sánchez-García; James T Enns; Salvador Soto-Faraco
Journal:  Exp Brain Res       Date:  2013-02-06       Impact factor: 1.972

8.  Bilingualism modulates infants' selective attention to the mouth of a talking face.

Authors:  Ferran Pons; Laura Bosch; David J Lewkowicz
Journal:  Psychol Sci       Date:  2015-03-12

9.  Shared and modality-specific brain regions that mediate auditory and visual word comprehension.

Authors:  Anne Keitel; Joachim Gross; Christoph Kayser
Journal:  Elife       Date:  2020-08-24       Impact factor: 8.140

10.  The Bilingual Language Interaction Network for Comprehension of Speech.

Authors:  Anthony Shook; Viorica Marian
Journal:  Biling (Camb Engl)       Date:  2013-04-01
