
Visual speech contributes to phonetic learning in 6-month-old infants.

Tuomas Teinonen, Richard N Aslin, Paavo Alku, Gergely Csibra.

Abstract

Previous research has shown that infants match vowel sounds to facial displays of vowel articulation [Kuhl, P. K., & Meltzoff, A. N. (1982). The bimodal perception of speech in infancy. Science, 218, 1138-1141; Patterson, M. L., & Werker, J. F. (1999). Matching phonetic information in lips and voice is robust in 4.5-month-old infants. Infant Behaviour & Development, 22, 237-247], and integrate seen and heard speech sounds [Rosenblum, L. D., Schmuckler, M. A., & Johnson, J. A. (1997). The McGurk effect in infants. Perception & Psychophysics, 59, 347-357; Burnham, D., & Dodd, B. (2004). Auditory-visual speech integration by prelinguistic infants: Perception of an emergent consonant in the McGurk effect. Developmental Psychobiology, 45, 204-220]. However, the role of visual speech in language development remains unknown. Our aim was to determine whether seen articulations enhance phoneme discrimination, thereby playing a role in phonetic category learning. We exposed 6-month-old infants to speech sounds from a restricted range of a continuum between /ba/ and /da/, following a unimodal frequency distribution. Synchronously with these speech sounds, one group of infants (the two-category group) saw a visual articulation of a canonical /ba/ or /da/, with the two alternative visual articulations, /ba/ and /da/, being presented according to whether the auditory token was on the /ba/ or /da/ side of the midpoint of the continuum. Infants in a second (one-category) group were presented with the same unimodal distribution of speech sounds, but every token for any particular infant was always paired with the same syllable, either a visual /ba/ or a visual /da/. A stimulus-alternation preference procedure following the exposure revealed that infants in the former, and not in the latter, group discriminated the /ba/-/da/ contrast. 
These results not only show that visual information about speech articulation enhances phoneme discrimination, but also that it may contribute to the learning of phoneme boundaries in infancy.

Year:  2008        PMID: 18590910     DOI: 10.1016/j.cognition.2008.05.009

Source DB:  PubMed          Journal:  Cognition        ISSN: 0010-0277


Related articles: 68 in total

1.  Developmental Shifts in Detection and Attention for Auditory, Visual, and Audiovisual Speech.

Authors:  Susan Jerger; Markus F Damian; Cassandra Karl; Hervé Abdi
Journal:  J Speech Lang Hear Res       Date:  2018-12-10       Impact factor: 2.297

2.  Some behavioral and neurobiological constraints on theories of audiovisual speech integration: a review and suggestions for new directions. (Review)

Authors:  Nicholas Altieri; David B Pisoni; James T Townsend
Journal:  Seeing Perceiving       Date:  2011-09-29

3.  Phonological Priming in Children with Hearing Loss: Effect of Speech Mode, Fidelity, and Lexical Status.

Authors:  Susan Jerger; Nancy Tye-Murray; Markus F Damian; Hervé Abdi
Journal:  Ear Hear       Date:  2016 Nov/Dec       Impact factor: 3.570

4.  Children with a history of SLI show reduced sensitivity to audiovisual temporal asynchrony: an ERP study.

Authors:  Natalya Kaganovich; Jennifer Schumaker; Laurence B Leonard; Dana Gustafson; Danielle Macias
Journal:  J Speech Lang Hear Res       Date:  2014-08       Impact factor: 2.297

5.  Sensitivity to Audiovisual Temporal Asynchrony in Children With a History of Specific Language Impairment and Their Peers With Typical Development: A Replication and Follow-Up Study.

Authors:  Natalya Kaganovich
Journal:  J Speech Lang Hear Res       Date:  2017-08-16       Impact factor: 2.297

6.  Selective attention to the mouth is associated with expressive language skills in monolingual and bilingual infants.

Authors:  Tawny Tsang; Natsuki Atagi; Scott P Johnson
Journal:  J Exp Child Psychol       Date:  2018-05

7.  Visual speech fills in both discrimination and identification of non-intact auditory speech in children.

Authors:  Susan Jerger; Markus F Damian; Rachel P McAlpine; Hervé Abdi
Journal:  J Child Lang       Date:  2017-07-20

8.  Infants' Use of Temporal and Phonetic Information in the Encoding of Audiovisual Speech.

Authors:  D Kyle Danielson; Cassie Tam; Padmapriya Kandhadai; Janet F Werker
Journal:  Can Acoust       Date:  2016-09

9.  Bilingualism modulates infants' selective attention to the mouth of a talking face.

Authors:  Ferran Pons; Laura Bosch; David J Lewkowicz
Journal:  Psychol Sci       Date:  2015-03-12

10.  Neural development of networks for audiovisual speech comprehension.

Authors:  Anthony Steven Dick; Ana Solodkin; Steven L Small
Journal:  Brain Lang       Date:  2009-09-24       Impact factor: 2.381

