
Infants deploy selective attention to the mouth of a talking face when learning speech.

David J Lewkowicz, Amy M Hansen-Tift

Abstract

The mechanisms underlying the acquisition of speech-production ability in human infancy are not well understood. We tracked 4-12-mo-old English-learning infants' and adults' eye gaze while they watched and listened to a female reciting a monologue either in their native (English) or nonnative (Spanish) language. We found that infants shifted their attention from the eyes to the mouth between 4 and 8 mo of age regardless of language and then began a shift back to the eyes at 12 mo in response to native but not nonnative speech. We posit that the first shift enables infants to gain access to redundant audiovisual speech cues that enable them to learn their native speech forms and that the second shift reflects growing native-language expertise that frees them to shift attention to the eyes to gain access to social cues. On this account, 12-mo-old infants do not shift attention to the eyes when exposed to nonnative speech because increasing native-language expertise and perceptual narrowing make it more difficult to process nonnative speech and require them to continue to access redundant audiovisual cues. Overall, the current findings demonstrate that the development of speech production capacity relies on changes in selective audiovisual attention and that this depends critically on early experience.


Year:  2012        PMID: 22307596      PMCID: PMC3277111          DOI: 10.1073/pnas.1114783109

Source DB:  PubMed          Journal:  Proc Natl Acad Sci U S A        ISSN: 0027-8424            Impact factor:   11.205


References:  40 in total

1.  Is speech learning 'gated' by the social brain?

Authors:  Patricia K Kuhl
Journal:  Dev Sci       Date:  2007-01

2.  The pupil as a measure of emotional arousal and autonomic activation.

Authors:  Margaret M Bradley; Laura Miccoli; Miguel A Escrig; Peter J Lang
Journal:  Psychophysiology       Date:  2008-02-11       Impact factor: 4.016

3.  Point-light facial displays enhance comprehension of speech in noise.

Authors:  L D Rosenblum; J A Johnson; H M Saldaña
Journal:  J Speech Hear Res       Date:  1996-12

4.  Use of visual information for phonetic perception.

Authors:  Q Summerfield
Journal:  Phonetica       Date:  1979       Impact factor: 1.759

5.  Discrimination of temporal synchrony in intermodal events by children with autism and children with developmental disabilities without autism.

Authors:  James M Bebko; Jonathan A Weiss; Jenny L Demark; Pamela Gomez
Journal:  J Child Psychol Psychiatry       Date:  2006-01       Impact factor: 8.982

6.  Early visual deprivation impairs multisensory interactions in humans.

Authors:  Lisa Putzar; Ines Goerendt; Kathrin Lange; Frank Rösler; Brigitte Röder
Journal:  Nat Neurosci       Date:  2007-09-16       Impact factor: 24.884

7.  Infant perception of audio-visual speech synchrony.

Authors:  David J Lewkowicz
Journal:  Dev Psychol       Date:  2010-01

8.  The natural statistics of audiovisual speech.

Authors:  Chandramouli Chandrasekaran; Andrea Trubanova; Sébastien Stillittano; Alice Caplier; Asif A Ghazanfar
Journal:  PLoS Comput Biol       Date:  2009-07-17       Impact factor: 4.475

9.  The bimodal perception of speech in infancy.

Authors:  P K Kuhl; A N Meltzoff
Journal:  Science       Date:  1982-12-10       Impact factor: 47.728

10.  Social feedback to infants' babbling facilitates rapid phonological learning.

Authors:  Michael H Goldstein; Jennifer A Schwade
Journal:  Psychol Sci       Date:  2008-05
Cited by:  141 in total

1.  Phonological Priming in Children with Hearing Loss: Effect of Speech Mode, Fidelity, and Lexical Status.

Authors:  Susan Jerger; Nancy Tye-Murray; Markus F Damian; Hervé Abdi
Journal:  Ear Hear       Date:  2016 Nov/Dec       Impact factor: 3.570

2.  Children with a history of SLI show reduced sensitivity to audiovisual temporal asynchrony: an ERP study.

Authors:  Natalya Kaganovich; Jennifer Schumaker; Laurence B Leonard; Dana Gustafson; Danielle Macias
Journal:  J Speech Lang Hear Res       Date:  2014-08       Impact factor: 2.297

3.  The redeployment of attention to the mouth of a talking face during the second year of life.

Authors:  Anne Hillairet de Boisferon; Amy H Tift; Nicholas J Minar; David J Lewkowicz
Journal:  J Exp Child Psychol       Date:  2018-04-05

4.  Selective attention to the mouth is associated with expressive language skills in monolingual and bilingual infants.

Authors:  Tawny Tsang; Natsuki Atagi; Scott P Johnson
Journal:  J Exp Child Psychol       Date:  2018-05

5.  Visual speech fills in both discrimination and identification of non-intact auditory speech in children.

Authors:  Susan Jerger; Markus F Damian; Rachel P McAlpine; Hervé Abdi
Journal:  J Child Lang       Date:  2017-07-20

6.  The impact of bilingual environments on selective attention in infancy.

Authors:  Kyle J Comishen; Ellen Bialystok; Scott A Adler
Journal:  Dev Sci       Date:  2019-01-30

7.  Cross-modal prediction in speech depends on prior linguistic experience.

Authors:  Carolina Sánchez-García; James T Enns; Salvador Soto-Faraco
Journal:  Exp Brain Res       Date:  2013-02-06       Impact factor: 1.972

8.  Early experience and multisensory perceptual narrowing. (Review)

Authors:  David J Lewkowicz
Journal:  Dev Psychobiol       Date:  2014-01-16       Impact factor: 3.038

9.  Bilingualism modulates infants' selective attention to the mouth of a talking face.

Authors:  Ferran Pons; Laura Bosch; David J Lewkowicz
Journal:  Psychol Sci       Date:  2015-03-12

10.  The Multisensory Attention Assessment Protocol (MAAP): Characterizing individual differences in multisensory attention skills in infants and children and relations with language and cognition.

Authors:  Lorraine E Bahrick; James Torrence Todd; Kasey C Soska
Journal:  Dev Psychol       Date:  2018-10-25
