
Audiovisual Speech Perception in Children with Autism Spectrum Disorders: Evidence from Visual Phonemic Restoration.

Julia Irwin, Trey Avery, Daniel Kleinman, Nicole Landi.

Abstract

Children with autism spectrum disorders (ASD) have been reported to be less influenced by a speaker's face during speech perception than typically developing children. To examine these reported differences more closely, a novel visual phonemic restoration paradigm was used to assess neural signatures (event-related potentials [ERPs]) of audiovisual processing in typically developing children and in children with ASD. Video of a speaker saying the syllable /ba/ was paired with (1) a synthesized /ba/ or (2) a synthesized syllable derived from /ba/ in which the auditory cues for the consonant were substantially weakened, so that it sounded more like /a/. The auditory stimuli are easily discriminable; however, in the context of a visual /ba/, the auditory /a/ is typically perceived as /ba/, producing visual phonemic restoration. Only children with ASD showed a large /ba/-/a/ discrimination response in the presence of a speaker producing /ba/, suggesting a reduced influence of visual speech.
© 2021. The Author(s), under exclusive licence to Springer Science+Business Media, LLC part of Springer Nature.

Keywords:  Audiovisual; Autism; Phonemic restoration; Speech

Year:  2021        PMID: 33630253     DOI: 10.1007/s10803-021-04916-x

Source DB:  PubMed          Journal:  J Autism Dev Disord        ISSN: 0162-3257


