
Speech segmentation is facilitated by visual cues.

Toni Cunillera, Estela Camara, Matti Laine, Antoni Rodriguez-Fornells.

Abstract

Evidence from infant studies indicates that language learning can be facilitated by multimodal cues. We extended this observation to adult language learning by studying the effects of simultaneous visual cues (nonassociated object images) on speech segmentation performance. Our results indicate that segmentation of new words from a continuous speech stream is facilitated by simultaneous visual input that is presented at or near syllables exhibiting the low transitional probability indicative of word boundaries. This indicates that temporal audio-visual contiguity helps direct attention to word boundaries at the earliest stages of language learning. Off-boundary or arrhythmic picture sequences did not affect segmentation performance, suggesting that the language learning system can effectively disregard noninformative visual information. Detection of temporal contiguity between multimodal stimuli may be useful to both infants and second-language learners, not only for facilitating speech segmentation but also for detecting word-object relationships in natural environments.


Year:  2009        PMID: 19526435     DOI: 10.1080/17470210902888809

Source DB:  PubMed          Journal:  Q J Exp Psychol (Hove)        ISSN: 1747-0218            Impact factor:   2.143


Related articles: 19 in total

1.  The role of cross-modal associations in statistical learning.

Authors:  Arit Glicksohn; Asher Cohen
Journal:  Psychon Bull Rev       Date:  2013-12

2.  Isolated words enhance statistical language learning in infancy.

Authors:  Casey Lew-Williams; Bruna Pelucchi; Jenny R Saffran
Journal:  Dev Sci       Date:  2011-08-02

3.  Learning across senses: cross-modal effects in multisensory statistical learning.

Authors:  Aaron D Mitchel; Daniel J Weiss
Journal:  J Exp Psychol Learn Mem Cogn       Date:  2011-09       Impact factor: 3.051

4.  Word segmentation from noise-band vocoded speech.

Authors:  Tina M Grieco-Calub; Katherine M Simeon; Hillary E Snyder; Casey Lew-Williams
Journal:  Lang Cogn Neurosci       Date:  2017-07-20       Impact factor: 2.331

5.  Discovering functional units in continuous speech.

Authors:  Sung-Joo Lim; Francisco Lacerda; Lori L Holt
Journal:  J Exp Psychol Hum Percept Perform       Date:  2015-05-25       Impact factor: 3.332

6.  [Review] Multisensory Integration in Cochlear Implant Recipients.

Authors:  Ryan A Stevenson; Sterling W Sheffield; Iliza M Butera; René H Gifford; Mark T Wallace
Journal:  Ear Hear       Date:  2017 Sep/Oct       Impact factor: 3.570

7.  Semantic and phonological schema influence spoken word learning and overnight consolidation.

Authors:  Viktória Havas; Jsh Taylor; Lucía Vaquero; Ruth de Diego-Balaguer; Antoni Rodríguez-Fornells; Matthew H Davis
Journal:  Q J Exp Psychol (Hove)       Date:  2018-01-19       Impact factor: 2.143

8.  Visual speech segmentation: using facial cues to locate word boundaries in continuous speech.

Authors:  Aaron D Mitchel; Daniel J Weiss
Journal:  Lang Cogn Process       Date:  2014

9.  Statistical speech segmentation and word learning in parallel: scaffolding from child-directed speech.

Authors:  Daniel Yurovsky; Chen Yu; Linda B Smith
Journal:  Front Psychol       Date:  2012-10-01

10.  Multimodal integration in statistical learning: evidence from the McGurk illusion.

Authors:  Aaron D Mitchel; Morten H Christiansen; Daniel J Weiss
Journal:  Front Psychol       Date:  2014-05-16
