
Visual speech segmentation: using facial cues to locate word boundaries in continuous speech.

Aaron D Mitchel, Daniel J Weiss.

Abstract

Speech is typically a multimodal phenomenon, yet few studies have focused on the exclusive contributions of visual cues to language acquisition. To address this gap, we investigated whether visual prosodic information can facilitate speech segmentation. Previous research has demonstrated that language learners can use lexical stress and pitch cues to segment speech and that learners can extract this information from talking faces. Thus, we created an artificial speech stream that contained minimal segmentation cues and paired it with two synchronous facial displays in which visual prosody was either informative or uninformative for identifying word boundaries. Across three familiarisation conditions (audio stream alone, facial streams alone, and paired audiovisual), learning occurred only when the facial displays were informative about word boundaries, suggesting that facial cues can help learners solve the early challenges of language acquisition.

Keywords:  audiovisual speech; language acquisition; multisensory integration; speech segmentation; visual prosody

Year:  2014        PMID: 25018577      PMCID: PMC4091796          DOI: 10.1080/01690965.2013.791703

Source DB:  PubMed          Journal:  Lang Cogn Process        ISSN: 0169-0965


References:  29 in total

1.  Using visible speech to train perception and production of speech for individuals with hearing loss.

Authors:  Dominic W Massaro; Joanna Light
Journal:  J Speech Lang Hear Res       Date:  2004-04       Impact factor: 2.297

2.  Hearing lips and seeing voices.

Authors:  H McGurk; J MacDonald
Journal:  Nature       Date:  1976 Dec 23-30       Impact factor: 49.962

3.  Learning across senses: cross-modal effects in multisensory statistical learning.

Authors:  Aaron D Mitchel; Daniel J Weiss
Journal:  J Exp Psychol Learn Mem Cogn       Date:  2011-09       Impact factor: 3.051

4.  [Review] CONSPEC and CONLERN: a two-process theory of infant face recognition.

Authors:  J Morton; M H Johnson
Journal:  Psychol Rev       Date:  1991-04       Impact factor: 8.934

5.  Infants' use of synchronized visual information to separate streams of speech.

Authors:  George Hollich; Rochelle S Newman; Peter W Jusczyk
Journal:  Child Dev       Date:  2005 May-Jun

6.  Lip movement exaggerations during infant-directed speech.

Authors:  Jordan R Green; Ignatius S B Nip; Erin M Wilson; Antje S Mefferd; Yana Yunusova
Journal:  J Speech Lang Hear Res       Date:  2010-08-10       Impact factor: 2.297

7.  Infant perception of audio-visual speech synchrony.

Authors:  David J Lewkowicz
Journal:  Dev Psychol       Date:  2010-01

8.  The bimodal perception of speech in infancy.

Authors:  P K Kuhl; A N Meltzoff
Journal:  Science       Date:  1982-12-10       Impact factor: 47.728

9.  Speech perception skills of deaf infants following cochlear implantation: a first report.

Authors:  Derek M Houston; David B Pisoni; Karen Iler Kirk; Elizabeth A Ying; Richard T Miyamoto
Journal:  Int J Pediatr Otorhinolaryngol       Date:  2003-05       Impact factor: 1.675

10.  Phonotactic knowledge of word boundaries and its use in infant speech perception.

Authors:  A D Friederici; J M Wessels
Journal:  Percept Psychophys       Date:  1993-09
Cited by:  10 in total

1.  Audiovisual perceptual learning with multiple speakers.

Authors:  Aaron D Mitchel; Chip Gerfen; Daniel J Weiss
Journal:  J Phon       Date:  2016-03-14

2.  Desirable Difficulties in Language Learning? How Talker Variability Impacts Artificial Grammar Learning.

Authors:  Federica Bulgarelli; Daniel J Weiss
Journal:  Lang Learn       Date:  2021-07-10

3.  Early Word Segmentation Behind the Mask.

Authors:  Sónia Frota; Jovana Pejovic; Marisa Cruz; Cátia Severino; Marina Vigário
Journal:  Front Psychol       Date:  2022-05-09

4.  Multimodal integration in statistical learning: evidence from the McGurk illusion.

Authors:  Aaron D Mitchel; Morten H Christiansen; Daniel J Weiss
Journal:  Front Psychol       Date:  2014-05-16

5.  Differential Gaze Patterns on Eyes and Mouth During Audiovisual Speech Segmentation.

Authors:  Laina G Lusk; Aaron D Mitchel
Journal:  Front Psychol       Date:  2016-02-02

6.  Finding Phrases: The Interplay of Word Frequency, Phrasal Prosody and Co-speech Visual Information in Chunking Speech by Monolingual and Bilingual Adults.

Authors:  Irene de la Cruz-Pavía; Janet F Werker; Eric Vatikiotis-Bateson; Judit Gervain
Journal:  Lang Speech       Date:  2019-04-19       Impact factor: 1.500

7.  Speechreading in hearing children can be improved by training.

Authors:  Elizabeth Buchanan-Worster; Charles Hulme; Rachel Dennan; Mairéad MacSweeney
Journal:  Dev Sci       Date:  2021-06-01

8.  Synchronization by the hand: the sight of gestures modulates low-frequency activity in brain responses to continuous speech.

Authors:  Emmanuel Biau; Salvador Soto-Faraco
Journal:  Front Hum Neurosci       Date:  2015-09-24       Impact factor: 3.169

9.  Eye Movements During Visual Speech Perception in Deaf and Hearing Children.

Authors:  Elizabeth Worster; Hannah Pimperton; Amelia Ralph-Lewis; Laura Monroy; Charles Hulme; Mairéad MacSweeney
Journal:  Lang Learn       Date:  2017-09-26

10.  Finding phrases: On the role of co-verbal facial information in learning word order in infancy.

Authors:  Irene de la Cruz-Pavía; Judit Gervain; Eric Vatikiotis-Bateson; Janet F Werker
Journal:  PLoS One       Date:  2019-11-11       Impact factor: 3.240


Beijing Coyote Bioscience Co., Ltd. © 2022-2023.