
Multisensory integration enhances phonemic restoration.

Antoine J Shahin, Lee M Miller.

Abstract

Phonemic restoration occurs when speech is perceived as continuous through noisy interruptions, even when the speech signal is artificially removed from the interrupted epochs. This temporal filling-in illusion helps maintain robust comprehension in adverse environments and illustrates how contextual knowledge conveyed through the auditory modality (e.g., lexical knowledge) can improve perception. This study investigated how one important form of context, visual speech, affects phonemic restoration. The hypothesis was that audio-visual integration of speech should improve phonemic restoration, allowing the perceived continuity to span longer temporal gaps. Subjects listened to tri-syllabic words with a portion of each word replaced by white noise while watching lip movements that were either congruent, temporally reversed (incongruent), or static. For each word, subjects judged whether the utterance sounded continuous or interrupted, where a "continuous" response indicated an illusory percept. Illusory filling-in of longer white noise durations (longer missing segments) occurred when the mouth movements were congruent with the spoken word compared to the other conditions, with no difference between the static and incongruent conditions. Thus, phonemic restoration is enhanced when contextual knowledge is applied through multisensory integration.


Year:  2009        PMID: 19275331      PMCID: PMC2663900          DOI: 10.1121/1.3075576

Source DB:  PubMed          Journal:  J Acoust Soc Am        ISSN: 0001-4966            Impact factor:   1.840


References:  36 in total

1.  Visual influences on the internal structure of phonetic categories.

Authors:  Lawrence Brancazio; Joanne L Miller; Matthew A Paré
Journal:  Percept Psychophys       Date:  2003-05

2.  Hearing lips and seeing voices.

Authors:  H McGurk; J MacDonald
Journal:  Nature       Date:  1976-12-23       Impact factor: 49.962

3.  The role of visual speech cues in reducing energetic and informational masking.

Authors:  Karen S Helfer; Richard L Freyman
Journal:  J Acoust Soc Am       Date:  2005-02       Impact factor: 1.840

4.  Increasing the intelligibility of speech through multiple phonemic restorations.

Authors:  J A Bashford; K R Riener; R M Warren
Journal:  Percept Psychophys       Date:  1992-03

5.  Visual speech speeds up the neural processing of auditory speech.

Authors:  Virginie van Wassenhove; Ken W Grant; David Poeppel
Journal:  Proc Natl Acad Sci U S A       Date:  2005-01-12       Impact factor: 11.205

6.  Encoding of illusory continuity in primary auditory cortex.

Authors:  Christopher I Petkov; Kevin N O'Connor; Mitchell L Sutter
Journal:  Neuron       Date:  2007-04-05       Impact factor: 17.173

7.  Use of speech-modulated noise adds strong "bottom-up" cues for phonemic restoration.

Authors:  J A Bashford; R M Warren; C A Brown
Journal:  Percept Psychophys       Date:  1996-04

8.  Perceptual restoration of missing speech sounds.

Authors:  R M Warren
Journal:  Science       Date:  1970-01-23       Impact factor: 47.728

9.  Optimization of experimental design in fMRI: a general framework using a genetic algorithm.

Authors:  Tor D Wager; Thomas E Nichols
Journal:  Neuroimage       Date:  2003-02       Impact factor: 6.556

10.  Occipital transcranial magnetic stimulation has opposing effects on visual and auditory stimulus detection: implications for multisensory interactions.

Authors:  Vincenzo Romei; Micah M Murray; Lotfi B Merabet; Gregor Thut
Journal:  J Neurosci       Date:  2007-10-24       Impact factor: 6.167

Cited by:  9 in total

1.  Phonemic restoration effect reversed in a reverberant room.

Authors:  Nirmal Kumar Srinivasan; Pavel Zahorik
Journal:  J Acoust Soc Am       Date:  2012-01       Impact factor: 1.840

2.  Bimodal bilinguals co-activate both languages during spoken comprehension.

Authors:  Anthony Shook; Viorica Marian
Journal:  Cognition       Date:  2012-07-07

3.  Dynamic changes in superior temporal sulcus connectivity during perception of noisy audiovisual speech.

Authors:  Audrey R Nath; Michael S Beauchamp
Journal:  J Neurosci       Date:  2011-02-02       Impact factor: 6.167

4.  Processing Complex Sounds Passing through the Rostral Brainstem: The New Early Filter Model.

Authors:  John E Marsh; Tom A Campbell
Journal:  Front Neurosci       Date:  2016-05-10       Impact factor: 4.677

5.  Neural restoration of degraded audiovisual speech.

Authors:  Antoine J Shahin; Jess R Kerlin; Jyoti Bhat; Lee M Miller
Journal:  Neuroimage       Date:  2011-12-10       Impact factor: 6.556

6.  Socially meaningful visual context either enhances or inhibits vocalisation processing in the macaque brain.

Authors:  Mathilda Froesel; Maëva Gacoin; Simon Clavagnier; Marc Hauser; Quentin Goudard; Suliann Ben Hamed
Journal:  Nat Commun       Date:  2022-08-19       Impact factor: 17.694

7.  Children use visual speech to compensate for non-intact auditory speech.

Authors:  Susan Jerger; Markus F Damian; Nancy Tye-Murray; Hervé Abdi
Journal:  J Exp Child Psychol       Date:  2014-07-04

8.  Speech cues contribute to audiovisual spatial integration.

Authors:  Christopher W Bishop; Lee M Miller
Journal:  PLoS One       Date:  2011-08-31       Impact factor: 3.240

9.  Failing to get the gist of what's being said: background noise impairs higher-order cognitive processing.

Authors:  John E Marsh; Robert Ljung; Anatole Nöstl; Emma Threadgold; Tom A Campbell
Journal:  Front Psychol       Date:  2015-05-21
