Integration efficiency for speech perception within and across sensory modalities by normal-hearing and hearing-impaired individuals.

Ken W Grant, Jennifer B Tufts, Steven Greenberg.

Abstract

In face-to-face speech communication, the listener extracts and integrates information from the acoustic and optic speech signals. Integration occurs within the auditory modality (i.e., across the acoustic frequency spectrum) and across sensory modalities (i.e., across the acoustic and optic signals). The difficulties experienced by some hearing-impaired listeners in understanding speech could be attributed to losses in the extraction of speech information, the integration of speech cues, or both. The present study evaluated the ability of normal-hearing and hearing-impaired listeners to integrate speech information within and across sensory modalities in order to determine the degree to which integration efficiency may be a factor in the performance of hearing-impaired listeners. Auditory-visual nonsense syllables consisting of eighteen medial consonants surrounded by the vowel [a] were processed into four nonoverlapping acoustic filter bands between 300 and 6000 Hz. A variety of one-, two-, three-, and four-filter-band combinations were presented for identification in auditory-only and auditory-visual conditions; a visual-only condition was also included. Integration efficiency was evaluated using a model of optimal integration. Results showed that normal-hearing and hearing-impaired listeners integrated information across the auditory and visual sensory modalities with a high degree of efficiency, independent of differences in auditory capabilities. However, across-frequency integration for auditory-only input was less efficient for hearing-impaired listeners. These individuals exhibited particular difficulty extracting information from the highest frequency band (4762-6000 Hz) when speech information was presented concurrently in the next lower-frequency band (1890-2381 Hz). Results suggest that integration of speech information within the auditory modality, but not across auditory and visual modalities, affects speech understanding in hearing-impaired listeners.
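The abstract's notion of "integration efficiency" compares observed recognition scores against a prediction from an optimal-integration model. As an illustration only (the study itself used a more elaborate optimal-integration model, not this one), a common simplified stand-in is probability summation under the assumption of independent error sources in the two channels; the function names and numbers below are hypothetical:

```python
def predicted_combined(p_a: float, p_v: float) -> float:
    """Predicted combined-channel proportion correct, assuming the two
    channels (e.g., auditory and visual) make independent errors.
    This probability-summation rule is an illustrative simplification,
    not the model used in the study."""
    return 1.0 - (1.0 - p_a) * (1.0 - p_v)

def integration_efficiency(observed_combined: float,
                           p_a: float, p_v: float) -> float:
    """Ratio of the observed combined score to the optimal prediction;
    values near 1.0 indicate highly efficient integration."""
    return observed_combined / predicted_combined(p_a, p_v)

# Hypothetical scores: 60% auditory-only, 50% visual-only,
# 76% observed auditory-visual.
print(predicted_combined(0.6, 0.5))          # prints 0.8
print(integration_efficiency(0.76, 0.6, 0.5))  # prints 0.95
```

Under this sketch, the study's finding would correspond to efficiency ratios near 1.0 for auditory-visual combinations in both listener groups, but lower ratios for across-frequency (auditory-only) band combinations in the hearing-impaired group.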


Year:  2007        PMID: 17348537     DOI: 10.1121/1.2405859

Source DB:  PubMed          Journal:  J Acoust Soc Am        ISSN: 0001-4966            Impact factor:   1.840


11 in total

Review 1.  Some behavioral and neurobiological constraints on theories of audiovisual speech integration: a review and suggestions for new directions.

Authors:  Nicholas Altieri; David B Pisoni; James T Townsend
Journal:  Seeing Perceiving       Date:  2011-09-29

2.  An algorithm to improve speech recognition in noise for hearing-impaired listeners.

Authors:  Eric W Healy; Sarah E Yoho; Yuxuan Wang; DeLiang Wang
Journal:  J Acoust Soc Am       Date:  2013-10       Impact factor: 1.840

Review 3.  Sensory-Cognitive Interactions in Older Adults.

Authors:  Larry E Humes; Levi A Young
Journal:  Ear Hear       Date:  2016 Jul-Aug       Impact factor: 3.570

4.  Influence of broad auditory tuning on across-frequency integration of speech patterns.

Authors:  Eric W Healy; Kimberly A Carson
Journal:  J Speech Lang Hear Res       Date:  2010-08-05       Impact factor: 2.297

5.  Spectral integration of English speech for non-native English speakers.

Authors:  Lauren Calandruccio; Emily Buss
Journal:  J Acoust Soc Am       Date:  2017-09       Impact factor: 1.840

6.  Cross-frequency integration for consonant and vowel identification in bimodal hearing.

Authors:  Ying-Yee Kong; Louis D Braida
Journal:  J Speech Lang Hear Res       Date:  2010-11-08       Impact factor: 2.297

7.  Spectral integration of speech bands in normal-hearing and hearing-impaired listeners.

Authors:  Joseph W Hall; Emily Buss; John H Grose
Journal:  J Acoust Soc Am       Date:  2008-08       Impact factor: 1.840

8.  The contribution of visual information to the perception of speech in noise with and without informative temporal fine structure.

Authors:  Paula C Stacey; Pádraig T Kitterick; Saffron D Morris; Christian J Sumner
Journal:  Hear Res       Date:  2016-04-13       Impact factor: 3.208

9.  Seeing and hearing a word: combining eye and ear is more efficient than combining the parts of a word.

Authors:  Matthieu Dubois; David Poeppel; Denis G Pelli
Journal:  PLoS One       Date:  2013-05-29       Impact factor: 3.240

10.  The Auditory-Visual Speech Benefit on Working Memory in Older Adults with Hearing Impairment.

Authors:  Jana B Frtusova; Natalie A Phillips
Journal:  Front Psychol       Date:  2016-04-12
