
Speech perception using combinations of auditory, visual, and tactile information.

P J Blamey, R S Cowan, J I Alcantara, L A Whitford, G M Clark.

Abstract

Four normally hearing subjects were trained and tested with all combinations of a highly degraded auditory input, a visual input via lipreading, and a tactile input using a multichannel electrotactile speech processor. The speech perception of the subjects was assessed with closed sets of vowels, consonants, and multisyllabic words; with open sets of words and sentences; and with speech tracking. When the visual input was added to any combination of other inputs, a significant improvement occurred for every test. Similarly, the auditory input produced a significant improvement for all tests except closed-set vowel recognition. The tactile input produced scores that were significantly greater than chance in isolation, but combined less effectively with the other modalities. The addition of the tactile input did produce significant improvements for vowel recognition in the auditory-tactile condition, for consonant recognition in the auditory-tactile and visual-tactile conditions, and for open-set word recognition in the visual-tactile condition. Information transmission analysis of the features of vowels and consonants indicated that the information from the auditory and visual inputs was integrated much more effectively than information from the tactile input. The less effective combination might be due to lack of training with the tactile input, or to more fundamental limitations in the processing of multimodal stimuli.

Year:  1989        PMID: 2521904

Source DB:  PubMed          Journal:  J Rehabil Res Dev        ISSN: 0748-7711


Related articles (14 in total)

1.  Intra- versus intermodal integration in young and older adults.

Authors:  Brent P Spehar; Nancy Tye-Murray; Mitchell S Sommers
Journal:  J Acoust Soc Am       Date:  2008-05       Impact factor: 1.840

2.  Tactile enhancement of auditory and visual speech perception in untrained perceivers.

Authors:  Bryan Gick; Kristín M Jóhannsdóttir; Diana Gibraiel; Jeff Mühlbauer
Journal:  J Acoust Soc Am       Date:  2008-04       Impact factor: 1.840

3.  Eye Can Hear Clearly Now: Inverse Effectiveness in Natural Audiovisual Speech Processing Relies on Long-Term Crossmodal Temporal Integration.

Authors:  Michael J Crosse; Giovanni M Di Liberto; Edmund C Lalor
Journal:  J Neurosci       Date:  2016-09-21       Impact factor: 6.167

4.  Prediction and constraint in audiovisual speech perception. (Review)

Authors:  Jonathan E Peelle; Mitchell S Sommers
Journal:  Cortex       Date:  2015-03-20       Impact factor: 4.027

5.  Age-related differences in inhibitory control predict audiovisual speech perception.

Authors:  Avanti Dey; Mitchell S Sommers
Journal:  Psychol Aging       Date:  2015-06-29

6.  Auditory and visual lexical neighborhoods in audiovisual speech perception.

Authors:  Nancy Tye-Murray; Mitchell Sommers; Brent Spehar
Journal:  Trends Amplif       Date:  2007-12

7.  Lipreading and audiovisual speech recognition across the adult lifespan: Implications for audiovisual integration.

Authors:  Nancy Tye-Murray; Brent Spehar; Joel Myerson; Sandra Hale; Mitchell Sommers
Journal:  Psychol Aging       Date:  2016-06

8.  Predicting Audiovisual Word Recognition in Noisy Situations: Toward Precision Audiology.

Authors:  Joel Myerson; Nancy Tye-Murray; Brent Spehar; Sandra Hale; Mitchell Sommers
Journal:  Ear Hear       Date:  2021-11       Impact factor: 3.570

9.  Multi-time resolution analysis of speech: evidence from psychophysics.

Authors:  Maria Chait; Steven Greenberg; Takayuki Arai; Jonathan Z Simon; David Poeppel
Journal:  Front Neurosci       Date:  2015-06-16       Impact factor: 4.677

10.  Acoustic puncture assist device versus loss of resistance technique for epidural space identification.

Authors:  Amit Kumar Mittal; Nitesh Goel; Itee Chowdhury; Shagun Bhatia Shah; Brijesh Pratap Singh; Pradeep Jakhar
Journal:  Indian J Anaesth       Date:  2016-05
