
Psychobiological Responses Reveal Audiovisual Noise Differentially Challenges Speech Recognition.

Gavin M Bidelman, Bonnie Brown, Kelsey Mankel, Caitlin Nelms Price.

Abstract

OBJECTIVES: In noisy environments, listeners benefit from both hearing and seeing a talker, demonstrating that audiovisual (AV) cues enhance speech-in-noise (SIN) recognition. Here, we examined the relative contributions of auditory and visual cues to SIN perception and the strategies listeners use to decipher speech amid noise interference.
DESIGN: Normal-hearing listeners (n = 22) performed an open-set speech recognition task while viewing audiovisual TIMIT sentences presented under different combinations of signal degradation including visual (AVn), audio (AnV), or multimodal (AnVn) noise. Acoustic and visual noises were matched in physical signal-to-noise ratio. Eyetracking monitored participants' gaze to different parts of a talker's face during SIN perception.
RESULTS: As expected, behavioral performance for clean sentence recognition was better for A-only and AV compared to V-only speech. Similarly, with noise in the auditory channel (AnV and AnVn speech), performance was aided by the addition of visual cues of the talker regardless of whether the visual channel contained noise, confirming a multimodal benefit to SIN recognition. The addition of visual noise (AVn) obscuring the talker's face had little effect on speech recognition by itself. Listeners' eye gaze fixations were biased toward the eyes (decreased at the mouth) whenever the auditory channel was compromised. Fixating on the eyes was negatively associated with SIN recognition performance. Eye gazes on the mouth versus eyes of the face also depended on the gender of the talker.
CONCLUSIONS: Collectively, results suggest listeners (1) depend heavily on the auditory over visual channel when seeing and hearing speech and (2) alter their visual strategy from viewing the mouth to viewing the eyes of a talker with signal degradations, which negatively affects speech perception.


Year: 2020    PMID: 31283529    PMCID: PMC6939137    DOI: 10.1097/AUD.0000000000000755

Source DB: PubMed    Journal: Ear Hear    ISSN: 0196-0202    Impact factor: 3.570


References (66 in total; 10 shown)

1.  Infants deploy selective attention to the mouth of a talking face when learning speech.

Authors:  David J Lewkowicz; Amy M Hansen-Tift
Journal:  Proc Natl Acad Sci U S A       Date:  2012-01-17       Impact factor: 11.205

2.  Development of a quick speech-in-noise test for measuring signal-to-noise ratio loss in normal-hearing and hearing-impaired listeners.

Authors:  Mead C Killion; Patricia A Niquette; Gail I Gudmundsen; Lawrence J Revit; Shilpi Banerjee
Journal:  J Acoust Soc Am       Date:  2004-10       Impact factor: 1.840

3.  Hearing lips in a second language: visual articulatory information enables the perception of second language sounds.

Authors:  Jordi Navarra; Salvador Soto-Faraco
Journal:  Psychol Res       Date:  2005-12-14

4.  On the dimensionality of face space.

Authors:  Marsha Meytlis; Lawrence Sirovich
Journal:  IEEE Trans Pattern Anal Mach Intell       Date:  2007-07       Impact factor: 6.226

5.  Older adults expend more listening effort than young adults recognizing audiovisual speech in noise.

Authors:  Penny Anderson Gosselin; Jean-Pierre Gagné
Journal:  Int J Audiol       Date:  2011-09-15       Impact factor: 2.117

6.  Face exploration dynamics differentiate men and women.

Authors:  Antoine Coutrot; Nicola Binetti; Charlotte Harrison; Isabelle Mareschal; Alan Johnston
Journal:  J Vis       Date:  2016-11-01       Impact factor: 2.240

7.  Developmental Trajectory of McGurk Effect Susceptibility in Children and Adults With Amblyopia.

Authors:  Cindy Narinesingh; Herbert C Goltz; Rana Arham Raashid; Agnes M F Wong
Journal:  Invest Ophthalmol Vis Sci       Date:  2015-03-05       Impact factor: 4.799

8.  A "rationalized" arcsine transform.

Authors:  G A Studebaker
Journal:  J Speech Hear Res       Date:  1985-09

9.  The effect of combined sensory and semantic components on audio-visual speech perception in older adults.

Authors:  Corrina Maguinness; Annalisa Setti; Kate E Burke; Rose Anne Kenny; Fiona N Newell
Journal:  Front Aging Neurosci       Date:  2011-12-22       Impact factor: 5.750

10.  Effect of Simultaneous Bilingualism on Speech Intelligibility across Different Masker Types, Modalities, and Signal-to-Noise Ratios in School-Age Children.

Authors:  Rachel Reetzke; Boji Pak-Wing Lam; Zilong Xie; Li Sheng; Bharath Chandrasekaran
Journal:  PLoS One       Date:  2016-12-09       Impact factor: 3.240

3 in total

1.  Acoustic noise and vision differentially warp the auditory categorization of speech.

Authors:  Gavin M Bidelman; Lauren Sigley; Gwyneth A Lewis
Journal:  J Acoust Soc Am       Date:  2019-07       Impact factor: 1.840

2.  Autonomic Nervous System Correlates of Speech Categorization Revealed Through Pupillometry.

Authors:  Gwyneth A Lewis; Gavin M Bidelman
Journal:  Front Neurosci       Date:  2020-01-10       Impact factor: 4.677

3.  Effects of training and using an audio-tactile sensory substitution device on speech-in-noise understanding.

Authors:  K Cieśla; T Wolak; A Lorens; M Mentzel; H Skarżyński; A Amedi
Journal:  Sci Rep       Date:  2022-02-25       Impact factor: 4.996

