
A visual or tactile signal makes auditory speech detection more efficient by reducing uncertainty.

Bosco S Tjan, Ewen Chao, Lynne E Bernstein

Abstract

Acoustic speech is easier to detect in noise when the talker can be seen. This finding could be explained by integration of multisensory inputs or refinement of auditory processing from visual guidance. In two experiments, we studied two-interval forced-choice detection of an auditory 'ba' in acoustic noise, paired with various visual and tactile stimuli that were identically presented in the two observation intervals. Detection thresholds were reduced under the multisensory conditions vs. the auditory-only condition, even though the visual and/or tactile stimuli alone could not inform the correct response. Results were analysed relative to an ideal observer for which intrinsic (internal) noise and efficiency were independent contributors to detection sensitivity. Across experiments, intrinsic noise was unaffected by the multisensory stimuli, arguing against the merging (integrating) of multisensory inputs into a unitary speech signal, but sampling efficiency was increased to varying degrees, supporting refinement of knowledge about the auditory stimulus. The steepness of the psychometric functions decreased with increasing sampling efficiency, suggesting that the 'task-irrelevant' visual and tactile stimuli reduced uncertainty about the acoustic signal. Visible speech was not superior for enhancing auditory speech detection. Our results reject multisensory neuronal integration and speech-specific neural processing as explanations for the enhanced auditory speech detection under noisy conditions. Instead, they support a more rudimentary form of multisensory interaction: the otherwise task-irrelevant sensory systems inform the auditory system about when to listen.
© 2014 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.
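The analysis described above separates two independent contributors to detection sensitivity: equivalent internal (intrinsic) noise and sampling efficiency relative to an ideal observer. A minimal sketch of that decomposition, assuming the standard equivalent-input-noise formulation in which threshold signal energy grows linearly with external noise spectral density, E_th = (k_ideal / η) · (N_ext + N_eq); the function and variable names here are illustrative, not from the paper:

```python
def fit_noise_efficiency(n1, e1, n2, e2, k_ideal=1.0):
    """Estimate sampling efficiency (eta) and equivalent internal noise (N_eq)
    from detection thresholds measured at two external noise levels.

    Assumes the linear equivalent-input-noise model:
        E_th = (k_ideal / eta) * (N_ext + N_eq)
    where k_ideal is the ideal observer's threshold energy-to-noise ratio.
    """
    slope = (e2 - e1) / (n2 - n1)   # slope of E_th vs. N_ext = k_ideal / eta
    intercept = e1 - slope * n1     # intercept = slope * N_eq
    eta = k_ideal / slope           # sampling efficiency (0 < eta <= 1)
    n_eq = intercept / slope        # equivalent internal noise level
    return eta, n_eq

# Illustrative thresholds at two external noise levels:
eta, n_eq = fit_noise_efficiency(0.0, 4.0, 10.0, 24.0)
```

Under this model, a multisensory cue that raises eta while leaving n_eq unchanged lowers thresholds without any merging of sensory signals, which is the pattern of results the abstract reports.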

Keywords:  ideal-observer analysis; multisensory enhancement; speech detection

Year:  2014        PMID: 24400652      PMCID: PMC3997613          DOI: 10.1111/ejn.12471

Source DB:  PubMed          Journal:  Eur J Neurosci        ISSN: 0953-816X            Impact factor:   3.386


References: 41 in total

1.  The timing and laminar profile of converging inputs to multisensory areas of the macaque neocortex.

Authors:  Charles E Schroeder; John J Foxe
Journal:  Brain Res Cogn Brain Res       Date:  2002-06

2.  Hands help hearing: facilitatory audiotactile interaction at low sound-intensity levels.

Authors:  Martin Schürmann; Gina Caetano; Veikko Jousmäki; Riitta Hari
Journal:  J Acoust Soc Am       Date:  2004-02       Impact factor: 1.840

3.  Uncertainty and invariance in the human visual cortex.

Authors:  Bosco S Tjan; Vaia Lestou; Zoe Kourtzi
Journal:  J Neurophysiol       Date:  2006-05-24       Impact factor: 2.714

4.  Tactile enhancement of auditory detection and perceived loudness.

Authors:  Helge Gillmeister; Martin Eimer
Journal:  Brain Res       Date:  2007-03-20       Impact factor: 3.252

5.  Visual modulation of neurons in auditory cortex.

Authors:  Christoph Kayser; Christopher I Petkov; Nikos K Logothetis
Journal:  Cereb Cortex       Date:  2008-01-06       Impact factor: 5.357

6.  Quantified acoustic-optical speech signal incongruity identifies cortical sites of audiovisual speech processing.

Authors:  Lynne E Bernstein; Zhong-Lin Lu; Jintao Jiang
Journal:  Brain Res       Date:  2008-04-18       Impact factor: 3.252

7.  Signal detection analysis of effect of white noise intensity on sensitivity to visual flicker.

Authors:  D W Harper
Journal:  Percept Mot Skills       Date:  1979-06

8.  Ideal observer analysis of crowding and the reduction of crowding through learning.

Authors:  Gerald J Sun; Susana T L Chung; Bosco S Tjan
Journal:  J Vis       Date:  2010-05-01       Impact factor: 2.240

9.  Onset timing of cross-sensory activations and multisensory interactions in auditory and visual sensory cortices.

Authors:  Tommi Raij; Jyrki Ahveninen; Fa-Hsuan Lin; Thomas Witzel; Iiro P Jääskeläinen; Benjamin Letham; Emily Israeli; Cherif Sahyoun; Christos Vasios; Steven Stufflebeam; Matti Hämäläinen; John W Belliveau
Journal:  Eur J Neurosci       Date:  2010-05       Impact factor: 3.386

10.  Enhanced visual speech perception in individuals with early-onset hearing impairment.

Authors:  Edward T Auer; Lynne E Bernstein
Journal:  J Speech Lang Hear Res       Date:  2007-10       Impact factor: 2.297

Cited by: 8 in total

1.  Developmental Shifts in Detection and Attention for Auditory, Visual, and Audiovisual Speech.

Authors:  Susan Jerger; Markus F Damian; Cassandra Karl; Hervé Abdi
Journal:  J Speech Lang Hear Res       Date:  2018-12-10       Impact factor: 2.297

2.  Free viewing of talking faces reveals mouth and eye preferring regions of the human superior temporal sulcus.

Authors:  Johannes Rennig; Michael S Beauchamp
Journal:  Neuroimage       Date:  2018-08-06       Impact factor: 6.556

3.  Detection and Attention for Auditory, Visual, and Audiovisual Speech in Children with Hearing Loss.

Authors:  Susan Jerger; Markus F Damian; Cassandra Karl; Hervé Abdi
Journal:  Ear Hear       Date:  2020 May/Jun       Impact factor: 3.570

4.  Congruent Visual Speech Enhances Cortical Entrainment to Continuous Auditory Speech in Noise-Free Conditions.

Authors:  Michael J Crosse; John S Butler; Edmund C Lalor
Journal:  J Neurosci       Date:  2015-10-21       Impact factor: 6.167

5.  Rethinking Social Cognition in Light of Psychosis: Reciprocal Implications for Cognition and Psychopathology.

Authors:  Vaughan Bell; Kathryn L Mills; Gemma Modinos; Sam Wilkinson
Journal:  Clin Psychol Sci       Date:  2017-02-10

6.  Visual speech discrimination and identification of natural and synthetic consonant stimuli.

Authors:  Benjamin T Files; Bosco S Tjan; Jintao Jiang; Lynne E Bernstein
Journal:  Front Psychol       Date:  2015-07-13

7.  Comparison of informational vs. energetic masking effects on speechreading performance.

Authors:  Björn Lidestam; Johan Holgersson; Shahram Moradi
Journal:  Front Psychol       Date:  2014-06-24

8.  Audiovisual spoken word training can promote or impede auditory-only perceptual learning: prelingually deafened adults with late-acquired cochlear implants versus normal hearing adults.

Authors:  Lynne E Bernstein; Silvio P Eberhardt; Edward T Auer
Journal:  Front Psychol       Date:  2014-08-26
