Training enhances the ability of listeners to exploit visual information for auditory scene analysis.

Huriye Atilgan; Jennifer K Bizley

Abstract

The ability to use temporal relationships between cross-modal cues facilitates perception and behavior. Previously we observed that temporally correlated changes in the size of a visual stimulus and the intensity of an auditory stimulus influenced the ability of listeners to perform an auditory selective attention task (Maddox, Atilgan, Bizley, & Lee, 2015). Participants detected timbral changes in a target sound while ignoring those in a simultaneously presented masker. When the visual stimulus was temporally coherent with the target sound, performance was significantly better than when the visual stimulus was temporally coherent with the masker, despite the visual stimulus conveying no task-relevant information. Here, we trained observers to detect audiovisual temporal coherence and asked whether this changed the way in which they were able to exploit visual information in the auditory selective attention task. We observed that after training, participants were able to benefit from temporal coherence between the visual stimulus and both the target and masker streams, relative to the condition in which the visual stimulus was coherent with neither sound. However, we did not observe such changes in a second group that was trained to discriminate modulation-rate differences between temporally coherent audiovisual streams, although that group did show an improvement in overall performance. A control group did not change its performance between pretest and post-test and did not change how it exploited visual information. These results provide insights into how cross-modal experience may optimize multisensory integration.
Copyright © 2020 The Author(s). Published by Elsevier B.V. All rights reserved.

Keywords:  Audiovisual integration; Auditory scene analysis; Selective attention; Temporal processing; Training

Year:  2020        PMID: 33373937      PMCID: PMC7868888          DOI: 10.1016/j.cognition.2020.104529

Source DB:  PubMed          Journal:  Cognition        ISSN: 0010-0277


References (37 in total):

1.  Causal inference in perception. [Review]

Authors:  Ladan Shams; Ulrik R Beierholm
Journal:  Trends Cogn Sci       Date:  2010-08-11       Impact factor: 20.229

2.  Zapping the gap: Reducing the multisensory temporal binding window by means of transcranial direct current stimulation (tDCS).

Authors:  Sharon Zmigrod; Leor Zmigrod
Journal:  Conscious Cogn       Date:  2015-05-29

3.  Determinants of multisensory integration in superior colliculus neurons. I. Temporal factors.

Authors:  M A Meredith; J W Nemitz; B E Stein
Journal:  J Neurosci       Date:  1987-10       Impact factor: 6.167

4.  The detection of auditory visual desynchrony.

Authors:  N F Dixon; L Spitz
Journal:  Perception       Date:  1980       Impact factor: 1.490

5.  Transfer of Audio-Visual Temporal Training to Temporal and Spatial Audio-Visual Tasks.

Authors:  Ralf Sürig; Davide Bottari; Brigitte Röder
Journal:  Multisens Res       Date:  2018-01-01       Impact factor: 2.286

6.  Multisensory temporal integration: task and stimulus dependencies.

Authors:  Ryan A Stevenson; Mark T Wallace
Journal:  Exp Brain Res       Date:  2013-04-21       Impact factor: 1.972

7.  Auditory selective attention is enhanced by a task-irrelevant temporally coherent visual stimulus in human listeners.

Authors:  Ross K Maddox; Huriye Atilgan; Jennifer K Bizley; Adrian K C Lee
Journal:  Elife       Date:  2015-02-05       Impact factor: 8.140

8.  Perceptual learning shapes multisensory causal inference via two distinct mechanisms.

Authors:  David P McGovern; Eugenie Roudaia; Fiona N Newell; Neil W Roach
Journal:  Sci Rep       Date:  2016-04-19       Impact factor: 4.379

9.  Overlearning hyperstabilizes a skill by rapidly making neurochemical processing inhibitory-dominant.

Authors:  Kazuhisa Shibata; Yuka Sasaki; Ji Won Bang; Edward G Walsh; Maro G Machizawa; Masako Tamaki; Li-Hung Chang; Takeo Watanabe
Journal:  Nat Neurosci       Date:  2017-01-30       Impact factor: 24.884

10.  Training enhances the ability of listeners to exploit visual information for auditory scene analysis.

Authors:  Huriye Atilgan; Jennifer K Bizley
Journal:  Cognition       Date:  2020-12-26
Cited by (2 in total):

1.  Binding the Acoustic Features of an Auditory Source through Temporal Coherence.

Authors:  Mohsen Rezaeizadeh; Shihab Shamma
Journal:  Cereb Cortex Commun       Date:  2021-10-06

2.  Training enhances the ability of listeners to exploit visual information for auditory scene analysis.

Authors:  Huriye Atilgan; Jennifer K Bizley
Journal:  Cognition       Date:  2020-12-26
