The influence of static eye and head position on the ventriloquist effect.

Denise C P B M Van Barneveld; Marc M Van Wanrooij

Abstract

Orienting responses to audiovisual events have shorter reaction times and better accuracy and precision when images and sounds in the environment are aligned in space and time. How the brain constructs an integrated audiovisual percept is a computational puzzle because the auditory and visual senses are represented in different reference frames: the retina encodes visual locations with respect to the eyes, whereas the sound localisation cues are referenced to the head. In the well-known ventriloquist effect, the auditory spatial percept of the ventriloquist's voice is attracted toward the synchronous visual image of the dummy; but does this visual bias on sound localisation operate in a common reference frame by correctly taking into account eye and head position? Here we studied this question by independently varying initial eye and head orientations, and the amount of audiovisual spatial mismatch. Human subjects pointed head and/or gaze to auditory targets in elevation, and were instructed to ignore co-occurring visual distracters. Results demonstrate that different initial head and eye orientations are accurately and appropriately incorporated into an audiovisual response. Effectively, sounds and images are perceptually fused according to their physical locations in space, independent of an observer's point of view. Implications for neurophysiological findings and modelling efforts that aim to reconcile sensory and motor signals for goal-directed behaviour are discussed.
© 2013 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

Year:  2013        PMID: 23463919     DOI: 10.1111/ejn.12176

Source DB:  PubMed          Journal:  Eur J Neurosci        ISSN: 0953-816X            Impact factor:   3.386


  6 in total

1.  Predicting auditory space calibration from recent multisensory experience.

Authors:  Catarina Mendonça; Andreas Escher; Steven van de Par; Hans Colonius
Journal:  Exp Brain Res       Date:  2015-03-21       Impact factor: 1.972

2.  Looking at the ventriloquist: visual outcome of eye movements calibrates sound localization.

Authors:  Daniel S Pages; Jennifer M Groh
Journal:  PLoS One       Date:  2013-08-29       Impact factor: 3.240

3.  Temporal Cortex Activation to Audiovisual Speech in Normal-Hearing and Cochlear Implant Users Measured with Functional Near-Infrared Spectroscopy.

Authors:  Luuk P H van de Rijt; A John van Opstal; Emmanuel A M Mylanus; Louise V Straatman; Hai Yin Hu; Ad F M Snik; Marc M van Wanrooij
Journal:  Front Hum Neurosci       Date:  2016-02-11       Impact factor: 3.169

4.  Learning to localise weakly-informative sound spectra with and without feedback.

Authors:  Bahram Zonooz; Elahe Arani; A John Van Opstal
Journal:  Sci Rep       Date:  2018-12-18       Impact factor: 4.379

5.  Sound Localization in Real-Time Vocoded Cochlear-Implant Simulations With Normal-Hearing Listeners.

Authors:  Sebastian A Ausili; Bradford Backus; Martijn J H Agterberg; A John van Opstal; Marc M van Wanrooij
Journal:  Trends Hear       Date:  2019 Jan-Dec       Impact factor: 3.293

6.  The Principle of Inverse Effectiveness in Audiovisual Speech Perception.

Authors:  Luuk P H van de Rijt; Anja Roye; Emmanuel A M Mylanus; A John van Opstal; Marc M van Wanrooij
Journal:  Front Hum Neurosci       Date:  2019-09-26       Impact factor: 3.169
