
Influence of head position on the spatial representation of acoustic targets.

H H Goossens, A J van Opstal.

Abstract

Sound localization in humans relies on binaural differences (azimuth cues) and monaural spectral shape information (elevation cues) and is therefore the result of a neural computational process. Despite the fact that these acoustic cues are referenced with respect to the head, accurate eye movements can be generated to sounds in complete darkness. This ability necessitates the use of eye position information. So far, however, sound localization has been investigated mainly with a fixed head position, usually straight ahead. Yet the auditory system may rely on head motor information to maintain a stable and spatially accurate representation of acoustic targets in the presence of head movements. We therefore studied the influence of changes in eye-head position on auditory-guided orienting behavior of human subjects. In the first experiment, we used a visual-auditory double-step paradigm. Subjects made saccadic gaze shifts in total darkness toward brief broadband sounds presented before an intervening eye-head movement that was evoked by an earlier visual target. The data show that the preceding displacements of both eye and head are fully accounted for, resulting in spatially accurate responses. This suggests that auditory target information may be transformed into a spatial (or body-centered) frame of reference. To further investigate this possibility, we exploited the unique property of the auditory system that sound elevation is extracted independently from pinna-related spectral cues. In the absence of such cues, accurate elevation detection is not possible, even when head movements are made. This is shown in a second experiment where pure tones were localized at a fixed elevation that depended on the tone frequency rather than on the actual target elevation, both under head-fixed and -free conditions. To test, in a third experiment, whether the perceived elevation of tones relies on a head- or space-fixed target representation, eye movements were elicited toward pure tones while subjects kept their head in different vertical positions. It appeared that each tone was localized at a fixed, frequency-dependent elevation in space that shifted to a limited extent with changes in head elevation. Hence information about head position is used under static conditions too. Interestingly, the influence of head position also depended on the tone frequency. Thus tone-evoked ocular saccades typically showed a partial compensation for changes in static head position, whereas noise-evoked eye-head saccades fully compensated for intervening changes in eye-head position. We propose that the auditory localization system combines the acoustic input with head-position information to encode targets in a spatial (or body-centered) frame of reference. In this way, accurate orienting responses may be programmed despite intervening eye-head movements. A conceptual model, based on the tonotopic organization of the auditory system, is presented that may account for our findings.
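The partial-versus-full compensation described in the abstract can be captured in a one-line linear sketch. This is an illustrative assumption, not the authors' conceptual model: the function name, the linear form, and the single "compensation" gain are all hypothetical.

```python
# Minimal sketch (an assumption, not the authors' model): how a
# head-referenced spectral elevation estimate and a static head-position
# signal could combine into a gaze endpoint in space. "compensation" is
# a hypothetical gain: ~1 for broadband noise (full compensation),
# between 0 and 1 for pure tones (partial compensation), per the
# abstract's findings.

def gaze_elevation_in_space(spectral_elev_deg: float,
                            head_elev_deg: float,
                            compensation: float) -> float:
    """Predicted elevation (deg, space coordinates) of the gaze endpoint.

    spectral_elev_deg: head-referenced elevation implied by spectral cues;
        for a pure tone this is fixed by frequency, not by true target
        elevation.
    head_elev_deg: static head pitch relative to straight ahead.
    compensation: fraction of head position subtracted from the
        head-referenced estimate before the eye movement is programmed.
    """
    eye_in_head = spectral_elev_deg - compensation * head_elev_deg
    # In space: spectral + (1 - compensation) * head
    return eye_in_head + head_elev_deg
```

With compensation = 1 the response stays at a fixed spatial elevation regardless of head pitch (the noise-evoked behavior); with compensation = 0 it shifts one-to-one with the head; intermediate gains reproduce the "limited shift" observed for tones.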


Year:  1999        PMID: 10368392     DOI: 10.1152/jn.1999.81.6.2720

Source DB:  PubMed          Journal:  J Neurophysiol        ISSN: 0022-3077            Impact factor:   2.714


Similar articles (33 in total)

1.  Influence of static eye and head position on tone-evoked gaze shifts.

Authors:  Tom J Van Grootel; Marc M Van Wanrooij; A John Van Opstal
Journal:  J Neurosci       Date:  2011-11-30       Impact factor: 6.167

2.  Effects of self-motion on auditory scene analysis.

Authors:  Hirohito M Kondo; Daniel Pressnitzer; Iwaki Toshima; Makio Kashino
Journal:  Proc Natl Acad Sci U S A       Date:  2012-04-09       Impact factor: 11.205

3.  Alternating between pro- and antisaccades: switch-costs manifest via decoupling the spatial relations between stimulus and response.

Authors:  Matthew Heath; Caitlin Gillen; Ashna Samani
Journal:  Exp Brain Res       Date:  2015-12-12       Impact factor: 1.972

4.  Perceived touch location is coded using a gaze signal.

Authors:  Lisa M Pritchett; Laurence R Harris
Journal:  Exp Brain Res       Date:  2011-05-11       Impact factor: 1.972

5.  Target modality determines eye-head coordination in nonhuman primates: implications for gaze control.

Authors:  Luis C Populin; Abigail Z Rajala
Journal:  J Neurophysiol       Date:  2011-07-27       Impact factor: 2.714

6.  Reference frames for coding touch location depend on the task.

Authors:  Lisa M Pritchett; Michael J Carnevale; Laurence R Harris
Journal:  Exp Brain Res       Date:  2012-09-01       Impact factor: 1.972

7.  Six Degrees of Auditory Spatial Separation.

Authors:  Simon Carlile; Alex Fox; Emily Orchard-Mills; Johahn Leung; David Alais
Journal:  J Assoc Res Otolaryngol       Date:  2016-03-31

8.  Dependence of auditory spatial updating on vestibular, proprioceptive, and efference copy signals.

Authors:  Daria Genzel; Uwe Firzlaff; Lutz Wiegrebe; Paul R MacNeilage
Journal:  J Neurophysiol       Date:  2016-05-11       Impact factor: 2.714

9.  Sensitivity of the mouse to changes in azimuthal sound location: angular separation, spectral composition, and sound level.

Authors:  Paul D Allen; James R Ison
Journal:  Behav Neurosci       Date:  2010-04       Impact factor: 1.912

10.  Where did that noise come from? Memory for sound locations is exceedingly eccentric both in front and in rear space.

Authors:  Franco Delogu; Phillip McMurray
Journal:  Cogn Process       Date:  2019-06-13
