Judging sound rotation when listeners and sounds rotate: Sound source localization is a multisystem process.

William A Yost1, Xuan Zhong1, Anbar Najam1.   

Abstract

In four experiments, listeners were either rotated or stationary, and sounds either came from a stationary loudspeaker or rotated from loudspeaker to loudspeaker around an azimuth array. When either sounds or listeners rotate, the auditory cues used for sound source localization change; yet in the everyday world, listeners perceive sound rotation only when sounds rotate, not when listeners rotate. In the everyday world, sound source locations are referenced to positions in the environment (a world-centric reference system). The auditory cues for sound source location, however, indicate locations relative to the head (a head-centric reference system), not relative to the world. This paper addresses the general hypothesis that determining the world-centric location of sound sources requires the auditory system to combine the auditory cues for sound source location with information about head position. The use of visual and vestibular information in determining rotating head position during sound rotation perception was investigated. The experiments show that the perception of sound rotation when sources and listeners rotate was based on acoustic, visual, and perhaps vestibular information. The findings are consistent with the general hypothesis and suggest that sound source localization is not based on acoustics alone: it is a multisystem process.

Year:  2015        PMID: 26627802     DOI: 10.1121/1.4935091

Source DB:  PubMed          Journal:  J Acoust Soc Am        ISSN: 0001-4966            Impact factor:   1.840


Related articles (17 in total):

1.  Sound-source localization as a multisystem process: The Wallach azimuth illusion.

Authors:  William A Yost; M Torben Pastore; Kathryn R Pulling
Journal:  J Acoust Soc Am       Date:  2019-07       Impact factor: 1.840

2.  Discrimination of changes in spatial configuration for multiple, simultaneously presented sounds.

Authors:  William A Yost; M Torben Pastore; Yi Zhou
Journal:  J Acoust Soc Am       Date:  2019-04       Impact factor: 1.840

3.  Auditory motion tracking ability of adults with normal hearing and with bilateral cochlear implants.

Authors:  Keng Moua; Alan Kan; Heath G Jones; Sara M Misurelli; Ruth Y Litovsky
Journal:  J Acoust Soc Am       Date:  2019-04       Impact factor: 1.840

4.  How many images are in an auditory scene?

Authors:  Xuan Zhong; William A Yost
Journal:  J Acoust Soc Am       Date:  2017-04       Impact factor: 1.840

5.  Loudness of an auditory scene composed of multiple talkers.

Authors:  William A Yost; M Torben Pastore; Kathryn R Pulling
Journal:  J Acoust Soc Am       Date:  2018-09       Impact factor: 1.840

6.  Sound source localization identification accuracy: Level and duration dependencies.

Authors:  William A Yost
Journal:  J Acoust Soc Am       Date:  2016-07       Impact factor: 1.840

7.  Dependence of auditory spatial updating on vestibular, proprioceptive, and efference copy signals.

Authors:  Daria Genzel; Uwe Firzlaff; Lutz Wiegrebe; Paul R MacNeilage
Journal:  J Neurophysiol       Date:  2016-05-11       Impact factor: 2.714

8.  Early multisensory integration of self and source motion in the auditory system.

Authors:  Eyal Wigderson; Israel Nelken; Yosef Yarom
Journal:  Proc Natl Acad Sci U S A       Date:  2016-06-29       Impact factor: 11.205

9.  Sound source localization is a multisystem process.

Authors:  William A Yost; M Torben Pastore; Michael F Dorman
Journal:  Acoust Sci Technol       Date:  2020-01

10.  Auditory motion parallax.

Authors:  William A Yost
Journal:  Proc Natl Acad Sci U S A       Date:  2018-04-05       Impact factor: 11.205
