
Non-spatial sounds regulate eye movements and enhance visual search.

Heng Zou, Hermann J Müller, Zhuanghua Shi.

Abstract

Spatially uninformative sounds can enhance visual search when the sounds are synchronized with color changes of the visual target, a phenomenon referred to as the "pip-and-pop" effect (van der Burg, Olivers, Bronkhorst, & Theeuwes, 2008). The present study investigated the relationship of this effect to changes in oculomotor scanning behavior induced by the sounds. The results revealed that sound events increased fixation durations upon their occurrence and decreased the mean number of saccades. More specifically, spatially uninformative sounds facilitated the orienting of ocular scanning away from already scanned display regions not containing a target (Experiment 1) and enhanced search performance even on target-absent trials (Experiment 2). Facilitation was also observed when the sounds were presented 100 ms prior to the target or at random (Experiment 3). These findings suggest that non-spatial sounds cause a general freezing effect on oculomotor scanning behavior, which in turn benefits visual search performance through temporally and spatially extended information sampling.

Year:  2012        PMID: 22562709     DOI: 10.1167/12.5.2

Source DB:  PubMed          Journal:  J Vis        ISSN: 1534-7362            Impact factor:   2.240


Related articles (17 in total):

1.  Novel names extend for how long preschool children sample visual information.

Authors:  Paulo F Carvalho; Catarina Vales; Caitlin M Fausey; Linda B Smith
Journal:  J Exp Child Psychol       Date:  2017-12-26

2.  Audio-visual spatial alignment improves integration in the presence of a competing audio-visual stimulus.

Authors:  Justin T Fleming; Abigail L Noyce; Barbara G Shinn-Cunningham
Journal:  Neuropsychologia       Date:  2020-06-20       Impact factor: 3.139

3.  The influence of auditory rhythms on the speed of inferred motion.

Authors:  Timothy B Patrick; Richard B Anderson
Journal:  Atten Percept Psychophys       Date:  2021-08-25       Impact factor: 2.157

4.  Exploring the effectiveness of auditory, visual, and audio-visual sensory cues in a multiple object tracking environment.

Authors:  Julia Föcker; Polly Atkins; Foivos-Christos Vantzos; Maximilian Wilhelm; Thomas Schenk; Hauke S Meyerhoff
Journal:  Atten Percept Psychophys       Date:  2022-05-24       Impact factor: 2.157

5.  [Review] The interactions of multisensory integration with endogenous and exogenous attention.

Authors:  Xiaoyu Tang; Jinglong Wu; Yong Shen
Journal:  Neurosci Biobehav Rev       Date:  2015-11-10       Impact factor: 8.989

6.  Temporal structure in audiovisual sensory selection.

Authors:  Anne Kösem; Virginie van Wassenhove
Journal:  PLoS One       Date:  2012-07-19       Impact factor: 3.240

7.  Audiovisual integration in near and far space: effects of changes in distance and stimulus effectiveness.

Authors:  N Van der Stoep; S Van der Stigchel; T C W Nijboer; M J Van der Smagt
Journal:  Exp Brain Res       Date:  2015-03-19       Impact factor: 1.972

8.  Equivalent Behavioral Facilitation to Tactile Cues in Children with Autism Spectrum Disorder.

Authors:  Girija Kadlaskar; Sophia Bergmann; Rebecca McNally Keehn; Amanda Seidl; Brandon Keehn
Journal:  Brain Sci       Date:  2021-05-13

9.  Eye Movements during Auditory Attention Predict Individual Differences in Dorsal Attention Network Activity.

Authors:  Rodrigo M Braga; Richard Z Fu; Barry M Seemungal; Richard J S Wise; Robert Leech
Journal:  Front Hum Neurosci       Date:  2016-05-09       Impact factor: 3.169

10.  Visuo-perceptual capabilities predict sensitivity for coinciding auditory and visual transients in multi-element displays.

Authors:  Hauke S Meyerhoff; Nina A Gehrer
Journal:  PLoS One       Date:  2017-09-13       Impact factor: 3.240


Beijing Coyote Bioscience Co., Ltd. © 2022-2023.