The Influence of Auditory Cues on Bodily and Movement Perception.

Tasha R Stanton1,2, Charles Spence3.   

Abstract

The sounds that result from our movements and that mark the outcomes of our actions typically convey useful information about the state of our body and its movement, as well as pertinent information about the stimuli with which we are interacting. Here we review the rapidly growing literature investigating the influence of non-veridical auditory cues (i.e., cues that are inaccurate in their context, timing, and/or spectral distribution) on multisensory body and action perception, and on motor behavior. Inaccurate auditory cues provide a unique opportunity to study cross-modal processes: when the senses deliver slightly different messages, the contribution of each is easier to detect. Additionally, given that similar cross-modal processes likely operate regardless of the accuracy of the sensory input, studying incongruent interactions is also likely to help us predict interactions between congruent inputs. The available research convincingly demonstrates that perceptions of the body, of movement, and of surface-contact features (e.g., roughness) are influenced by the addition of non-veridical auditory cues. Moreover, auditory cues affect both motor behavior and emotional valence; with regard to the latter, sounds that are highly incongruent with the performed movement induce feelings of unpleasantness (perhaps associated with lower processing fluency). Such findings are relevant to the design of auditory cues accompanying product interaction and, given the impact on motor behavior, to the use of auditory cues in sport performance and therapeutic settings.
Copyright © 2020 Stanton and Spence.

Keywords:  auditory; body perception; emotional valence; movement; multisensory integration; perception

Year:  2020        PMID: 32010030      PMCID: PMC6978806          DOI: 10.3389/fpsyg.2019.03001

Source DB:  PubMed          Journal:  Front Psychol        ISSN: 1664-1078

