Literature DB >> 25721795

Auditory feedback in error-based learning of motor regularity.

Floris T van Vugt, Barbara Tillmann.

Abstract

Music and speech are skills that require high temporal precision of motor output. A key question is how humans achieve this timing precision given the poor temporal resolution of somatosensory feedback, which is classically considered to drive motor learning. We hypothesise that auditory feedback critically contributes to the learning of timing and that, similarly to visuo-spatial learning models, learning proceeds by correcting a proportion of perceived timing errors. Thirty-six participants learned to tap a sequence regularly in time. For participants in the synchronous-sound group, a tone was presented simultaneously with every keystroke. For the jittered-sound group, the tone was presented after a random delay of 10-190 ms following the keystroke, thus degrading the temporal information that the sound provided about the movement. For the mute group, no keystroke-triggered sound was presented. In line with the model predictions, participants in the synchronous-sound group were able to improve tapping regularity, whereas the jittered-sound and mute groups were not. The improved tapping regularity of the synchronous-sound group also transferred to a novel sequence and was maintained when sound was subsequently removed. The present findings provide evidence that humans engage in auditory feedback error-based learning to improve movement quality (here, reduced variability in sequence tapping). We thus elucidate the mechanism by which high temporal precision of movement can be achieved through sound in a way that may not be possible with less temporally precise somatosensory modalities. Furthermore, the finding that sound-supported learning generalises to novel sequences suggests potential rehabilitation applications.
Copyright © 2015 Elsevier B.V. All rights reserved.
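The proportional error-correction account invoked in the abstract can be illustrated with a toy simulation: an internal timekeeper that drifts from tap to tap, with each perceived timing error partially corrected when feedback is available. All parameter values and the drift/noise model below are illustrative assumptions for the sketch, not quantities taken from the paper.

```python
import random
import statistics


def simulate_tapping(n_taps=2000, target=0.5, alpha=0.3,
                     motor_sd=0.01, drift_sd=0.005,
                     feedback=True, seed=1):
    """Simulate inter-tap intervals (seconds).

    The planned interval drifts as a random walk (timekeeper noise).
    With feedback, a proportion `alpha` of each perceived timing
    error is corrected, which keeps the plan anchored to the target;
    without feedback the drift goes uncorrected and variability grows.
    """
    rng = random.Random(seed)
    plan = target
    intervals = []
    for _ in range(n_taps):
        plan += rng.gauss(0.0, drift_sd)          # timekeeper drift
        produced = plan + rng.gauss(0.0, motor_sd)  # motor noise
        intervals.append(produced)
        if feedback:
            error = produced - target             # perceived timing error
            plan -= alpha * error                 # correct a proportion
    return intervals


with_fb = simulate_tapping(feedback=True)
without_fb = simulate_tapping(feedback=False)
print("SD with feedback:   ", statistics.pstdev(with_fb))
print("SD without feedback:", statistics.pstdev(without_fb))
```

Under these assumptions the feedback condition yields markedly lower interval variability, mirroring the synchronous-sound group's advantage over the jittered-sound and mute groups.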

Keywords:  Action–perception coupling; Auditory feedback; Feedback error-based learning; Motor learning; Movement variability; Music; Timing

Year:  2015        PMID: 25721795     DOI: 10.1016/j.brainres.2015.02.026

Source DB:  PubMed          Journal:  Brain Res        ISSN: 0006-8993            Impact factor:   3.252


Related articles: 6 in total

1.  Online sonification for golf putting gesture: reduced variability of motor behaviour and perceptual judgement.

Authors:  Benjamin O'Brien; Brett Juhas; Marta Bieńkiewicz; Frank Buloup; Lionel Bringoux; Christophe Bourdin
Journal:  Exp Brain Res       Date:  2020-03-11       Impact factor: 1.972

2.  Transposing musical skill: sonification of movement as concurrent augmented feedback enhances learning in a bimanual task.

Authors:  John Dyer; Paul Stapleton; Matthew Rodger
Journal:  Psychol Res       Date:  2016-05-27

3.  Auditory Proprioceptive Integration: Effects of Real-Time Kinematic Auditory Feedback on Knee Proprioception.

Authors:  Shashank Ghai; Gerd Schmitz; Tong-Hun Hwang; Alfred O Effenberg
Journal:  Front Neurosci       Date:  2018-03-08       Impact factor: 4.677

4.  Advantages of melodic over rhythmic movement sonification in bimanual motor skill learning.

Authors:  J F Dyer; P Stapleton; M W M Rodger
Journal:  Exp Brain Res       Date:  2017-07-26       Impact factor: 1.972

5.  The prevalence of the Val66Met polymorphism in musicians: Possible evidence for compensatory neuroplasticity from a pilot study.

Authors:  Tara L Henechowicz; Joyce L Chen; Leonardo G Cohen; Michael H Thaut
Journal:  PLoS One       Date:  2021-06-09       Impact factor: 3.240

6.  Development and evaluation of a novel music-based therapeutic device for upper extremity movement training: A pre-clinical, single-arm trial.

Authors:  Nina Schaffert; Thenille Braun Janzen; Roy Ploigt; Sebastian Schlüter; Veronica Vuong; Michael H Thaut
Journal:  PLoS One       Date:  2020-11-19       Impact factor: 3.240

Beijing Coyote Bioscience Co., Ltd. © 2022-2023.