
Musical emotions: predicting second-by-second subjective feelings of emotion from low-level psychoacoustic features and physiological measurements.

Eduardo Coutinho, Angelo Cangelosi

Abstract

We maintain that the structure of affect elicited by music is largely dependent on dynamic temporal patterns in low-level music structural parameters. In support of this claim, we have previously provided evidence that spatiotemporal dynamics in psychoacoustic features resonate with two psychological dimensions of affect underlying judgments of subjective feelings: arousal and valence. In this article we extend our previous investigations in two respects. First, we focus on the emotions experienced rather than perceived while listening to music. Second, we evaluate the extent to which peripheral feedback in music can account for the predicted emotional responses, that is, the role of physiological arousal in determining the intensity and valence of musical emotions. Consistent with our previous findings, we show that a significant part of the listeners' reported emotions can be predicted from a set of six psychoacoustic features--loudness, pitch level, pitch contour, tempo, texture, and sharpness. Furthermore, the accuracy of those predictions is improved by the inclusion of physiological cues--skin conductance and heart rate. The interdisciplinary work presented here provides a new methodology for the field of music and emotion research based on the combination of computational and experimental work, which aids the analysis of emotional responses to music while offering a platform for the abstract representation of those complex relationships. Future developments may aid specific areas, such as psychology and music therapy, by providing coherent descriptions of the emotional effects of specific music stimuli. © 2011 APA, all rights reserved
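The second-by-second prediction the abstract describes can be sketched as a small recurrent model that maps per-second feature values to (arousal, valence) estimates, with a hidden state carrying temporal context between seconds. This is a minimal illustrative sketch, not the authors' implementation: the feature matrix, weight values, and network size are all hypothetical placeholders (in practice the weights would be trained on listeners' continuous ratings).

```python
import numpy as np

# Hypothetical second-by-second feature matrix: rows are time steps (seconds),
# columns are the six psychoacoustic features named in the abstract (loudness,
# pitch level, pitch contour, tempo, texture, sharpness) plus the two
# physiological cues (skin conductance, heart rate).
rng = np.random.default_rng(0)
T, n_features, n_hidden = 60, 8, 4
features = rng.standard_normal((T, n_features))

# Elman-style recurrent weights -- random placeholders for illustration only;
# a real model would fit these to continuous arousal/valence ratings.
W_in = rng.standard_normal((n_hidden, n_features)) * 0.1
W_rec = rng.standard_normal((n_hidden, n_hidden)) * 0.1
W_out = rng.standard_normal((2, n_hidden)) * 0.1

def predict_affect(x):
    """Return a (T, 2) array of per-second (arousal, valence) estimates."""
    h = np.zeros(n_hidden)
    out = np.empty((len(x), 2))
    for t, frame in enumerate(x):
        # hidden state mixes the current second's features with prior context
        h = np.tanh(W_in @ frame + W_rec @ h)
        out[t] = W_out @ h  # one (arousal, valence) pair per second
    return out

predictions = predict_affect(features)
print(predictions.shape)
```

Dropping the last two feature columns would correspond to the psychoacoustics-only condition; the abstract reports that adding the physiological columns improves prediction accuracy.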


Year:  2011        PMID: 21859207     DOI: 10.1037/a0024700

Source DB:  PubMed          Journal:  Emotion        ISSN: 1528-3542


Related articles (16 in total)

1.  ECoG high gamma activity reveals distinct cortical representations of lyrics passages, harmonic and timbre-related changes in a rock song.

Authors:  Irene Sturm; Benjamin Blankertz; Cristhian Potes; Gerwin Schalk; Gabriel Curio
Journal:  Front Hum Neurosci       Date:  2014-10-13       Impact factor: 3.169

2.  Female Listeners' Autonomic Responses to Dramatic Shifts Between Loud and Soft Music/Sound Passages: A Study of Heavy Metal Songs.

Authors:  Tzu-Han Cheng; Chen-Gia Tsai
Journal:  Front Psychol       Date:  2016-02-17

3.  Identifying Core Affect in Individuals from fMRI Responses to Dynamic Naturalistic Audiovisual Stimuli.

Authors:  Jongwan Kim; Jing Wang; Douglas H Wedell; Svetlana V Shinkareva
Journal:  PLoS One       Date:  2016-09-06       Impact factor: 3.240

4.  Enhancement of Pleasure during Spontaneous Dance.

Authors:  Nicolò F Bernardi; Antoine Bellemare-Pepin; Isabelle Peretz
Journal:  Front Hum Neurosci       Date:  2017-11-29       Impact factor: 3.169

5.  Emotional Responses to Music: Shifts in Frontal Brain Asymmetry Mark Periods of Musical Change.

Authors:  Hussain-Abdulah Arjmand; Jesper Hohagen; Bryan Paton; Nikki S Rickard
Journal:  Front Psychol       Date:  2017-12-04

6.  Shared acoustic codes underlie emotional communication in music and speech-Evidence from deep transfer learning.

Authors:  Eduardo Coutinho; Björn Schuller
Journal:  PLoS One       Date:  2017-06-28       Impact factor: 3.240

7.  [Review] Naturalistic Stimuli in Affective Neuroimaging: A Review.

Authors:  Heini Saarimäki
Journal:  Front Hum Neurosci       Date:  2021-06-17       Impact factor: 3.169

8.  Assessing musical abilities objectively: construction and validation of the profile of music perception skills.

Authors:  Lily N C Law; Marcel Zentner
Journal:  PLoS One       Date:  2012-12-28       Impact factor: 3.240

9.  Predicting musically induced emotions from physiological inputs: linear and neural network models.

Authors:  Frank A Russo; Naresh N Vempala; Gillian M Sandstrom
Journal:  Front Psychol       Date:  2013-08-08

10.  Identifying musical pieces from fMRI data using encoding and decoding models.

Authors:  Sebastian Hoefle; Annerose Engel; Rodrigo Basilio; Vinoo Alluri; Petri Toiviainen; Maurício Cagy; Jorge Moll
Journal:  Sci Rep       Date:  2018-02-02       Impact factor: 4.379

