
Music-induced emotions can be predicted from a combination of brain activity and acoustic features.

Ian Daly, Duncan Williams, James Hallowell, Faustina Hwang, Alexis Kirke, Asad Malik, James Weaver, Eduardo Miranda, Slawomir J Nasuto.

Abstract

It is widely acknowledged that music can communicate and induce a wide range of emotions in the listener. However, music is a highly complex audio signal composed of a wide range of time- and frequency-varying components. Additionally, music-induced emotions are known to differ greatly between listeners. Therefore, it is not immediately clear what emotions a piece of music will induce in a given individual. We attempt to predict the music-induced emotional response of a listener by measuring the activity in the listener's electroencephalogram (EEG). We combine these measures with acoustic descriptors of the music, an approach that allows us to consider music as a complex set of time-varying acoustic features, independently of any specific music theory. Regression models are found which allow us to predict the music-induced emotions of our participants with a correlation between the actual and predicted responses of up to r = 0.234, p < 0.001. This regression fit suggests that over 20% of the variance of the participants' music-induced emotions can be predicted from their neural activity and the properties of the music. Given the large amount of noise, non-stationarity, and non-linearity in both EEG and music, this is an encouraging result. Additionally, combining measures of brain activity with acoustic features describing the music played to our participants allows us to predict music-induced emotions with significantly higher accuracy than either feature type alone (p < 0.01).
Copyright © 2015 Elsevier Inc. All rights reserved.
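
Editor's note: as a rough illustration of the approach described in the abstract, the sketch below concatenates EEG-derived and acoustic feature matrices into a single design matrix, fits a linear regression, and scores it by the Pearson correlation between actual and predicted emotion ratings. This is not the authors' pipeline; the feature counts, the synthetic data, and the use of ridge regression via scikit-learn are assumptions for illustration only.

```python
# Minimal sketch (illustrative only): predict a continuous emotion rating from the
# combination of EEG-derived and acoustic features with a linear regression, and
# evaluate the fit as the Pearson correlation between actual and predicted responses.
import numpy as np
from scipy.stats import pearsonr
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)

n_trials = 200           # hypothetical number of music-listening trials
n_eeg_features = 32      # e.g. band-power values per electrode (placeholder)
n_acoustic_features = 8  # e.g. tempo, spectral centroid, RMS energy (placeholder)

# Synthetic stand-ins for the real measurements.
eeg_feats = rng.standard_normal((n_trials, n_eeg_features))
acoustic_feats = rng.standard_normal((n_trials, n_acoustic_features))
felt_emotion = rng.standard_normal(n_trials)  # e.g. self-reported valence per trial

# Combine both feature types into a single design matrix.
X = np.hstack([eeg_feats, acoustic_feats])

# Cross-validated predictions from a regularised linear regression.
predicted = cross_val_predict(Ridge(alpha=1.0), X, felt_emotion, cv=5)

# Correlation between actual and predicted responses (the paper reports r up to 0.234).
r, p = pearsonr(felt_emotion, predicted)
print(f"r = {r:.3f}, p = {p:.3g}")
```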


Keywords:  Acoustic features; Affective state prediction; EEG; Machine learning; Music


Year:  2015        PMID: 26544602     DOI: 10.1016/j.bandc.2015.08.003

Source DB:  PubMed          Journal:  Brain Cogn        ISSN: 0278-2626            Impact factor:   2.310


Related articles: 5 in total

1.  High-Order Areas and Auditory Cortex Both Represent the High-Level Event Structure of Music.

Authors:  Jamal A Williams; Elizabeth H Margulis; Samuel A Nastase; Janice Chen; Uri Hasson; Kenneth A Norman; Christopher Baldassano
Journal:  J Cogn Neurosci       Date:  2022-03-05       Impact factor: 3.420

2.  A Systematic Review for Human EEG Brain Signals Based Emotion Classification, Feature Extraction, Brain Condition, Group Comparison. [Review]

Authors:  Mohamed Hamada; B B Zaidan; A A Zaidan
Journal:  J Med Syst       Date:  2018-07-24       Impact factor: 4.460

3.  Exploring Frequency-Dependent Brain Networks from Ongoing EEG Using Spatial ICA During Music Listening.

Authors:  Yongjie Zhu; Chi Zhang; Hanna Poikonen; Petri Toiviainen; Minna Huotilainen; Klaus Mathiak; Tapani Ristaniemi; Fengyu Cong
Journal:  Brain Topogr       Date:  2020-03-02       Impact factor: 3.020

4.  Neural and physiological data from participants listening to affective music.

Authors:  Ian Daly; Nicoletta Nicolaou; Duncan Williams; Faustina Hwang; Alexis Kirke; Eduardo Miranda; Slawomir J Nasuto
Journal:  Sci Data       Date:  2020-06-15       Impact factor: 6.444

5.  Dual-Threshold-Based Microstate Analysis on Characterizing Temporal Dynamics of Affective Process and Emotion Recognition From EEG Signals.

Authors:  Jing Chen; Haifeng Li; Lin Ma; Hongjian Bo; Frank Soong; Yaohui Shi
Journal:  Front Neurosci       Date:  2021-07-14       Impact factor: 4.677

