
Modeling emotional content of music using system identification.

Mark D Korhonen, David A Clausi, M Ed Jernigan.

Abstract

Research was conducted to develop a methodology to model the emotional content of music as a function of time and musical features. Emotion is quantified using the dimensions valence and arousal, and system-identification techniques are used to create the models. Results demonstrate that system identification provides a means to generalize the emotional content for a genre of music. The average R2 statistic of a valid linear model structure is 21.9% for valence and 78.4% for arousal. The proposed method of constructing models of emotional content generalizes previous time-series models and removes ambiguity from classifiers of emotion.
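The abstract's approach of fitting a linear model to emotion time series and scoring it with R² can be sketched as follows. This is a minimal illustration using synthetic data and NumPy only; the feature names, data, and the static (no autoregressive terms) linear structure are assumptions for illustration, not the paper's actual feature set or identified model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: per-frame musical features (e.g. loudness,
# tempo, spectral centroid -- hypothetical choices, not the paper's set)
# and a corresponding arousal time series.
T = 200
features = rng.normal(size=(T, 3))
true_w = np.array([0.8, -0.3, 0.5])       # arbitrary ground-truth weights
arousal = features @ true_w + 0.1 * rng.normal(size=T)

# Least-squares fit of a static linear model -- one of the simpler
# structures a system-identification procedure would consider.
X = np.column_stack([features, np.ones(T)])  # append intercept column
w, *_ = np.linalg.lstsq(X, arousal, rcond=None)

# R^2 statistic, the goodness-of-fit measure the abstract reports.
pred = X @ w
ss_res = np.sum((arousal - pred) ** 2)
ss_tot = np.sum((arousal - arousal.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(f"R^2 = {r2:.3f}")
```

In the paper's setting the model would also carry dynamics (e.g. lagged feature terms), which is what distinguishes system identification from the static regression shown here.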


Year:  2006        PMID: 16761812     DOI: 10.1109/tsmcb.2005.862491

Source DB:  PubMed          Journal:  IEEE Trans Syst Man Cybern B Cybern        ISSN: 1083-4419


Related articles: 2 in total

1.  Variability in prefrontal hemodynamic response during exposure to repeated self-selected music excerpts, a near-infrared spectroscopy study.

Authors:  Saba Moghimi; Larissa Schudlo; Tom Chau; Anne-Marie Guerguerian
Journal:  PLoS One       Date:  2015-04-02       Impact factor: 3.240

2.  Developing a benchmark for emotional analysis of music.

Authors:  Anna Aljanaki; Yi-Hsuan Yang; Mohammad Soleymani
Journal:  PLoS One       Date:  2017-03-10       Impact factor: 3.240

