
Major and minor music compared to excited and subdued speech.

Daniel L Bowling, Kamraan Gill, Jonathan D Choi, Joseph Prinz, Dale Purves.

Abstract

The affective impact of music arises from a variety of factors, including intensity, tempo, rhythm, and tonal relationships. The emotional coloring evoked by intensity, tempo, and rhythm appears to arise from association with the characteristics of human behavior in the corresponding condition; however, how and why particular tonal relationships in music convey distinct emotional effects is not clear. The hypothesis examined here is that major and minor tone collections elicit different affective reactions because their spectra are similar to the spectra of voiced speech uttered in different emotional states. To evaluate this possibility, the spectra of the intervals that distinguish major and minor music were compared to the spectra of voiced segments in excited and subdued speech, using fundamental frequency and frequency ratios as measures. Consistent with the hypothesis, the spectra of major intervals are more similar to spectra found in excited speech, whereas the spectra of particular minor intervals are more similar to the spectra of subdued speech. These results suggest that the characteristic affective impact of major and minor tone collections arises from associations routinely made between particular musical intervals and voiced speech.

Year:  2010        PMID: 20058994     DOI: 10.1121/1.3268504

Source DB:  PubMed          Journal:  J Acoust Soc Am        ISSN: 0001-4966            Impact factor:   1.840


Related articles (15 in total)

1.  Enhanced brainstem encoding predicts musicians' perceptual advantages with pitch.

Authors:  Gavin M Bidelman; Ananthanarayan Krishnan; Jackson T Gandour
Journal:  Eur J Neurosci       Date:  2010-12-29       Impact factor: 3.386

2.  Musicians and tone-language speakers share enhanced brainstem encoding but not perceptual benefits for musical pitch.

Authors:  Gavin M Bidelman; Jackson T Gandour; Ananthanarayan Krishnan
Journal:  Brain Cogn       Date:  2011-08-10       Impact factor: 2.310

3.  Young Infants Match Facial and Vocal Emotional Expressions of Other Infants.

Authors:  Mariana Vaillant-Molina; Lorraine E Bahrick; Ross Flom
Journal:  Infancy       Date:  2013-08-01

4.  Human emotions track changes in the acoustic environment.

Authors:  Weiyi Ma; William Forde Thompson
Journal:  Proc Natl Acad Sci U S A       Date:  2015-11-09       Impact factor: 11.205

5.  Animal Pitch Perception: Melodies and Harmonies.

Authors:  Marisa Hoeschele
Journal:  Comp Cogn Behav Rev       Date:  2017

6.  Co-variation of tonality in the music and speech of different cultures.

Authors:  Shui'er Han; Janani Sundararajan; Daniel Liu Bowling; Jessica Lake; Dale Purves
Journal:  PLoS One       Date:  2011-05-27       Impact factor: 3.240

7.  Expression of emotion in Eastern and Western music mirrors vocalization.

Authors:  Daniel Liu Bowling; Janani Sundararajan; Shui'er Han; Dale Purves
Journal:  PLoS One       Date:  2012-03-14       Impact factor: 3.240

8.  Minor second intervals: A shared signature for infant cries and sadness in music.

Authors:  Gabriele Zeloni; Francesco Pavani
Journal:  Iperception       Date:  2022-04-18

9.  Emotional communication in speech and music: the role of melodic and rhythmic contrasts.

Authors:  Lena Quinto; William Forde Thompson; Felicity Louise Keating
Journal:  Front Psychol       Date:  2013-04-24

10.  A vocal basis for the affective character of musical mode in melody.

Authors:  Daniel L Bowling
Journal:  Front Psychol       Date:  2013-07-31
