
Recognizing emotions in spoken language: a validated set of Portuguese sentences and pseudosentences for research on emotional prosody.

São Luís Castro, César F Lima.

Abstract

A set of semantically neutral sentences and derived pseudosentences was produced by two native European Portuguese speakers varying emotional prosody in order to portray anger, disgust, fear, happiness, sadness, surprise, and neutrality. Accuracy rates and reaction times in a forced-choice identification of these emotions as well as intensity judgments were collected from 80 participants, and a database was constructed with the utterances reaching satisfactory accuracy (190 sentences and 178 pseudosentences). High accuracy (mean correct of 75% for sentences and 71% for pseudosentences), rapid recognition, and high-intensity judgments were obtained for all the portrayed emotional qualities. Sentences and pseudosentences elicited similar accuracy and intensity rates, but participants responded to pseudosentences faster than they did to sentences. This database is a useful tool for research on emotional prosody, including cross-language studies and studies involving Portuguese-speaking participants, and it may be useful for clinical purposes in the assessment of brain-damaged patients. The database is available for download from http://brm.psychonomic-journals.org/content/supplemental.

Year:  2010        PMID: 20160287     DOI: 10.3758/BRM.42.1.74

Source DB:  PubMed          Journal:  Behav Res Methods        ISSN: 1554-351X


Related articles: 7 in total

1.  Near-infrared spectroscopy reveals neural perception of vocal emotions in human neonates.

Authors:  Dandan Zhang; Yu Chen; Xinlin Hou; Yan Jing Wu
Journal:  Hum Brain Mapp       Date:  2019-01-30       Impact factor: 5.038

2.  The Mandarin Chinese auditory emotions stimulus database: A validated set of Chinese pseudo-sentences.

Authors:  Bingyan Gong; Na Li; Qiuhong Li; Xinyuan Yan; Jing Chen; Liang Li; Xihong Wu; Chao Wu
Journal:  Behav Res Methods       Date:  2022-05-31

3.  The Nencki Affective Picture System (NAPS): introduction to a novel, standardized, wide-range, high-quality, realistic picture database.

Authors:  Artur Marchewka; Łukasz Zurawski; Katarzyna Jednoróg; Anna Grabowska
Journal:  Behav Res Methods       Date:  2014-06

4.  How Psychological Stress Affects Emotional Prosody.

Authors:  Silke Paulmann; Desire Furnes; Anne Ming Bøkenes; Philip J Cozzolino
Journal:  PLoS One       Date:  2016-11-01       Impact factor: 3.240

5.  Mapping language with resting-state functional magnetic resonance imaging: A study on the functional profile of the language network.

Authors:  Paulo Branco; Daniela Seixas; São L Castro
Journal:  Hum Brain Mapp       Date:  2019-10-14       Impact factor: 5.038

6.  The Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS): A dynamic, multimodal set of facial and vocal expressions in North American English.

Authors:  Steven R Livingstone; Frank A Russo
Journal:  PLoS One       Date:  2018-05-16       Impact factor: 3.240

7.  Hearing Aids Benefit Recognition of Words in Emotional Speech but Not Emotion Identification.

Authors:  Huiwen Goy; M Kathleen Pichora-Fuller; Gurjit Singh; Frank A Russo
Journal:  Trends Hear       Date:  2018 Jan-Dec       Impact factor: 3.293
