Annett Schirmer, Maria Wijaya, Esther Wu, Trevor B. Penney.
Abstract
This pre-registered event-related potential study explored how vocal emotions shape visual perception as a function of attention and listener sex. Visual task displays occurred in silence or with a neutral or an angry voice. Voices were task-irrelevant in a single-task block, but had to be categorized by speaker sex in a dual-task block. In the single task, angry voices increased the occipital N2 component relative to neutral voices in women, but not men. In the dual task, angry voices relative to neutral voices increased occipital N1 and N2 components, as well as accuracy, in women and marginally decreased accuracy in men. Thus, in women, vocal anger produced a strong, multifaceted visual enhancement comprising attention-dependent and attention-independent processes, whereas in men, it produced a small, behavior-focused visual processing impairment that was strictly attention-dependent. In sum, these data indicate that attention and listener sex critically modulate whether and how vocal emotions shape visual perception.
Keywords: ERP; emotion; sex differences; visual attention; vocal affect
Year: 2019 PMID: 31216037 PMCID: PMC6778830 DOI: 10.1093/scan/nsz044
Source DB: PubMed Journal: Soc Cogn Affect Neurosci ISSN: 1749-5016 Impact factor: 3.436
Fig. 1. Research paradigm.
Fig. 2. Behavioral results. Mean d′ scores and reaction times are shown as a function of task, sound and sex. Error bars reflect the within-subject standard error.
Fig. 3. ERP traces and maps. Mean ERP voltages were derived by separately averaging signals for left occipital electrodes (PO7, PO5 and O1), right occipital electrodes (PO8, PO6 and O2) and the voltage difference between contra- and ipsi-lateral occipital electrodes. Time windows for statistical analysis are marked by the shaded areas. Maps illustrate the mean voltages and condition differences for the statistical analysis windows.
Fig. 4. ERP mean amplitudes. Mean voltages in the N1 and N2 analysis windows are shown as a function of task, sound, sex and region. Error bars reflect the within-subject standard error.