| Literature DB >> 27280154 |
Philip A. Kragel, Kevin S. LaBar.
Abstract
Studies of human emotion perception have linked a distributed set of brain regions to the recognition of emotion in facial, vocal, and body expressions. In particular, lesions to somatosensory cortex in the right hemisphere have been shown to impair recognition of facial and vocal expressions of emotion. Although these findings suggest that somatosensory cortex represents body states associated with distinct emotions, such as a furrowed brow or gaping jaw, functional evidence directly linking somatosensory activity and subjective experience during emotion perception is critically lacking. Using functional magnetic resonance imaging and multivariate decoding techniques, we show that perceiving vocal and facial expressions of emotion yields hemodynamic activity in right somatosensory cortex that discriminates among emotion categories, exhibits somatotopic organization, and tracks self-reported sensory experience. The findings both support embodied accounts of emotion and provide mechanistic insight into how emotional expressions are capable of biasing subjective experience in those who perceive them.
Entities:
Keywords: embodied cognition; emotion; functional magnetic resonance imaging; perception; somatosensation
Mesh:
Substances:
Year: 2016 PMID: 27280154 PMCID: PMC4894916 DOI: 10.1523/ENEURO.0090-15.2016
Source DB: PubMed Journal: eNeuro ISSN: 2373-2822
Statistical table
| Label | Comparison | Assumed distribution | Statistical test | Achieved power |
| a | Classification accuracy: self-report; average vs chance | Binomial | Wilcoxon sign-rank test (against constant) | 0.9620045 |
| b | Classification accuracy: right postcentral gyrus; average vs chance | Binomial | Wilcoxon sign-rank test (against constant) | 0.9565211 |
| c | Classification accuracy: insula; average vs chance | Binomial | Wilcoxon sign-rank test (against constant) | 0.8235641 |
| d | Classification accuracy: medial OFC; average vs chance | Binomial | Wilcoxon sign-rank test (against constant) | 0.5337574 |
| e | Classification accuracy: inferior frontal operculum; average vs chance | Binomial | Wilcoxon sign-rank test (against constant) | 0.5378919 |
| f | Classification accuracy: fusiform gyrus; average vs chance | Binomial | Wilcoxon sign-rank test (against constant) | 0.7418888 |
| g | Classification accuracy (objective labels): amygdala; average vs chance | Binomial | Wilcoxon sign-rank test (against constant) | 0.3522203* |
| h | Classification accuracy (objective labels): posterior STS; average vs chance | Binomial | Wilcoxon sign-rank test (against constant) | 0.4092183* |
| i | Classification accuracy (objective labels): postcentral gyrus; facial vs vocal | Binomial | Wilcoxon sign-rank test (paired) | 0.0789112* |
| j | Classification accuracy (objective labels): insula; facial vs vocal | Binomial | Wilcoxon sign-rank test (paired) | 0.6379089* |
| k | Classification accuracy (objective labels): medial OFC; facial vs vocal | Binomial | Wilcoxon sign-rank test (paired) | 0.130852* |
| l | Classification accuracy (objective labels): inferior frontal operculum; facial vs vocal | Binomial | Wilcoxon sign-rank test (paired) | 0.7368259* |
| m | Classification accuracy (objective labels): fusiform gyrus; facial vs vocal | Binomial | Wilcoxon sign-rank test (paired) | 0.0578589* |
| n | Classification accuracy (objective labels): amygdala; facial vs vocal | Binomial | Wilcoxon sign-rank test (paired) | 0.0532535* |
| o | Classification accuracy (objective labels): posterior STS; facial vs vocal | Binomial | Wilcoxon sign-rank test (paired) | 0.0535096* |
| p | Classification accuracy (objective labels): right postcentral gyrus vs insula | Binomial | Wilcoxon sign-rank test (paired) | 0.1202871* |
| q | Classification accuracy (objective labels): right postcentral gyrus vs medial OFC | Binomial | Wilcoxon sign-rank test (paired) | 0.1961514* |
| r | Classification accuracy (objective labels): right postcentral gyrus vs IFO | Binomial | Wilcoxon sign-rank test (paired) | 0.4217805* |
| s | Classification accuracy (objective labels): right postcentral gyrus vs fusiform gyrus | Binomial | Wilcoxon sign-rank test (paired) | 0.5747844* |
| t | Classification accuracy (objective labels): right postcentral gyrus vs amygdala | Binomial | Wilcoxon sign-rank test (paired) | 0.6903970* |
| u | Classification accuracy (objective labels): right postcentral gyrus vs posterior STS | Binomial | Wilcoxon sign-rank test (paired) | 0.8736939* |
| v | Classification accuracy (objective labels): right vs left postcentral gyrus | Binomial | Wilcoxon sign-rank test (paired) | 0.6693379 |
| w | PLS regression coefficients (objective labels): upper vs lower face emotions | Normal | One-sample | 0.9999976 |
| x | PLS regression coefficients (objective labels): upper vs lower face emotions | Normal | One-sample | 1.0000000 |
| y | Classification accuracy (objective labels): self-report against right postcentral gyrus | Binomial | Correlation (Pearson) | 0.9092103 |
| z | Classification accuracy (subjective labels): right postcentral gyrus | Binomial | Wilcoxon sign-rank test (against constant) | 0.9974600 |
| aa | Classification accuracy: objective vs subjective labels, right postcentral gyrus | Binomial | Wilcoxon sign-rank test (paired) | 0.0500000 |
| bb | PLS regression coefficients: objective against subjective models | Binomial | One-sample | 0.9999025 |
| cc | Classification accuracy: left precentral gyrus; average vs chance | Binomial | Wilcoxon sign-rank test (against constant) | 0.0684199 |
| dd | Classification accuracy: self-report against left precentral gyrus | Binomial | Correlation (Pearson) | 0.3588166 |
| ee | Classification accuracy: postcentral gyrus; average vs chance | Binomial | Wilcoxon sign-rank test (against constant) | 0.9830124 |
| ff | Classification accuracy: self-report against right precentral gyrus | Binomial | Correlation (Pearson) | 0.2688632 |
Data are assumed to come from the stated distributions. For sign-rank tests, effect sizes are computed as r = z/√n, from which achieved power is calculated. *Effects that were not significant after correction for multiple comparisons and are not reported in the main text of the paper.
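The effect-size and power computation described above (r = z/√n, with achieved power derived from it) can be sketched as follows. This is a minimal illustration using the normal approximation; the exact alpha level and sidedness used by the authors are not stated here, so the default critical value below is an assumption:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def signrank_effect_size(z, n):
    """Effect size r = z / sqrt(n) for a Wilcoxon sign-rank test."""
    return z / math.sqrt(n)

def achieved_power(z, n, z_crit=1.959964):
    """Post hoc (achieved) power under a normal approximation: the
    observed standardized effect r*sqrt(n) (= the observed z) is treated
    as the true location of the test statistic. z_crit defaults to the
    two-sided alpha = 0.05 critical value."""
    return (1.0 - norm_cdf(z_crit - z)) + norm_cdf(-z_crit - z)

# Example with the right postcentral gyrus statistics from Figure 2
# (z = 3.21, n = 21): r is approximately 0.70.
r = signrank_effect_size(3.21, 21)
power = achieved_power(3.21, 21)
```

Note that the tabulated power values may differ from this sketch if the authors used a different alpha, a one-sided criterion, or exact rather than normal-approximation calculations.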
Figure 1. Experimental paradigm and behavioral results. Graphical depiction of a single trial: participants are first presented a facial or vocal expression of emotion, followed by a fixation cross and a response screen, on which they indicate their own emotional experience in response to the stimulus by moving a cursor. Scatterplot of behavioral responses for all participants, with each point corresponding to a single trial; axes reflect cursor positions along the horizontal and vertical dimensions of the screen, standardized within subjects. Parametric maps (one-sample t test, n = 21) of support vector machine decision values for each emotion category, showing which coordinates lead to the prediction of each emotion; cursors located in blue regions are evidence against the labeled category, whereas red regions are positively predictive. Confusion matrix for classification of self-report; the color bar indicates the proportion of trials (chance = 16.67%) from each emotion category (rows) assigned each label during classification (columns).
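The support vector machine decision-value analysis in this caption can be illustrated schematically. The sketch below uses scikit-learn on synthetic 2-D cursor coordinates; the classifier settings, cluster geometry, and trial counts are illustrative assumptions, not the authors' pipeline:

```python
import numpy as np
from sklearn.model_selection import cross_val_predict
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-in for standardized cursor positions: six emotion
# categories, each clustered around a distinct screen location.
angles = np.linspace(0, 2 * np.pi, 6, endpoint=False)
centers = np.column_stack([np.cos(angles), np.sin(angles)])
X = np.vstack([c + 0.25 * rng.standard_normal((40, 2)) for c in centers])
y = np.repeat(np.arange(6), 40)

# Linear SVM, cross-validated over trials; chance = 1/6 (16.67%).
clf = SVC(kernel="linear", decision_function_shape="ovr")
pred = cross_val_predict(clf, X, y, cv=5)
acc = (pred == y).mean()

# Per-category decision values, analogous to the parametric maps:
# positive values favor a category, negative values count against it.
dec = clf.fit(X, y).decision_function(X)  # shape (240, 6)
```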
Figure 2. Multivoxel pattern classification of BOLD response to facial and vocal expressions of emotion. ROIs rendered on the group mean anatomical image (n = 21). Patterns of response within right postcentral gyrus (z = 3.21, padj = 0.0047)b, insula (z = 2.66, padj = 0.0136)c, mOFC (z = 1.92, padj = 0.0384)d, IFO (z = 1.93, padj = 0.0384)e, and FG (z = 2.43, padj = 0.0175)f were classified at levels greater than chance (Wilcoxon sign-rank test). The dashed line reflects chance accuracy (16.67%); error bars reflect SEM. ACC = accuracy.
Figure 3. Emotion-predictive patterns are consistent with known somatotopy. Contrasts of classification weights reveal that the perception of expressions associated with lower portions of the face was predicted by greater activation in inferior regions of the postcentral gyrus. Solid lines demarcate the borders of BAs 3, 1, and 2; text overlays indicate hypothesized somatotopy from upper to lower regions of the face. The inset facial images convey the portions of the face that are diagnostic of each expression (adapted with permission from Smith et al., 2005). Contrasts of parameter estimates show that activation near the lateral sulcus selectively predicts expressions of happiness and surprise (lower-face emotions) relative to fear and anger (upper-face emotions). Error bars reflect 95% confidence intervals based on within-subject error (Cousineau, 2005). Mean confusion matrix depicts classifications based on somatosensory data (columns) against true class labels (rows). Higher values along the main diagonal illustrate above-chance performance (chance = 16.67%); confusions between happiness and surprise are consistent with somatotopic patterning driven by activity associated with lower portions of the face and mouth. Color bar indicates the proportion of predictions (rows sum to one).
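The row-normalized confusion matrix described in this caption (rows = true labels, columns = predicted labels, each row summing to one, diagonal = per-category accuracy) can be computed as, for example:

```python
import numpy as np

def confusion_proportions(true_labels, predicted_labels, n_classes=6):
    """Confusion matrix as proportions: entry (i, j) is the fraction of
    trials with true category i that were assigned label j, so each row
    sums to one and the main diagonal holds per-category accuracy
    (chance = 1/6, about 16.67%, for six emotion categories)."""
    counts = np.zeros((n_classes, n_classes))
    for t, p in zip(true_labels, predicted_labels):
        counts[t, p] += 1
    return counts / counts.sum(axis=1, keepdims=True)

# Toy example: two classes, one misclassified trial in class 0.
cm = confusion_proportions([0, 0, 1, 1], [0, 1, 1, 1], n_classes=2)
# cm is [[0.5, 0.5], [0.0, 1.0]]
```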
Figure 4. The information content of response patterns within right postcentral gyrus increases with the separability of self-report. Scatterplot depicts cross-validated estimates of accuracy across all emotion categories for classification of self-report and neural data, with each point corresponding to a single subject (n = 21). The solid black line indicates the best least-squares fit to the data; dashed lines reflect chance accuracy (16.67%). Histogram of the bootstrap distribution of Pearson's correlation coefficient, with dashed lines indicating the 95% confidence interval computed using the bias-corrected and accelerated percentile method. ACC = accuracy.
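A bias-corrected and accelerated (BCa) bootstrap interval for Pearson's r, as described in this caption, can be reproduced in outline with SciPy. The synthetic per-subject accuracies, resample count, and seed below are illustrative assumptions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic stand-in for per-subject (n = 21) classification accuracies:
# self-report vs neural (e.g., right postcentral gyrus) decoding.
acc_self = rng.uniform(0.2, 0.6, size=21)
acc_neural = 0.5 * acc_self + 0.05 * rng.standard_normal(21)

# Point estimate of the correlation between the two accuracy measures.
r = stats.pearsonr(acc_self, acc_neural)[0]

# Paired BCa bootstrap over subjects for Pearson's r.
res = stats.bootstrap(
    (acc_self, acc_neural),
    lambda x, y: stats.pearsonr(x, y)[0],
    paired=True, vectorized=False,
    n_resamples=2000, method="BCa", random_state=0,
)
lo, hi = res.confidence_interval  # 95% CI by default
```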
Correlations between neural and self-report classification accuracy
| ROI | r | p (corrected) | p (uncorrected) | AIC | ΔAIC | Akaike weight |
| Right postcentral gyrus | 0.593 | 0.041 | 0.005 | −21.020 | 0.000 | 0.833 |
| pSTS | 0.420 | 0.260 | 0.058 | −15.996 | 5.024 | 0.068 |
| mOFC | 0.219 | 0.510 | 0.340 | −12.944 | 8.076 | 0.015 |
| IFO | 0.257 | 0.510 | 0.261 | −13.347 | 7.673 | 0.018 |
| FG | 0.085 | 0.802 | 0.713 | −12.066 | 8.954 | 0.009 |
| Amygdala | 0.128 | 0.747 | 0.581 | −12.257 | 8.763 | 0.010 |
| Insula | 0.051 | 0.826 | 0.826 | −11.967 | 9.053 | 0.009 |
| Left precentral gyrus | 0.287 | 0.510 | 0.208 | −13.712 | 7.308 | 0.022 |
| Right precentral gyrus | 0.233 | 0.510 | 0.309 | −13.085 | 7.936 | 0.016 |
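The last two columns of this table are consistent with information-criterion differences and Akaike-style model weights, w_i = exp(−Δ_i/2) / Σ_j exp(−Δ_j/2); the identity of the criterion itself (e.g., AIC vs BIC) is an assumption here. A short sketch reproduces the weight column from the Δ column:

```python
import math

# Δ (information-criterion difference) column, in table row order.
deltas = [0.000, 5.024, 8.076, 7.673, 8.954, 8.763, 9.053, 7.308, 7.936]

def akaike_weights(deltas):
    """Akaike-style weights: w_i = exp(-d_i / 2) / sum_j exp(-d_j / 2).
    The best model (d = 0) receives the largest weight."""
    terms = [math.exp(-d / 2.0) for d in deltas]
    total = sum(terms)
    return [t / total for t in terms]

weights = [round(w, 3) for w in akaike_weights(deltas)]
# Matches the table's final column:
# [0.833, 0.068, 0.015, 0.018, 0.009, 0.010, 0.009, 0.022, 0.016]
```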