| Literature DB >> 31736683 |
Yin Liang1, Baolin Liu2,3,4, Junzhong Ji1, Xianglin Li5.
Abstract
Emotions can be perceived from both facial and bodily expressions. Our previous study found that facial expressions could be successfully decoded from functional connectivity (FC) patterns. However, the role of FC patterns in the recognition of bodily expressions remained unclear, and no neuroimaging studies had adequately addressed whether emotions perceived from facial and bodily expressions are processed by common or distinct neural networks. To address this, the present study collected functional magnetic resonance imaging (fMRI) data in a block-design experiment with facial and bodily expression videos as stimuli (three emotions: anger, fear, and joy), and conducted multivariate pattern classification analysis based on the estimated FC patterns. We found that, in addition to facial expressions, bodily expressions could also be successfully decoded from large-scale FC patterns. Emotion classification accuracies for facial expressions were higher than those for bodily expressions. Further analysis of the contributive FCs showed that emotion-discriminative networks were widely distributed across both hemispheres, containing regions that ranged from primary visual areas to higher-level cognitive areas. Moreover, for a given emotion, the discriminative FCs for facial and bodily expressions were distinct. Together, our findings highlight the key role of FC patterns in emotion processing, indicate how large-scale FC patterns reconfigure during the processing of facial and bodily expressions, and suggest a distributed neural representation for emotion recognition. Furthermore, our results suggest that the human brain employs separate network representations for facial and bodily expressions of the same emotion. This study provides new evidence for network representations underlying emotion perception and may further our understanding of the potential mechanisms of body language emotion recognition.
Keywords: bodily expressions; facial expressions; functional connectivity; functional magnetic resonance imaging; multivariate pattern classification
Year: 2019 PMID: 31736683 PMCID: PMC6828617 DOI: 10.3389/fnins.2019.01111
Source DB: PubMed Journal: Front Neurosci ISSN: 1662-453X Impact factor: 4.677
FIGURE 1. Exemplar stimuli and schematic representation of the experimental paradigm. (A) Exemplar facial and bodily expression stimuli. All emotion stimuli were taken from the GEMEP database. Videos of faces and bodies displaying three emotions (anger, fear, and joy) were used in the experiment. (B) Paradigm of the experimental design. A fixation cross was presented for 10 s before each block, after which eight emotion stimuli appeared. Subsequently, the participants completed a button-press task to indicate which emotion category they had seen in the previous block.
FIGURE 2. Flowchart of the experiment and data-analysis procedure. (A) Experiment and fMRI data acquisition. (B) Brainnetome atlas used to define network nodes. (C) Overview of the fcMVPA framework. FC patterns were estimated using the CONN toolbox. Before FC computation, the BOLD time series were denoised to remove unwanted motion, physiological, and other artifactual effects. The whole-brain FC patterns for each emotion were then constructed using ROI-to-ROI analysis. Emotion classification was performed in a leave-one-subject-out cross-validation scheme with an SVM classifier. Multi-category and pairwise emotion classifications were conducted for the facial and bodily expressions. Emotion-preferring networks were constructed based on the discriminative FCs.
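To make the fcMVPA steps in panel (C) concrete, the sketch below shows the general idea in scikit-learn; it is a minimal illustration, not the authors' code. It assumes denoised per-block ROI time series are already available (here a hypothetical NumPy array `roi_timeseries` of shape blocks × timepoints × ROIs), builds ROI-to-ROI correlation matrices as FC features, and classifies emotions with a linear SVM under leave-one-subject-out cross-validation. The array shapes, the 20-subject/6-block layout, the Fisher z-transform, and the linear kernel are all illustrative assumptions.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

def fc_features(roi_timeseries):
    """Turn per-block ROI time series (blocks x timepoints x ROIs) into
    vectorized upper-triangle ROI-to-ROI correlation matrices."""
    n_blocks, _, n_rois = roi_timeseries.shape
    iu = np.triu_indices(n_rois, k=1)           # unique ROI pairs
    feats = np.empty((n_blocks, iu[0].size))
    for b in range(n_blocks):
        fc = np.corrcoef(roi_timeseries[b].T)   # ROIs x ROIs correlation matrix
        feats[b] = np.arctanh(fc[iu])           # Fisher z-transform of FC values
    return feats

# --- hypothetical inputs (shapes only; real data come from preprocessing) ---
rng = np.random.default_rng(0)
roi_timeseries = rng.standard_normal((120, 50, 246))  # 246 Brainnetome ROIs
labels = rng.integers(0, 3, size=120)                 # anger / fear / joy
subjects = np.repeat(np.arange(20), 6)                # 20 subjects, 6 blocks each

X = fc_features(roi_timeseries)

# Leave-one-subject-out cross-validation with a linear SVM classifier
logo = LeaveOneGroupOut()
clf = SVC(kernel="linear", C=1.0)
scores = cross_val_score(clf, X, labels, groups=subjects, cv=logo)
print(f"mean decoding accuracy: {scores.mean():.3f}")
```

Grouping the folds by subject ensures that blocks from the same participant never appear in both training and test sets, which is the point of the leave-one-subject-out scheme described in the caption above.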
Behavioral accuracies (%) and reaction times (ms) for facial and bodily expressions [mean and standard deviation (SD)].

| Condition | Emotion | Accuracy (%) | SD | RT (ms) | SD |
|---|---|---|---|---|---|
| Facial expressions | Joy | 100 | 0 | 713.74 | 163.39 |
| Facial expressions | Anger | 97.5 | 6.11 | 808.22 | 235.30 |
| Facial expressions | Fear | 96.67 | 6.84 | 809.54 | 234.73 |
| Facial expressions | Total | 98.06 | 3.73 | 777.17 | 200.75 |
| Bodily expressions | Joy | 97.5 | 6.11 | 669.36 | 160.87 |
| Bodily expressions | Anger | 97.5 | 6.11 | 762.83 | 227.69 |
| Bodily expressions | Fear | 96.67 | 6.84 | 825.16 | 220.20 |
| Bodily expressions | Total | 97.22 | 4.60 | 752.45 | 188.85 |
Head motion parameters for different emotion categories (mean and SD).
| Facial expressions | Joy | 0.15 (0.12) | 0.06 (0.03) | 0.27 (0.18) | 0.25 (0.14) | 0.15 (0.09) | 0.13 (0.10) |
| Facial expressions | Anger | 0.13 (0.09) | 0.06 (0.02) | 0.23 (0.16) | 0.24 (0.16) | 0.13 (0.07) | 0.11 (0.07) |
| Facial expressions | Fear | 0.16 (0.11) | 0.07 (0.03) | 0.29 (0.21) | 0.29 (0.19) | 0.16 (0.09) | 0.14 (0.11) |
| Bodily expressions | Joy | 0.16 (0.11) | 0.06 (0.03) | 0.27 (0.18) | 0.26 (0.14) | 0.15 (0.08) | 0.14 (0.10) |
| Bodily expressions | Anger | 0.16 (0.12) | 0.07 (0.03) | 0.26 (0.15) | 0.25 (0.14) | 0.15 (0.08) | 0.14 (0.10) |
| Bodily expressions | Fear | 0.14 (0.10) | 0.06 (0.03) | 0.23 (0.17) | 0.24 (0.15) | 0.13 (0.07) | 0.12 (0.09) |
Accuracies of decoding facial and bodily expressions using fcMVPA.
| Multi-category | 56.67%* | 53.33%* | 43.33%* | 46.67%* |
| Anger–Fear | 70%* | 72.5%* | 62.5%* | 72.5%* |
| Anger–Joy | 60%* | 60%* | 52.5% | 70%* |
| Fear–Joy | 77.5%* | 80%* | 70%* | 75%* |
FIGURE 3. Changes in multi-category classification accuracy for facial and bodily expressions as a function of the number of FC features used.
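The sweep over feature counts summarized in Figure 3 can be approximated by nesting a feature-selection step inside the cross-validation loop. The sketch below is a hypothetical illustration under assumptions: it ranks FC features by ANOVA F-score with `SelectKBest`, whereas the paper's exact selection criterion is not given in this record, and the feature counts, data shapes, and variable names are invented for demonstration.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Hypothetical FC feature matrix: blocks x FC features (ROI-pair correlations),
# with emotion labels and subject IDs analogous to the earlier sketch.
rng = np.random.default_rng(1)
X = rng.standard_normal((120, 30135))          # e.g. 246*245/2 ROI pairs
y = rng.integers(0, 3, size=120)               # anger / fear / joy
subjects = np.repeat(np.arange(20), 6)

logo = LeaveOneGroupOut()
accuracies = {}
for k in (50, 100, 200, 500, 1000, 2000):      # number of FC features to keep
    # Select the k most discriminative FCs (ANOVA F-score) inside each CV fold,
    # so that feature selection never sees the left-out subject's data.
    pipe = make_pipeline(SelectKBest(f_classif, k=k), SVC(kernel="linear"))
    scores = cross_val_score(pipe, X, y, groups=subjects, cv=logo)
    accuracies[k] = scores.mean()
    print(f"{k:>5} FC features: accuracy = {scores.mean():.3f}")
```

Placing the selection step inside the pipeline matters: choosing the top-k FCs on the full dataset before cross-validation would leak information from the left-out subject and inflate the resulting accuracy curve.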
FIGURE 4. Most discriminative FCs for pairwise facial expression classifications. Results are mapped onto the cortical surfaces using BrainNet Viewer. Node coordinates follow the Brainnetome atlas, and brain regions are scaled by the number of their connections. The connectogram was created using Circos. Different colors indicate different modules (the frontal, temporal, parietal, insular, limbic, and occipital lobes as well as the subcortical nuclei) according to the Brainnetome atlas. Intra-module connections are drawn in the same color as their module, while inter-module connections are drawn in gray.
FIGURE 5. Most discriminative FCs for pairwise bodily expression classifications. Results are mapped onto the cortical surfaces using BrainNet Viewer. Node coordinates follow the Brainnetome atlas, and brain regions are scaled by the number of their connections. The connectogram was created using Circos. Different colors indicate different modules (the frontal, temporal, parietal, insular, limbic, and occipital lobes as well as the subcortical nuclei) according to the Brainnetome atlas. Intra-module connections are drawn in the same color as their module, while inter-module connections are drawn in gray.
Discriminative FCs for the facial and bodily expressions. Each row lists one FC by the coordinates and Brainnetome atlas labels of its two connected regions.

| No. | x | y | z | Region 1 | x | y | z | Region 2 |
|---|---|---|---|---|---|---|---|---|
| 1 | 48 | –70 | –1 | R Lateral Occipital Cortex | 58 | –16 | –10 | R Middle Temporal Gyrus |
| 2 | 8 | 58 | 13 | R Superior Frontal Gyrus | –4 | –23 | 61 | L Paracentral Lobule |
| 3 | 48 | 35 | 13 | R Inferior Frontal Gyrus | –51 | –33 | 42 | L Inferior Parietal Lobule |
| 4 | –6 | –5 | 58 | L Superior Frontal Gyrus | –18 | 24 | 53 | L Superior Frontal Gyrus |
| 1 | –7 | –23 | 41 | L Cingulate Gyrus | –41 | 41 | 16 | L Middle Frontal Gyrus |
| 2 | 6 | –20 | 40 | R Cingulate Gyrus | –36 | –20 | 10 | L Insular Gyrus |
| 3 | –11 | –82 | –11 | L MedioVentral Occipital Cortex | –27 | –59 | 54 | L Superior Parietal Lobule |
| 4 | –28 | –30 | –10 | L Hippocampus | –25 | –25 | –26 | L Parahippocampal Gyrus |
| 1 | –27 | –4 | –20 | L Amygdala | –33 | –16 | –32 | L Fusiform Gyrus |
| 2 | 29 | –75 | 36 | R Lateral Occipital Cortex | 31 | –54 | 53 | R Superior Parietal Lobule |
| 3 | 57 | –40 | 12 | R Posterior Superior Temporal Sulcus | 7 | –76 | 11 | R MedioVentral Occipital Cortex |
| 4 | 57 | –40 | 12 | R Posterior Superior Temporal Sulcus | –5 | –81 | 10 | L MedioVentral Occipital Cortex |
| 5 | –62 | –33 | 7 | L Superior Temporal Gyrus | –52 | –32 | 12 | L Superior Temporal Gyrus |
| 6 | 54 | 24 | 12 | R Inferior Frontal Gyrus | 48 | 35 | 13 | R Inferior Frontal Gyrus |
| 7 | 54 | 24 | 12 | R Inferior Frontal Gyrus | 45 | 16 | 25 | R Inferior Frontal Gyrus |
| 1 | –54 | –40 | 4 | L Posterior Superior Temporal Sulcus | 42 | 22 | 3 | R Inferior Frontal Gyrus |
| 2 | –33 | –47 | 50 | L Superior Parietal Lobule | –28 | 56 | 12 | L Middle Frontal Gyrus |
| 3 | –27 | –59 | 54 | L Superior Parietal Lobule | –49 | 36 | –3 | L Inferior Frontal Gyrus |
| 4 | –16 | –24 | 6 | L Thalamus | –18 | –23 | 4 | L Thalamus |
| 1 | 19 | –2 | –19 | R Amygdala | 9 | 20 | –19 | R Orbital Gyrus |
| 2 | 7 | –76 | 11 | R MedioVentral Occipital Cortex | 10 | –85 | –9 | R MedioVentral Occipital Cortex |
| 3 | –15 | –71 | 52 | L Superior Parietal Lobule | 42 | 44 | 14 | R Middle Frontal Gyrus |
| 4 | –22 | –47 | 65 | L Superior Parietal Lobule | –16 | –60 | 63 | L Superior Parietal Lobule |
| 1 | 34 | 8 | 54 | R Middle Frontal Gyrus | 20 | 4 | 64 | R Superior Frontal Gyrus |
| 2 | 51 | –4 | –1 | R Superior Temporal Gyrus | 56 | –10 | 15 | R Postcentral Gyrus |
| 3 | –18 | –99 | 2 | L Lateral Occipital Cortex | –6 | –94 | 1 | L MedioVentral Occipital Cortex |
| 4 | –18 | –99 | 2 | L Lateral Occipital Cortex | –46 | –74 | 3 | L Lateral Occipital Cortex |
| 5 | 22 | –97 | 4 | R Lateral Occipital Cortex | –6 | –94 | 1 | L MedioVentral Occipital Cortex |
| 6 | 29 | –75 | 36 | R Lateral Occipital Cortex | –27 | –59 | 54 | L Superior Parietal Lobule |
| 7 | –30 | –88 | –12 | L Lateral Occipital Cortex | –46 | –74 | 3 | L Lateral Occipital Cortex |
| 8 | –30 | –88 | –12 | L Lateral Occipital Cortex | –6 | –94 | 1 | L MedioVentral Occipital Cortex |