Literature DB >> 27672176

Neural substrates of the ability to recognize facial expressions: a voxel-based morphometry study.

Shota Uono1, Wataru Sato1, Takanori Kochiyama2, Reiko Sawada1,3, Yasutaka Kubota4, Sayaka Yoshimura1, Motomi Toichi1,3.   

Abstract

The recognition of facial expressions of emotion is adaptive for human social interaction, but both the ability to do this and the manner in which it is achieved differ among individuals. Previous functional neuroimaging studies have demonstrated that some brain regions, such as the inferior frontal gyrus (IFG), are active during the response to emotional facial expressions in healthy participants, and lesion studies have demonstrated that damage to these structures impairs the recognition of facial expressions. However, it remains to be established whether individual differences in the structure of these regions are associated with differences in the ability to recognize facial expressions. We investigated this issue by acquiring structural magnetic resonance images and assessing the performance of healthy adults with respect to the recognition of the facial expressions of six basic emotions. The gray matter volume of the right IFG correlated positively with the total accuracy of facial expression recognition. This suggests that individual differences in the ability to recognize facial expressions are associated with differences in the structure of the right IFG. Furthermore, mirror neuron activity in the IFG may be important for establishing efficient facial mimicry to facilitate emotion recognition.
© The Author (2016). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

Keywords:  cerebellum; facial expression recognition; inferior frontal gyrus; superior temporal gyrus; voxel-based morphometry

Year:  2017        PMID: 27672176      PMCID: PMC5390731          DOI: 10.1093/scan/nsw142

Source DB:  PubMed          Journal:  Soc Cogn Affect Neurosci        ISSN: 1749-5016            Impact factor:   3.436


Introduction

Facial expressions are indispensable for effective social interactions. Expressions make it possible to understand another’s emotional state, communicate intention (Keltner and Kring, 1998) and trigger appropriate behavior (Blair, 2003). Consistent with this adaptive function, previous studies have demonstrated that the ability to recognize facial expressions accurately has a positive effect on functional outcomes, such as social adjustment (e.g. Edwards ) and mental health (e.g. Carton ). Accordingly, people with impaired emotion recognition, such as individuals with autism spectrum disorder (ASD) and schizophrenia, have marked difficulties in social interaction (Mancuso ; Uono , 2013). These findings have drawn attention to the need to understand the neurocognitive mechanisms underlying the ability to recognize facial expressions. Some evidence suggests that the large individual differences in the ability to recognize facial expressions, which are stable over time, relate to differences in the underlying brain structures. Several studies have reported that individuals with ASD and schizophrenia show marked difficulties recognizing facial expressions compared with normal participants (Kohler ; Uljarevic and Hamilton, 2013; Kret and Ploeger, 2015) and have genetically inherited atypical brain structures (Crespi and Badcock, 2008 for a review). Many studies have noted gender differences in the ability to recognize facial expressions (Kret and de Gelder, 2012) and in the brain structures involved in social behavior (Ruigrok ). A recent study revealed that individual differences in expression recognition may be related to certain genotypes involved in the development of brain structures (Lin ). Given that there is wide individual variation in facial expression processing even in the typical population (Palermo ), there may also be corresponding variation in brain structure within this population.
A number of neuroimaging studies that used functional magnetic resonance imaging (fMRI) and positron emission tomography have provided clues as to which brain regions are crucial for recognizing facial expressions. Two recent meta-analyses indicated that the processing of emotional facial expressions involves prefrontal regions such as the inferior frontal gyrus (IFG); the superior temporal sulcus (STS) region (Allison ), including the posterior STS and the adjacent middle temporal gyrus (MTG) and superior temporal gyrus (STG); and the amygdala (Fusar-Poli ; Sabatinelli ). These regions may be involved in the understanding of another’s intentions through matching the visual representation of the other’s action with one’s own motor representations of the action (IFG: Rizzolatti ; Gallese ), the visual analysis of changeable aspects of faces such as expressions (the STS region: Haxby ), and the extraction of emotional information from stimuli (amygdala: Calder ). The recognition of another’s facial expressions could thus be implemented by motor, visual and emotional processing in the IFG, the STS region and the amygdala. Previous studies have revealed that damage or disruption to these brain regions impairs the recognition of emotional facial expressions overall. A study showing anatomical double dissociations between cognitive and emotional empathy demonstrated that damage to the right IFG decreases the ability to recognize emotional expressions from the region around the eyes (Shamay-Tsoory ). Lesion symptom mapping studies have reported that lesions in the bilateral frontal and temporal cortex, including the IFG and STS region as well as other regions, contribute to understanding others’ facial expressions (Adolphs ; Dal Monte ). A recent transcranial magnetic stimulation (TMS) study also showed that a transient disruption of the STS region impairs facial expression recognition but not identity recognition (Pitcher, 2014).
Other studies have revealed that damage to specific brain regions impairs the ability to recognize specific emotions, such as the impaired recognition of certain negative facial expressions, especially fear, in patients with amygdala damage (e.g. Adolphs , 1999). Taken together, these studies have demonstrated that some brain regions are active in the response to emotional facial expressions, and that damage to these structures impairs this ability. However, it remains to be proven whether individual differences in the structure of these regions could be associated with differences in recognition ability in healthy participants. Research on potential associations between expression recognition and brain structure will provide insight into the roles that the identified regions and associated cognitive processes play in determining individual differences in the ability to recognize facial expressions in typical and atypical individuals. We investigated this issue by conducting voxel-based morphometry (VBM) on structural MRI data and assessing the performance of healthy adults with respect to recognition of the facial expressions of six basic emotions. VBM studies in healthy individuals have demonstrated that individual differences in task performance are reflected in the volumes of specific brain regions (e.g. Carlson ; Takeuchi ; Gilaie-Dotan ). This approach enables the investigation of the relationship between task performance and structures throughout the entire brain; the areas under investigation are not restricted to those activated under a specific task or those damaged in patients. The ability to recognize facial expressions was measured using a label-matching paradigm featuring six basic emotions. This paradigm has previously revealed differences in the ability to recognize facial expressions between typical and clinical populations (e.g. Uono , 2013; Sato ; Okada ).
Based on previous functional neuroimaging and lesion studies, we hypothesized that the gray and white matter volume of the IFG, the STS region and the amygdala would correlate with the ability to recognize facial expressions.

Materials and methods

Participants

Fifty healthy young Japanese adults participated in this study (24 females and 26 males; M ± s.d. age, 22.4 ± 4.4 years). One additional participant was excluded from the analysis because she was very familiar with the stimuli used. Verbal and performance intelligence quotients (IQs) were measured using the Japanese version of the Wechsler Adult Intelligence Scale, third edition (Fujita ). All participants had IQs within the normal range (full-scale IQ: M = 121.4, s.d. = 8.6; verbal IQ: M = 121.5, s.d. = 9.3; performance IQ: M = 116.7, s.d. = 10.4). Based on the Japanese version of the Mini International Neuropsychiatric Interview (Otsubo ), a psychiatrist confirmed that none of the participants had any neurological or psychiatric symptoms at a clinical level. All participants were right-handed, as assessed by the Edinburgh Handedness Inventory (Oldfield, 1971), and all had normal or corrected-to-normal visual acuity. Following an explanation of the procedures, all participants provided written informed consent. This study was part of a broad research project exploring mind–brain relationships. The project was approved by the ethics committee of the Primate Research Institute, Kyoto University. The experiment was conducted in accordance with the guidelines of the Declaration of Helsinki.

Emotion recognition task

A total of 48 photographs of faces expressing the 6 basic emotions (anger, disgust, fear, happiness, sadness and surprise), taken from 4 Caucasian and 4 Japanese individuals, were used as stimuli (Ekman and Friesen, 1976; Matsumoto and Ekman, 1988). The experiment was conducted using Presentation software (version 14.9, Neurobehavioral Systems) on a Windows computer (HPZ200SFF, Hewlett-Packard Company). The images were presented on a 19-in. CRT monitor (HM903D-A, Iiyama) in random order. Written labels of the six basic emotions were presented around each photograph, and the positions of the labels were counterbalanced across blocks. Participants were asked to indicate which of the labels best described the emotion expressed in each photograph. They were instructed to consider all alternatives prior to responding. Thus, no time limits were set, and each photograph remained on the screen until a verbal response was made. An experimenter carefully recorded the verbal response. No trial-by-trial feedback was provided, and each photograph was presented only once. The participants completed a total of 48 trials in approximately 10 min. We confirmed that all participants understood the meanings of the written labels prior to starting the experimental trials. They also performed two training trials to familiarize themselves with the procedure.
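The scoring of this label-matching task reduces to a per-category percent-correct computation. A minimal sketch in Python (the response pairs below are invented for illustration; the actual task used 48 trials, 8 per emotion category):

```python
from collections import Counter

# Hypothetical (true_label, chosen_label) pairs standing in for the
# experimenter's recorded verbal responses; the real task had 48 trials,
# 8 per emotion category (4 Caucasian + 4 Japanese posers).
responses = [
    ("anger", "anger"), ("anger", "disgust"),
    ("fear", "fear"), ("fear", "fear"),
]

correct = Counter(true for true, chosen in responses if true == chosen)
total = Counter(true for true, _ in responses)

# Percent accuracy per category, then the average across categories
# (the "total" score used as the regressor in the VBM analysis).
percent = {e: 100.0 * correct[e] / total[e] for e in total}
average = sum(percent.values()) / len(percent)
```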

MRI acquisition

The emotion recognition task and MRI acquisition were conducted on separate days. Image scanning was performed on a 3-T scanning system (MAGNETOM Trio, A Tim System, Siemens) at the ATR Brain Activity Imaging Center using a 12-channel array coil. T1-weighted high-resolution anatomical images were obtained using a magnetization-prepared rapid gradient-echo sequence (repetition time = 2250 ms, echo time = 3.06 ms, flip angle = 9°, inversion time = 1000 ms, field of view = 256 × 256 mm, matrix size = 256 × 256, voxel size = 1 × 1 × 1 mm).

Data analysis

Emotion recognition task. The percent accuracy in each emotion category and the average score across categories were calculated.

Image analysis. Image and statistical analyses were performed using the SPM8 statistical parametric mapping package (http://www.fil.ion.ucl.ac.uk/spm) and the VBM8 toolbox (http://dbm.neuro.uni-jena.de) implemented in MATLAB R2012b (MathWorks). Image preprocessing was performed using the VBM8 toolbox with the default settings. Structural T1 images were segmented into gray matter, white matter and cerebrospinal fluid using an adaptive maximum a posteriori approach (Rajapakse ). Intensity inhomogeneity in the MRI was modeled as slowly varying spatial functions and corrected during the estimation. The segmented images were used for partial volume estimation with a simple model of mixed tissue types to improve segmentation (Tohka ). A spatially adaptive non-local means denoising filter was applied to address spatially varying noise levels (Manjón ), and a Markov random field cleanup was used to improve image quality. The gray and white matter images in native space were subsequently normalized to the standard stereotactic space defined by the Montreal Neurological Institute (MNI) using the diffeomorphic anatomical registration through exponentiated Lie algebra (DARTEL) algorithm (Ashburner, 2007). We used the predefined templates provided with the VBM8 toolbox, derived from 550 healthy brains in the IXI database (http://www.brain-development.org). The normalized images were modulated using Jacobian determinants with non-linear warping only (i.e. the m0 image in the VBM8 outputs) to exclude the effect of total intracranial volume. Finally, the normalized, modulated images were resampled to a resolution of 1.5 × 1.5 × 1.5 mm and smoothed with an isotropic Gaussian kernel of 12 mm full width at half maximum to compensate for anatomical variability among participants.
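As a concrete aside, a 12 mm full-width-at-half-maximum kernel corresponds to the Gaussian sigma computed below; this conversion is standard image-processing arithmetic, not anything specific to VBM8:

```python
import math

# FWHM = 2 * sqrt(2 * ln 2) * sigma, so sigma = FWHM / 2.3548...
FWHM_MM = 12.0
VOXEL_MM = 1.5  # resolution after resampling, as in the pipeline above

sigma_mm = FWHM_MM / (2.0 * math.sqrt(2.0 * math.log(2.0)))  # sigma in mm
sigma_vox = sigma_mm / VOXEL_MM  # sigma in voxel units for a discrete filter
```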
Multiple regression analyses were performed using the averaged percent accuracy across conditions as the independent variable and sex, age, and full-scale IQ as covariates. The positive and negative relationships between gray and white matter volumes and the averaged percent accuracy across conditions were tested using t-statistics. We selected the bilateral IFG, MTG and amygdala as regions of interest (ROIs). Co-ordinates were derived from Sabatinelli as follows: bilateral IFG (right: x = 42, y = 25, z = 3; left: x = −42, y = 25, z = 3), MTG (right: x = 53, y = −50, z = 4; left: x = −53, y = −50, z = 4) and amygdala (right: x = 20, y = −4, z = −15; left: x = −20, y = −6, z = −15). The co-ordinates of the left MTG were generated by flipping those of the right MTG, because Sabatinelli did not report the involvement of the left MTG. The co-ordinates used in the present study are similar to those used in a meta-analysis (x = −56, y = −58, z = 4; Fusar-Poli ). To restrict the search volume in the bilateral IFG, MTG and amygdala, ROIs were specified as the intersection of a sphere of 12 mm radius centered on the co-ordinates with anatomically defined masks provided by the WFU PickAtlas (Maldjian et al., 2003). We performed small volume correction (Worsley ) in each ROI. Significant voxels were identified at a height threshold of P < 0.001 (uncorrected), and a family-wise error (FWE) correction for multiple comparisons was then applied (P < 0.05). Other areas were FWE-corrected for the entire brain volume. For exploratory purposes, we performed the analysis using a height threshold of P < 0.001 (uncorrected) with a liberal extent threshold of 100 contiguous voxels. The same whole brain and exploratory analyses were also conducted using the percent accuracy in each emotion category as independent variables. The brain structures were anatomically labeled using Talairach Client (Lancaster ) and the SPM Anatomy Toolbox (Eickhoff ).
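The voxelwise regression model described above can be sketched as follows. This is an illustrative re-implementation on simulated data, not SPM8's actual estimation; with 50 participants and 5 regressors (accuracy, sex, age, IQ, intercept), the residual degrees of freedom are 50 − 5 = 45, which is why the results table reports T(45) values.

```python
import numpy as np

# Simulated design matrix: 50 participants, regressors as in the analysis
# above (means/s.d.s loosely based on the sample characteristics reported).
rng = np.random.default_rng(0)
n = 50
accuracy = rng.normal(77.2, 7.1, n)        # overall accuracy (%)
sex = rng.integers(0, 2, n).astype(float)
age = rng.normal(22.4, 4.4, n)
iq = rng.normal(121.4, 8.6, n)
X = np.column_stack([accuracy, sex, age, iq, np.ones(n)])

def accuracy_t(y, X):
    """OLS t-statistic for the first column of X (the accuracy regressor)."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    df = X.shape[0] - X.shape[1]           # 50 - 5 = 45 degrees of freedom
    sigma2 = resid @ resid / df
    se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[0, 0])
    return beta[0] / se

# One simulated "voxel" whose volume scales with accuracy plus noise;
# in the real analysis this test is run at every voxel of the m0 images.
voxel = 0.5 * accuracy + rng.normal(0.0, 5.0, n)
t_value = accuracy_t(voxel, X)
```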

Results

The mean percent accuracy for each condition is shown in Table 1.
Table 1

Mean (with s.d.) percentages of accurate facial expression recognition in each emotion category

Category    Anger   Disgust   Fear   Happiness   Sadness   Surprise   Total
Mean (%)     57.8      63.8   64.0        99.8      86.0       92.0    77.2
s.d.         20.0      22.1   24.7         1.8      15.1       16.7     7.1
To reveal which brain regions are involved in emotion recognition, the structural MRI data were analyzed using a multiple regression analysis, with the average percent accuracy across emotion conditions as the independent variable, and sex, age, and full-scale IQ as covariates. The ROI analysis indicated a significant positive relationship with the gray matter volume of the right IFG (P < 0.05, FWE corrected; Figure 1 and Table 2). There were no other significant clusters within ROIs or other entire brain regions. The exploratory analysis using a liberal extent threshold (k > 100) revealed a negative relationship with the white matter volume of the left cerebellum (Figure 1 and Table 2).
Fig. 1

Gray and white matter regions showing positive or negative relationships with the recognition accuracy of overall facial expressions. For display purposes, voxels are included above a threshold of P < 0.001 (uncorrected), with an extent threshold of 100 contiguous voxels. The positive correlation between overall emotion recognition and gray matter volume of the right inferior frontal gyrus was significant in the ROI analysis with an FWE correction for multiple comparisons. The blue cross indicates the location of the peak voxel. The red–white color scale represents the T-value. Scatter plots show the gray matter volume of the right inferior frontal gyrus (left bottom) and white matter volume of the left cerebellum (right bottom) as functions of the recognition accuracy of overall facial expressions at the peak voxels.

Table 2

Brain regions showing correlations between the percent accuracy of facial expression recognition and gray and white matter volume

                                            Side   BA   Correlation     x     y     z   T(45)   Cluster size (voxels)
All, gray matter
  Inferior frontal gyrus, triangularis (a)    R    45        +         48    27     7    4.05    555
All, white matter
  Cerebellum, crus II                         L              −        −32   −73   −44    3.93    154
Anger, gray matter
  Superior parietal lobule                    L     7        −        −20   −76    49    4.14   1263
  Inferior parietal lobule                    R    40        −         47   −40    52    3.93    321
  Orbitofrontal cortex                        R    10        −         24    68    −2    3.37    207
Disgust, gray matter
  Precentral gyrus                            L     6        +        −45    −4    21    4.01    190
Fear, gray matter
  Thalamus                                    R              −         18   −13    15    3.96    142
Sadness, gray matter
  Superior temporal gyrus                     R    39        +         51   −52    24    3.79    204
  Superior temporal gyrus                     R    21        −         66    −9    −5    3.77    180
Sadness, white matter
  Superior temporal gyrus (b)                 L  41 (c)      −        −59   −22     7    4.79    774
Surprise, gray matter
  Superior temporal gyrus                     L    22        −        −63     2     0    4.42    233
  Parahippocampal gyrus                       L    36        −        −20   −34   −14    4.23    238

The co-ordinates of the peak in the MNI system are shown. A threshold of P < 0.001 (uncorrected) with a liberal extent threshold of 100 contiguous voxels was set for displayed data.

BA, Brodmann area; L, left; R, right; +, positive correlation; −, negative correlation.

(a) Significant positive correlation in the ROI analysis with an FWE correction for multiple comparisons (P < 0.05).

(b) Significant negative correlation in the whole-brain analysis with an FWE correction for multiple comparisons (P < 0.05).

(c) Nearest cortical gray matter.

Multiple regression analyses were also conducted regarding the relationship between the structural MRI data and the percent accuracy for each emotion condition. Although there were no significant clusters within the ROIs, the white matter volume of the left STG negatively correlated with sadness recognition in the whole brain analysis (P < 0.05, FWE corrected; see Figure 2 and Table 2).
When we conducted exploratory analyses using a liberal extent threshold (k > 100), several brain regions showed associations with the ability to recognize facial expressions (see Figure 2 and Table 2). Specifically, for anger recognition, the results revealed a negative relationship with the gray matter volume of the left superior parietal lobule, the right inferior parietal lobule, and the right orbitofrontal cortex. For disgust recognition, a positive relationship was found with the gray matter volume of the left precentral gyrus. For fear recognition, a negative relationship was found with the gray matter volume of the right thalamus. For sadness recognition, a positive relationship was found with the gray matter volume of the right STG, and a negative relationship was found with the gray matter volume of a different subregion of the right STG. For surprise recognition, negative relationships were found with the gray matter volume of the left STG and parahippocampal gyrus.
Fig. 2

Gray and white matter regions showing a positive or negative relationship with recognition accuracy in each emotion category. For display purposes, voxels are included above a threshold of P < 0.001 (uncorrected), with an extent threshold of 100 contiguous voxels. The negative correlation between sadness recognition and white matter volume of the left superior temporal gyrus was significant in the whole brain analysis with an FWE correction for multiple comparisons. The blue cross indicates the location of the peak voxel. The red–white color scale represents the T-value. (+) and (–) show positive and negative correlation, respectively. IPL, inferior parietal lobule; OFC, orbitofrontal cortex; PHG, parahippocampal gyrus; PCG, precentral gyrus; SPL, superior parietal lobule; STG, superior temporal gyrus.


Discussion

The present study reveals that the gray matter volume of the right IFG is associated with the ability to recognize facial expressions across six emotion categories. This result is consistent with meta-analyses of neuroimaging data, which have reported that the IFG is reliably activated during the observation of facial expressions (Fusar-Poli ; Sabatinelli ). Furthermore, lesion studies have demonstrated that the right IFG is involved in the ability to recognize facial expressions (Adolphs ; Shamay-Tsoory ; Dal Monte ). Previous studies have also suggested that people with more accurate expression recognition have a larger gray matter volume in the right IFG. For example, women, who can recognize facial expressions more accurately than men (Kret and de Gelder, 2012), have a larger right IFG than men (Ruigrok ). No previous studies have revealed a relationship between individual differences in the ability to recognize overall facial expressions and brain structure in healthy individuals, although there is some evidence for such a relationship in patients with psychiatric and neurological disorders (e.g. frontotemporal dementia: Van den Stock ; Parkinson’s disease: Ibarretxe-Bilbao ). Our data suggest that individual differences in the ability to recognize emotions from facial cues are reflected in individual structural differences in the right IFG. Although we are unable to conclusively identify a functional role of the right IFG, we suggest that it may contribute toward the ability to recognize facial expressions via a mechanism involving mimicking the other person’s facial expression. It has been suggested that the IFG contains mirror neurons that discharge when observing and executing specific actions, and that matching these representations allows us to understand each other’s actions (Rizzolatti ; Gallese ).
Consistent with this concept, behavioral studies have reported that observing facial expressions induces facial mimicry, and that automatic and intentional facial imitation modulate the process of recognizing the facial expressions of other people (Niedenthal, 2007; Oberman ; Sato a; Hyniewska and Sato, 2015). Previous fMRI studies have shown that observing and imitating facial expressions activates the right IFG (Carr ; Hennenlotter ; Pfeifer ). Some studies have provided further evidence for the role of the right IFG, indicating that the right IFG shows greater activation when participants imitate emotional facial expressions compared with ingestive facial expressions (Lee ), and that stronger congruent facial movements with another’s facial expressions correlate with increased activation of the right IFG (Lee ; Likowski ). Taken together with these previous findings, it is possible that the increased volume of the right IFG reflects enhanced facial mimicry, which facilitates the ability to recognize facial expressions. Interestingly, our exploratory analysis suggests that the white matter volume of the left cerebellum negatively correlates with the ability to recognize facial expressions. Recent studies have indicated that the cerebellum contributes not only to motor but also to cognitive and emotional functions (Stoodley and Schmahmann, 2009). Consistent with this, a meta-analysis of fMRI studies revealed that the left cerebellum is active while observing others' facial expressions (Fusar-Poli ). In relation to expression recognition, however, only a few studies have investigated the role of the cerebellum. Previous studies have reported impaired recognition in individuals with cerebellar infarction (Adamaszek ) and spinocerebellar ataxias (D’Agata ). A stimulation study demonstrated that transcranial direct current stimulation of the cerebellum enhanced the processing of negative facial expressions (Ferrucci ).
The cerebellum is part of a large-scale network that involves the neocortex. The identified voxels were located in Crus II of the left hemisphere, which is structurally and functionally interconnected with the contralateral prefrontal cortex (Kelly and Strick, 2003; O’Reilly ). Both the right IFG and the left cerebellum have been associated with congruent facial reactions in response to the facial expressions of others (Likowski ). Based on these findings, we speculate that the network between the right IFG and the left cerebellum might play a critical role in recognizing others’ facial expressions. It should be noted that the IFG and the cerebellum are involved in general cognitive functions (Stoodley and Schmahmann, 2009; Aron ). A previous study suggested that executive function is related to the recognition of facial expressions (Circelli ). These findings suggest that individual differences in these functions might explain the relationship between the ability to recognize facial expressions and the volume of the identified brain regions. However, in our study, care was taken to ensure that all participants appropriately understood the meanings of the written labels prior to the experiment. No time limits were set for the task, and the stimuli and written labels remained on screen until the response was recorded. Thus, the task should not have burdened attentional processing, working memory, language processing, or inhibitory control processes. Given that the significant relationship between brain volume and the ability to recognize emotions was found after controlling for participants’ intellectual ability, it is unlikely that the relationship can be explained solely by individual differences in general cognitive function. The present study also provides insight into the functional role of the right IFG in clinical populations.
For example, individuals with ASD, who are characterized by social and communication impairments, have difficulty recognizing facial expressions (Uljarevic and Hamilton, 2013). Studies have reported that individuals with ASD show reduced facial mimicry (McIntosh ; Yoshimura ) and reduced right IFG activation in response to the facial expressions of other people (Dapretto ; Hadjikhani ; Sato ). Anatomical studies have demonstrated a relationship between a reduction in the gray matter volume of the right IFG and impaired social communication (Kosaka ; Yamasaki ). These data are consistent with our speculation that the increased gray matter volume of the right IFG is associated with enhanced facial mimicry, which facilitates emotion recognition. Based on these findings, it is possible that impaired facial mimicry, which is implemented by the IFG, results in an impaired ability to recognize facial expressions in ASD. A VBM study with a large clinical sample is needed to investigate the relationship between structural abnormalities and impaired emotion recognition in ASD. The exploratory analysis conducted for each emotion category found that gray and white matter volumes of distinct brain regions correlate with recognition accuracy. Importantly, we found that the recognition accuracy of sad and surprised faces correlated with the volumes of subregions of the bilateral STS region. Previous lesion symptom mapping studies demonstrated that the STS region and frontal cortex are critical areas involved in the recognition of another’s facial emotions (Adolphs ; Dal Monte ). In fact, a recent TMS study in healthy adults showed that TMS of the STS region impairs facial expression recognition (Pitcher, 2014). These findings are in accordance with the role of the STS region, i.e. the visual processing of invariant aspects of faces (Haxby ). 
Consistent with the suggestion described above, the STS region has a direct connection with the IFG (Catani ), and this connection may implement motor mimicry (Hamilton, 2008). Although some correlations found in the present study have also been reported in previous studies (angry faces and the orbitofrontal cortex: Blair and Cipolotti, 2000; fearful faces and the thalamus: Williams ), we did not detect other frequently reported correlations, such as fearful faces–amygdala (e.g. Adolphs ) and disgusted faces–insula and/or basal ganglia (e.g. Calder ). Methodological differences and a relatively small sample size might explain these discrepancies. Future studies should combine structural and functional MRI investigations in a larger number of participants. The present study had additional limitations. First, an analysis of the recognition of happy faces was not conducted, because all but one participant recognized happy faces at ceiling. It would be useful to apply computer morphing paradigms to the stimulus photographs to adjust the difficulty levels across emotion categories (e.g. Sato ). Second, the results did not suggest any significant involvement of brain regions related to rapid emotional processing, such as the amygdala (e.g. Sato ). This might be because no time pressure was imposed during the task. Requiring an accelerated response may reveal significant relationships between performance and the volume of such brain regions (cf. Zhao ). Third, although we emphasized facial mimicry as a functional role of the right IFG in emotion recognition, there is no evidence indicating whether the volume of the right IFG correlates with the magnitude of facial muscle activity in response to another’s facial expressions. The emotion recognition task used here involves several processing stages, such as perceptual processing, motor resonance and lexical access to emotional meaning.
For example, a previous study suggested that damage to the right frontal cortex impairs the naming, rather than the rating, of the facial expressions of others (Adolphs). Future studies should directly investigate the relationships among facial mimicry, emotion recognition and brain volume.

In summary, the ability to recognize facial expressions across emotion categories correlates with the gray matter volume of the right IFG. Based on previous findings indicating that the IFG is involved in facial imitation, and that facial mimicry modulates the ability to recognize facial expressions, it is possible that a larger gray matter volume of the right IFG contributes to more efficient facial mimicry, which in turn facilitates emotion recognition.

Funding

This study was supported by the JSPS Funding Program for Next Generation World-Leading Researchers (LZ008). The funding source had no involvement in study design; in the collection, analysis and interpretation of data; in the writing of the report; or in the decision to submit the article for publication.

Conflict of interest: None declared.
References (72 in total)

1.  Fast and robust parameter estimation for statistical partial volume models in brain MRI.

Authors:  Jussi Tohka; Alex Zijdenbos; Alan Evans
Journal:  Neuroimage       Date:  2004-09       Impact factor: 6.556

2.  A unified statistical approach for determining significant signals in images of cerebral activation.

Authors:  K J Worsley; S Marrett; P Neelin; A C Vandal; K J Friston; A C Evans
Journal:  Hum Brain Mapp       Date:  1996       Impact factor: 5.038

Review 3.  Emotion processing deficits: a liability spectrum providing insight into comorbidity of mental disorders.

Authors:  Mariska E Kret; Annemie Ploeger
Journal:  Neurosci Biobehav Rev       Date:  2015-02-25       Impact factor: 8.989

4.  The specific impairment of fearful expression recognition and its atypical development in pervasive developmental disorder.

Authors:  Shota Uono; Wataru Sato; Motomi Toichi
Journal:  Soc Neurosci       Date:  2011-09-15       Impact factor: 2.083

5.  Recognition of facial emotion in nine individuals with bilateral amygdala damage.

Authors:  R Adolphs; D Tranel; S Hamann; A W Young; A J Calder; E A Phelps; A Anderson; G P Lee; A R Damasio
Journal:  Neuropsychologia       Date:  1999-09       Impact factor: 3.139

Review 6.  A review on sex differences in processing emotional signals.

Authors:  M E Kret; B De Gelder
Journal:  Neuropsychologia       Date:  2012-01-08       Impact factor: 3.139

7.  Imitating expressions: emotion-specific neural substrates in facial mimicry.

Authors:  Tien-Wen Lee; Oliver Josephs; Raymond J Dolan; Hugo D Critchley
Journal:  Soc Cogn Affect Neurosci       Date:  2006-09       Impact factor: 3.436

8.  Smaller insula and inferior frontal volumes in young adults with pervasive developmental disorders.

Authors:  Hirotaka Kosaka; Masao Omori; Toshio Munesue; Makoto Ishitobi; Yukiko Matsumura; Tetsuya Takahashi; Kousuke Narita; Tetsuhito Murata; Daisuke N Saito; Hitoshi Uchiyama; Tomoyo Morita; Mitsuru Kikuchi; Kimiko Mizukami; Hidehiko Okazawa; Norihiro Sadato; Yuji Wada
Journal:  Neuroimage       Date:  2010-02-01       Impact factor: 6.556

9.  Inhibition and the right inferior frontal cortex: one decade on.

Authors:  Adam R Aron; Trevor W Robbins; Russell A Poldrack
Journal:  Trends Cogn Sci       Date:  2014-01-15       Impact factor: 20.229

10.  Neuroanatomical correlates of biological motion detection.

Authors:  Sharon Gilaie-Dotan; Ryota Kanai; Bahador Bahrami; Geraint Rees; Ayse P Saygin
Journal:  Neuropsychologia       Date:  2012-12-02       Impact factor: 3.139

Cited by (15 in total)

1.  Neural time course and brain sources of facial attractiveness vs. trustworthiness judgment.

Authors:  Manuel G Calvo; Aida Gutiérrez-García; David Beltrán
Journal:  Cogn Affect Behav Neurosci       Date:  2018-12       Impact factor: 3.282

2.  The Neglected Cerebello-Limbic Pathways and Neuropsychological Features of the Cerebellum in Emotion.

Authors:  Paolo Flace; Angelo Quartarone; Giovanni Colangelo; Demetrio Milardi; Alberto Cacciola; Giuseppina Rizzo; Paolo Livrea; Giuseppe Anastasi
Journal:  Cerebellum       Date:  2018-04       Impact factor: 3.847

3.  Emotion recognition in individuals with cocaine use disorder: the role of abstinence length and the social brain network.

Authors:  Rachel A Rabin; Muhammad A Parvaz; Nelly Alia-Klein; Rita Z Goldstein
Journal:  Psychopharmacology (Berl)       Date:  2021-06-05       Impact factor: 4.530

4.  The structural neural correlates of atypical facial expression recognition in autism spectrum disorder.

Authors:  Shota Uono; Wataru Sato; Takanori Kochiyama; Sayaka Yoshimura; Reiko Sawada; Yasutaka Kubota; Morimitsu Sakihama; Motomi Toichi
Journal:  Brain Imaging Behav       Date:  2022-01-20       Impact factor: 3.978

5.  How context influences the interpretation of facial expressions: a source localization high-density EEG study on the "Kuleshov effect".

Authors:  Marta Calbi; Francesca Siri; Katrin Heimann; Daniel Barratt; Vittorio Gallese; Anna Kolesnikov; Maria Alessandra Umiltà
Journal:  Sci Rep       Date:  2019-02-14       Impact factor: 4.379

6.  Out of Context, Beyond the Face: Neuroanatomical Pathways of Emotional Face-Body Language Integration in Adolescent Offenders.

Authors:  Hernando Santamaría-García; Agustin Ibáñez; Synella Montaño; Adolfo M García; Michel Patiño-Saenz; Claudia Idarraga; Mariana Pino; Sandra Baez
Journal:  Front Behav Neurosci       Date:  2019-02-26       Impact factor: 3.558

7.  Volume of the right supramarginal gyrus is associated with a maintenance of emotion recognition ability.

Authors:  Sayaka Wada; Motoyasu Honma; Yuri Masaoka; Masaki Yoshida; Nobuyoshi Koiwa; Haruko Sugiyama; Natsuko Iizuka; Satomi Kubota; Yumika Kokudai; Akira Yoshikawa; Shotaro Kamijo; Sawa Kamimura; Masahiro Ida; Kenjiro Ono; Hidetoshi Onda; Masahiko Izumizaki
Journal:  PLoS One       Date:  2021-07-22       Impact factor: 3.240

8.  The interaction between embodiment and empathy in facial expression recognition.

Authors:  Karine Jospe; Agnes Flöel; Michal Lavidor
Journal:  Soc Cogn Affect Neurosci       Date:  2018-02-01       Impact factor: 3.436

9.  Familiarity with children improves the ability to recognize children's mental states: an fMRI study using the Reading the Mind in the Eyes Task and the Nencki Children Eyes Test.

Authors:  Jan Szczypiński; Anna Alińska; Marek Waligóra; Maciej Kopera; Aleksandra Krasowska; Aneta Michalska; Hubert Suszek; Andrzej Jakubczyk; Marek Wypych; Marcin Wojnar; Artur Marchewka
Journal:  Sci Rep       Date:  2020-07-31       Impact factor: 4.379

10.  Facial and neural mechanisms during interactive disclosure of biographical information.

Authors:  Roser Cañigueral; Xian Zhang; J Adam Noah; Ilias Tachtsidis; Antonia F de C Hamilton; Joy Hirsch
Journal:  Neuroimage       Date:  2020-11-19       Impact factor: 7.400

