Claus-Christian Carbon, Marco Jürgen Held, Astrid Schütz.
Abstract
The ability to read emotions in faces helps humans efficiently assess social situations. We tested how this ability is affected by familiarization with face masks and by personality, with a focus on emotional intelligence (measured with an ability test, the MSCEIT, and a self-report scale, the SREIS). To address the current pandemic situation, we used photos not only of uncovered faces but also of faces that were partially covered with face masks. The sample (N = 49), the size of which was determined by an a priori power analysis, was recruited in Germany and consisted of healthy individuals of different ages [M = 24.8 (18-64) years]. Participants assessed the emotional expressions displayed by six different faces in a 2 (sex) × 3 (age group: young, medium, and old) design. Each face was presented with six different emotional displays (angry, disgusted, fearful, happy, neutral, and sad), with or without a face mask. Accuracy and confidence were lower with masks, in particular for the emotion disgust (very often misinterpreted as anger) but also for happiness, anger, and sadness. When comparing the present data, collected in July 2021, with data from a different sample collected in May 2020, when people in Western countries first started to familiarize themselves with face masks during the first wave of the COVID-19 pandemic, we did not detect an improvement in performance. There were no effects of participants' emotional intelligence, sex, or age on their accuracy in assessing emotional states in unmasked or masked faces.
Keywords: COVID-19 pandemic; accuracy; cover; emotion perception; emotional intelligence; face mask; face perception; personality
Year: 2022 PMID: 35369259 PMCID: PMC8967961 DOI: 10.3389/fpsyg.2022.856971
Source DB: PubMed Journal: Front Psychol ISSN: 1664-1078
Figure 1. The figure illustrates the six emotional variations (anger, disgust, fear, happiness, neutral, and sadness) of one person without (A) and with (B) a face mask. This specific person was not part of our experimental material but is presented here for illustrative purposes. The authors would like to thank the Max Planck Institute for providing the baseline stimuli (without masks), which came from the MPI FACES database (Ebner et al., 2010).
Figure 2. The figure shows mean performance levels for assessments of emotional states for faces without masks (red) compared with faces with masks (blue). Error bars indicate 95% confidence intervals (CIs) according to Morey (2008). Pairwise comparisons of the presentation conditions were calculated via two-tailed paired t-tests. *p < 0.05. ****p < 0.0001. Nonsignificant results are marked with ns.
Figure 3. This figure shows the confusion matrices for expressed versus perceived emotions for the original faces without face masks (top, red) and faces with face masks (bottom, blue). Mean performance levels in assessing the emotional states are given as percentage correctness rates; values below 0.5% were suppressed for better readability of the matrices. The better the performance, the more saturated the corresponding confusion matrix cell.
Figure 4. This figure shows mean confidence levels for assessments of emotional states for faces without masks (red) compared with faces with masks (blue). Error bars indicate 95% CIs according to Morey (2008). Pairwise comparisons of the presentation conditions were calculated via paired t-tests. ****p < 0.0001. Nonsignificant results are marked with ns.
Comparison of different linear mixed effects models.
| Dependent variable / tested model | df | AIC | logLik | Cond. R² | Against | p |
|---|---|---|---|---|---|---|
| **% correct** |  |  |  |  |  |  |
| #0: null | 9 | 35,483 | −17,732 | 0.128 |  |  |
| #1: + Mask | 10 | 35,316 | −17,648 | 0.168 | #0 | <0.0001 |
| #2: + EI + SREIS | 12 | 35,320 | −17,648 | 0.168 | #1 | 0.9081 |
| #3a: + FamiliarityOthers | 11 | 30,243 | −15,111 |  | #1 | <0.0001 |
| #3b: + FamiliarityOwn | 11 | 35,318 | −17,648 | 0.168 | #1 | 0.8360 |
| #4: + attitudeMasks | 11 | 35,318 | −17,648 | 0.168 | #1 | 0.5975 |
| #5: + exprEmo:Mask | 15 | 35,177 | −17,527 | 0.224 | #1 | <0.0001 |
| **% confidence** |  |  |  |  |  |  |
| #0: null | 9 | 37,253 | −15,317 | 0.240 |  |  |
| #1: + Mask | 10 | 30,246 | −15,113 | 0.324 | #0 | <0.0001 |
| #2: + EI + SREIS | 12 | 30,324 | −15,113 | 0.324 | #1 | 0.9468 |
| #3a: + FamiliarityOthers | 11 | 30,243 | −15,111 | 0.324 | #1 | 0.0325 |
| #3b: + FamiliarityOwn | 11 | 30,245 | −15,111 | 0.324 | #1 | 0.0838 |
| #4: + attitudeMasks | 11 | 30,247 | −15,113 | 0.324 | #1 | 0.5115 |

The table shows the results of the linear mixed effects analysis, with each model compared against a less complex model, separated by the two tested dependent variables: % correct (percentage of correct emotion classifications) and % confidence (confidence for correct emotion classifications). FS, fixed slopes (fixed factors); RS, random slopes (random factors); df, degrees of freedom; Cond. R², conditional R²; Against, the less complex model used for the comparison; p, p-value of that model comparison.
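The nested-model comparisons above (each model tested against a less complex one) follow standard likelihood-ratio logic: twice the difference in log-likelihoods is approximately χ²-distributed, with degrees of freedom equal to the number of added parameters. The arithmetic can be sketched with the rounded logLik values from the % correct section; the helper names below are illustrative, not from the paper's analysis code, and only the one-degree-of-freedom case is covered (where the χ² tail reduces to `erfc`, so the standard library suffices):

```python
import math

def lrt_1df(loglik_simple, loglik_complex):
    """Likelihood-ratio test for nested models differing by one parameter.

    Returns the chi-square statistic and its p-value; for 1 df the
    chi-square survival function equals erfc(sqrt(stat / 2)).
    """
    stat = 2 * (loglik_complex - loglik_simple)
    return stat, math.erfc(math.sqrt(stat / 2))

def aic(n_params, loglik):
    """Akaike information criterion: AIC = 2k - 2 * logLik."""
    return 2 * n_params - 2 * loglik

# Model #1 (+ Mask, df = 10) against Model #0 (null, df = 9),
# rounded logLik values taken from the % correct section of the table.
stat, p = lrt_1df(-17732, -17648)
print(stat, p)          # chi-square = 168, p far below 0.0001
print(aic(10, -17648))  # 35316, matching the tabled AIC for Model #1
```

The recovered p-value is consistent with the tabled "<0.0001", and the AIC identity reproduces the tabled value for Model #1 exactly from its df and logLik.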
Results of the linear mixed effects analysis for emotion recognition performance, testing Model 5 against Model 1.
| Predictors | Estimates | p | df |
|---|---|---|---|
| (Intercept) | 93.20*** |  | 3,514.00 |
| Neutral | (reference) |  |  |
| Anger | −4.76 | 0.094 | 3,514.00 |
| Disgust | −1.70 | 0.549 | 3,514.00 |
| Fear | −1.70 | 0.549 | 3,514.00 |
| Happiness | 6.46* |  | 3,514.00 |
| Sadness | −22.79*** |  | 3,514.00 |
| exprEmo_anger:Mask | −17.35*** |  | 3,514.00 |
| exprEmo_disgust:Mask | −52.72*** |  | 3,514.00 |
| exprEmo_fear:Mask | −0.00 | 1.000 | 3,514.00 |
| exprEmo_happiness:Mask | −26.53*** |  | 3,514.00 |
| exprEmo_sadness:Mask | −8.40* |  | 3,514.00 |
| No mask | (reference) |  |  |
| Mask | 1.70 | 0.549 | 3,514.00 |
| ICC | 0.05 |  |  |
| N (faces) | 6 |  |  |
| N (participants) | 49 |  |  |
| Observations | 3,529 |  |  |
| Marginal R² / Conditional R² | 0.179 / 0.224 |  |  |
| AIC | 35,084.229 |  |  |
| Log-likelihood | −17,527.114 |  |  |
The table shows the statistics for all involved fixed effects in the linear mixed effects analysis for Model 5, regarding the tested dependent variable % correct (percentage of correct emotion classifications). Abbreviated notations for the terms were used to save space: exprEmo_XY = facial emotion, e.g., anger; Mask = face with face mask. Asterisks indicate significant values (*p < 0.05; ***p < 0.001).
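The summary rows are internally consistent: applying the AIC identity 2k − 2·logLik with the 15 model degrees of freedom reported for Model 5 in the model-comparison table reproduces the reported AIC from the reported log-likelihood. A quick check (pure arithmetic on the tabled values, nothing else assumed):

```python
# AIC identity: AIC = 2k - 2 * logLik, with k = 15 parameters for Model 5
loglik = -17527.114
aic_model5 = 2 * 15 - 2 * loglik
print(round(aic_model5, 3))  # 35084.228, matching the tabled AIC up to rounding
```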