Sabrina N Grondhuis, Angela Jimmy, Carolina Teague, Nicolas M Brunet.
Abstract
Previous studies have found that it is more difficult to identify an emotional expression displayed by an older face than by a younger one. It is unknown whether this is caused by age-related changes, such as wrinkles and folds, interfering with perception, or by the aging of facial muscles, which may reduce the ability of older individuals to display an interpretable expression. To discriminate between these two possibilities, participants attempted to identify facial expressions under different conditions. To separate the two variables (wrinkles/folds vs. facial muscles), we used Generative Adversarial Networks to make faces look older or younger. Based on behavioral data collected from 28 individuals, our model predicts that the odds of correctly identifying the expressed emotion of a face are reduced by 16.2% when younger faces (Condition 1) are artificially aged (Condition 3). Replacing the younger faces with naturally old-looking faces (Condition 2), however, results in an even stronger effect (odds of correct identification decreased by 50.9%). Counterintuitively, making old faces (Condition 2) look young (Condition 4) results in the largest negative effect (odds of correct identification decreased by 74.8% compared with natural young faces). Taken together, these results suggest that both age-related decline in the facial muscles' ability to express emotions and age-related physical changes in the face explain why it is difficult to recognize facial expressions from older faces; the effect of the former, however, is much stronger than that of the latter. Facial muscle exercises, therefore, might improve the capacity to convey facial emotional expressions in the elderly.
Keywords: aging related; artificial aging; emotion recognition; emotional expressions; facial expressions; facial muscles
Year: 2021 PMID: 34149508 PMCID: PMC8211723 DOI: 10.3389/fpsyg.2021.620768
Source DB: PubMed Journal: Front Psychol ISSN: 1664-1078
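The percent reductions quoted in the abstract (16.2%, 50.9%, 74.8%) follow directly from the odds ratios reported in the full regression model, via percent change in odds = (1 − OR) × 100. A minimal sketch of that arithmetic (the condition labels are paraphrased from the abstract, not names used in the paper):

```python
# Odds ratios from the full logistic regression model (see table below),
# converted to the percent reductions in odds quoted in the abstract.
odds_ratios = {
    "young faces artificially aged (Condition 1 -> 3)": 0.838,
    "natural older faces (Condition 2)": 0.491,
    "older faces made to look young (Condition 2 -> 4)": 0.252,
}

for label, odds_ratio in odds_ratios.items():
    reduction = (1.0 - odds_ratio) * 100.0  # percent decrease in odds
    print(f"{label}: odds of correct identification reduced by {reduction:.1f}%")
```

Running this reproduces the three figures in the abstract: 16.2%, 50.9%, and 74.8%.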
FIGURE 1. Example stimuli and schematic representation of the hypotheses. An example set (not used for the actual experiment, but representative) illustrating the stimuli employed for Condition 1: a young adult posing four different negative facial expressions (sadness, disgust, fear, and anger; the neutral expression, also used in the experiment, is not shown). An example set illustrating how the stimuli for Condition 3 were generated using an aging filter. An example set (not used for the actual experiment, but representative) illustrating the stimuli employed for Condition 2: an older male posing four different negative facial expressions (sadness, disgust, fear, and anger; the neutral expression, also used in the experiment, is not shown). An example set illustrating how the stimuli for Condition 4 were generated using a reversed aging filter. Colored arrows (purple and orange) and associated text indicate the expected effects on the participant's ability to identify the correct facial expression in either of two cases: if physical age-related changes such as wrinkles and the appearance of the skin hamper identification of the emotional expression (Hypothesis 1, purple), or if muscle weakness caused by aging makes it harder to produce recognizable facial expressions (Hypothesis 2, orange).
Results of logistic regression for accurate identification of facial expressions. Each block of rows beginning with an intercept corresponds to one model.

| Parameter | B | SE | z | p | OR | 95% CI (lower) | 95% CI (upper) |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Intercept | 1.225 | 0.068 | 18.138 | <0.001 | 3.403 | 2.981 | 3.885 |
| Intercept | 1.783 | 0.094 | 19.065 | <0.001 | 5.949 | 4.952 | 7.145 |
| Young to old | –0.161 | 0.054 | –2.982 | 0.003 | 0.851 | 0.766 | 0.946 |
| Old to young | –1.209 | 0.093 | –13.072 | <0.001 | 0.299 | 0.249 | 0.358 |
| Old | –0.642 | 0.057 | –11.251 | <0.001 | 0.526 | 0.471 | 0.589 |
| Intercept | 4.063 | 0.238 | 17.048 | <0.001 | 58.172 | 36.459 | 92.817 |
| Young to old | –0.177 | 0.058 | –3.023 | 0.003 | 0.838 | 0.747 | 0.940 |
| Old to young | –1.377 | 0.108 | –12.742 | <0.001 | 0.252 | 0.204 | 0.312 |
| Old | –0.712 | 0.063 | –11.381 | <0.001 | 0.491 | 0.434 | 0.555 |
| Anger | –2.748 | 0.240 | –11.435 | <0.001 | 0.064 | 0.040 | 0.103 |
| Afraid | –1.366 | 0.213 | –6.399 | <0.001 | 0.255 | 0.168 | 0.388 |
| Disgust | –2.808 | 0.227 | –12.375 | <0.001 | 0.060 | 0.039 | 0.094 |
| Sad | –2.681 | 0.268 | –10.015 | <0.001 | 0.069 | 0.041 | 0.116 |
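The OR and confidence-interval columns in the table above are consistent with the coefficient (B) and SE columns under the standard Wald construction, OR = exp(B) with 95% CI = exp(B ± 1.96·SE). A minimal check against the "Young to old" row of the full model (column names are inferred from the values, since the extracted table lost its header):

```python
import math

def odds_ratio_ci(b: float, se: float, z: float = 1.96):
    """Odds ratio and Wald 95% CI from a logistic-regression coefficient."""
    return math.exp(b), math.exp(b - z * se), math.exp(b + z * se)

# "Young to old" row of the full model: B = -0.177, SE = 0.058
or_, lo, hi = odds_ratio_ci(-0.177, 0.058)
print(f"OR = {or_:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```

This yields OR ≈ 0.838 with CI ≈ [0.748, 0.939], matching the tabulated 0.838 [0.747, 0.940] up to rounding.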
Fixed effects in logistic regression model.

| Source | F | df1 | df2 | p |
| --- | --- | --- | --- | --- |
| Corrected model | 66.067 | 3 | 10,995 | <0.001 |
| Face | 66.067 | 3 | 10,995 | <0.001 |
| Corrected model | 38.609 | 7 | 10,991 | <0.001 |
| Face | 64.511 | 3 | 10,991 | <0.001 |
| Emotional expression | 47.222 | 4 | 10,991 | <0.001 |
Random effects in logistic regression model.

| Parameter | Estimate | SE | Wald Z | p | 95% CI (lower) | 95% CI (upper) |
| --- | --- | --- | --- | --- | --- | --- |
| Intercept (subjects) | 0.117 | 0.036 | 3.276 | 0.001 | 0.065 | 0.214 |
| Intercept (subjects) | 0.129 | 0.039 | 3.293 | <0.001 | 0.071 | 0.234 |
| Intercept (subjects) | 0.164 | 0.050 | 3.257 | 0.001 | 0.090 | 0.300 |
| Photographic models (subjects) | 0.151 | 0.036 | 4.220 | <0.001 | 0.095 | 0.240 |