| Literature DB >> 29896135 |
Shushi Namba, Russell S Kabir, Makoto Miyatani, Takashi Nakao.
Abstract
Accurately gauging the emotional experience of another person is important for navigating interpersonal interactions. This study investigated whether perceivers are capable of distinguishing between unintentionally expressed (genuine) and intentionally manipulated (posed) facial expressions attributed to four major emotions: amusement, disgust, sadness, and surprise. Sensitivity to this discrimination was explored by comparing unstaged dynamic and static facial stimuli and analyzing the results with signal detection theory. Participants indicated whether facial stimuli presented on a screen depicted a person showing a given emotion and whether that person was feeling a given emotion. The results showed that genuine displays were judged to be felt expressions more often than posed displays for all target emotions presented. In addition, sensitivity to the perception of emotional experience, or discriminability, was enhanced for dynamic facial displays but less pronounced for static displays. This finding indicates that dynamic information in facial displays contributes to the ability to accurately infer the emotional experiences of another person.
Keywords: dynamics; emotion; facial expressions; posed facial expressions; spontaneous facial expressions
Year: 2018 PMID: 29896135 PMCID: PMC5987704 DOI: 10.3389/fpsyg.2018.00672
Source DB: PubMed Journal: Front Psychol ISSN: 1664-1078
Percentage of "yes" responses across judgment conditions and facial display types.

| Display type | Show, dynamic (% yes) | Show, static (% yes) | Feel, dynamic (% yes) | Feel, static (% yes) |
|---|---|---|---|---|
| Neutral | 2 | 12 | 10 | 17 |
| Posed | 88 | 78 | 31 | 59 |
| Genuine | 78 | 70 | 75 | 66 |
| Neutral | 0 | 2 | 0 | 0 |
| Posed | 98 | 91 | 38 | 80 |
| Genuine | 95 | 95 | 85 | 86 |
| Neutral | 3 | 2 | 7 | 2 |
| Posed | 95 | 70 | 23 | 47 |
| Genuine | 73 | 46 | 75 | 52 |
| Neutral | 5 | 32 | 13 | 46 |
| Posed | 90 | 86 | 32 | 75 |
| Genuine | 97 | 96 | 75 | 66 |
| Neutral | 0 | 13 | 18 | 18 |
| Posed | 67 | 66 | 32 | 34 |
| Genuine | 48 | 43 | 63 | 59 |
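Under signal detection theory, proportions like those in the table above can be converted into the classic sensitivity index d'. The sketch below is an illustration only (the paper itself fits a hierarchical Bayesian model, not this two-point estimator); it treats genuine displays judged as "felt" as hits and posed displays judged as "felt" as false alarms, using the feel-condition rates from the first display block:

```python
from statistics import NormalDist

_z = NormalDist().inv_cdf  # probit: inverse of the standard-normal CDF


def d_prime(hit_rate: float, fa_rate: float) -> float:
    """Classic SDT sensitivity: z(hit rate) - z(false-alarm rate)."""
    return _z(hit_rate) - _z(fa_rate)


# Feel-condition rates from the first display block:
# genuine judged as felt = hits, posed judged as felt = false alarms.
dynamic = d_prime(0.75, 0.31)  # ~1.17
static = d_prime(0.66, 0.59)   # ~0.19
print(f"dynamic d' = {dynamic:.2f}, static d' = {static:.2f}")
```

The higher d' for dynamic displays is consistent with the abstract's claim that discriminability is enhanced for dynamic stimuli relative to static ones.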
Estimated parameters for each judgment condition, pooled across emotions, using a signal detection model.

| Condition | Parameter | MAP | 95% CI |
|---|---|---|---|
| Show | Response criteria (Beta1) | 1.06 | [0.86, 1.25] |
| Show | Sensitivity to display (Beta2) | -0.38 | [-0.57, -0.16] |
| Show | Response criteria between presentations (Beta3) | 0.46 | [0.07, 0.82] |
| Show | Sensitivity to display between presentations (Beta4) | -0.13 | [-0.56, 0.25] |
| Feel | Response criteria (Beta1) | -0.12 | [-0.25, -0.01] |
| Feel | Sensitivity to display (Beta2) | 0.66 | [0.50, 0.85] |
| Feel | Response criteria between presentations (Beta3) | -0.72 | [-0.99, -0.49] |
| Feel | Sensitivity to display between presentations (Beta4) | 1.03 | [0.64, 1.35] |
Estimated parameters for the show condition, by emotion, using a Bayesian signal detection model.

| Parameter | MAP | 95% CI |
|---|---|---|
| Response criteria (Beta1) | 1.73 | [1.38, 2.36] |
| Sensitivity to display (Beta2) | -0.10 | [-0.80, 0.48] |
| Response criteria between presentations (Beta3) | 0.81 | [0.01, 1.94] |
| Sensitivity to display between presentations (Beta4) | -0.80 | [-2.15, 0.29] |
| Response criteria (Beta1) | 1.10 | [0.79, 1.44] |
| Sensitivity to display (Beta2) | -0.78 | [-1.25, -0.44] |
| Response criteria between presentations (Beta3) | 1.15 | [0.55, 1.84] |
| Sensitivity to display between presentations (Beta4) | -0.34 | [-1.25, 0.30] |
| Response criteria (Beta1) | 1.18 | [0.90, 1.49] |
| Sensitivity to display (Beta2) | 0.69 | [0.19, 1.24] |
| Response criteria between presentations (Beta3) | 0.13 | [-0.39, 0.82] |
| Sensitivity to display between presentations (Beta4) | -0.13 | [-1.34, 0.86] |
| Response criteria (Beta1) | 0.43 | [0.19, 0.65] |
| Sensitivity to display (Beta2) | -0.53 | [-0.86, -0.21] |
| Response criteria between presentations (Beta3) | 0.06 | [-0.46, 0.50] |
| Sensitivity to display between presentations (Beta4) | 0.13 | [-0.55, 0.78] |
Estimated parameters for the feel condition, by emotion, using a Bayesian signal detection model.

| Parameter | MAP | 95% CI |
|---|---|---|
| Response criteria (Beta1) | 0.26 | [0.04, 0.52] |
| Sensitivity to display (Beta2) | 0.80 | [0.42, 1.15] |
| Response criteria between presentations (Beta3) | -1.16 | [-1.66, -0.66] |
| Sensitivity to display between presentations (Beta4) | 1.13 | [0.39, 1.87] |
| Response criteria (Beta1) | -0.40 | [-0.64, -0.15] |
| Sensitivity to display (Beta2) | 0.80 | [0.41, 1.09] |
| Response criteria between presentations (Beta3) | -0.68 | [-1.14, -0.19] |
| Sensitivity to display between presentations (Beta4) | 1.30 | [0.61, 1.96] |
| Response criteria (Beta1) | 0.10 | [-0.14, 0.34] |
| Sensitivity to display (Beta2) | 0.42 | [0.12, 0.78] |
| Response criteria between presentations (Beta3) | -1.16 | [-1.64, -0.65] |
| Sensitivity to display between presentations (Beta4) | 1.42 | [0.71, 2.08] |
| Response criteria (Beta1) | -0.46 | [-0.69, -0.20] |
| Sensitivity to display (Beta2) | 0.72 | [0.41, 1.06] |
| Response criteria between presentations (Beta3) | -0.08 | [-0.54, 0.42] |
| Sensitivity to display between presentations (Beta4) | 0.16 | [-0.48, 0.84] |
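The parameter names in these tables suggest a probit-style signal detection regression: the response criterion sets the baseline probability of a "yes" response, sensitivity shifts that probability for genuine relative to posed displays, and the "between presentations" terms adjust both for dynamic versus static stimuli. The sketch below is a hedged illustration of that assumed parameterization; the exact coding and sign conventions in the paper's model may differ, so the numbers it produces should not be read as the paper's fitted predictions.

```python
from statistics import NormalDist

Phi = NormalDist().cdf  # standard-normal CDF


def p_yes(beta1, beta2, beta3=0.0, beta4=0.0, genuine=0, dynamic=0):
    """Assumed probit SDT model: P(yes) = Phi(-criterion + sensitivity * genuine),
    with beta3/beta4 as the dynamic-presentation adjustments to each term."""
    criterion = beta1 + beta3 * dynamic
    sensitivity = beta2 + beta4 * dynamic
    return Phi(-criterion + sensitivity * genuine)


# Example using the MAP values from the first feel-condition block above
# (beta1=0.26, beta2=0.80, beta3=-1.16, beta4=1.13):
p_genuine_dynamic = p_yes(0.26, 0.80, -1.16, 1.13, genuine=1, dynamic=1)
p_genuine_static = p_yes(0.26, 0.80, genuine=1)
```

Under this parameterization, a positive Beta4 means genuine displays pull "yes" responses up more strongly in the dynamic presentation, which is the pattern the feel-condition estimates show for every emotion.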