
Younger and Older Users' Recognition of Virtual Agent Facial Expressions.

Jenay M Beer1, Cory-Ann Smarr2, Arthur D Fisk2, Wendy A Rogers2.   

Abstract

As technology advances, robots and virtual agents will be introduced into home and healthcare settings to assist individuals, both young and old, with everyday living tasks. Understanding how users recognize an agent's social cues is therefore imperative, especially in social interactions. Facial expression, in particular, is one of the most common non-verbal cues used to display and communicate emotion in on-screen agents (Cassell, Sullivan, Prevost, & Churchill, 2000). Age is important to consider because age-related differences in emotion recognition of human facial expression have been documented (Ruffman et al., 2008), with older adults showing a deficit in recognizing negative facial expressions. Previous work has shown that younger adults can effectively recognize facial emotions displayed by agents (Bartneck & Reichenbach, 2005; Courgeon et al., 2009, 2011; Breazeal, 2003); however, little research has compared in depth younger and older adults' ability to label a virtual agent's facial emotions, an important consideration because social agents will be required to interact with users of varying ages. If such age-related differences exist for recognition of virtual agent facial expressions, we aim to understand whether those differences are influenced by the intensity of the emotion, the dynamic formation of emotion (i.e., a neutral expression developing into an emotional expression through motion), or the type of virtual character, which varies in human-likeness. Study 1 investigated the relationship between age-related differences, the implication of dynamic formation of emotion, and the role of emotion intensity in recognition of the facial expressions of a virtual agent (iCat). Study 2 examined age-related differences in recognition of emotions expressed by three types of virtual characters differing in human-likeness (non-humanoid iCat, synthetic human, and human). 
Study 2 also investigated the role of configural and featural processing as a possible explanation for age-related differences in emotion recognition. First, our findings show age-related differences in the recognition of emotions expressed by a virtual agent, with older adults showing lower recognition for the emotions of anger, disgust, fear, happiness, sadness, and neutral. These age-related differences might be explained by older adults having difficulty discriminating similarity in the configural arrangement of facial features for certain emotions; for example, older adults often mislabeled the similar emotion of fear as surprise. Second, our results did not provide evidence that dynamic formation improves emotion recognition, but, in general, greater emotion intensity improved recognition. Lastly, we found that emotion recognition, for both older and younger adults, differed by character type, from best to worst: human, synthetic human, and then iCat. Our findings provide guidance for design, as well as for the development of a framework of age-related differences in emotion recognition.

Keywords:  Aging; Emotion expression; Emotion recognition; Older adults; Virtual agents; Younger adults

Year:  2015        PMID: 25705105      PMCID: PMC4331019          DOI: 10.1016/j.ijhcs.2014.11.005

Source DB:  PubMed          Journal:  Int J Hum Comput Stud        ISSN: 1071-5819            Impact factor:   3.632


