| Literature DB >> 35162909 |
Abstract
For a service robot to serve travelers at an airport, or for a social robot to live with a human partner at home, it is vital that robots possess the ability to empathize with human partners and express congruent emotions accordingly. We conducted a systematic review of the literature on empathy in interpersonal, virtual-agent, and social-robot research, with inclusion criteria limiting analysis to empirical studies published in a peer-reviewed journal, conference proceedings, or a thesis. Based on the review, we define empathy for human-robot interaction (HRI) as the robot's (observer's) capability and process to recognize the human's (target's) emotional state, thoughts, and situation, and to produce affective or cognitive responses that elicit a positive perception from humans. We reviewed the prominent empathy theories and established a conceptual framework that illuminates critical components to consider when designing an empathic robot, including the empathy process, its outcome, and the observer and target characteristics. This model is complemented by empirical research involving empathic virtual agents and social robots. We suggest critical factors, such as domain dependency, multi-modality, and empathy modulation, to consider when designing, engineering, and researching empathic social robots.
Keywords: affect; emotion; empathy; human–robot interaction; social robot; virtual human
Year: 2022 PMID: 35162909 PMCID: PMC8835506 DOI: 10.3390/ijerph19031889
Source DB: PubMed Journal: Int J Environ Res Public Health ISSN: 1660-4601 Impact factor: 3.390
The number of articles screened, assessed, and included in the review.
| Interpersonal | Human–Agent | Human–Robot | |
|---|---|---|---|
| Abstract Screened | 1116 | 128 | 76 |
| Full-text Assessed | 232 | 27 | 21 |
| Studies Included | 70 | 10 | 12 |
Definitions of empathy.
| Emphasis on | Author(s) | Definition |
|---|---|---|
| Affective | [ | “The vicarious experiencing of an emotion that is congruent with, but not necessarily identical to, the emotion of another individual (p. 146).” |
| | | “One specific set of congruent emotions, those feelings that are more other-focused than self-focused.” |
| | [ | “An affective response that stems from the apprehension or comprehension of another’s emotional state or condition, and which is similar to what the other person is feeling or would be expected to feel (p. 71).” |
| | [ | “Consists of a sort of ‘mimicking’ of one person’s affective state by that of another.” |
| | [ | “An affective response more appropriate to another’s situation than one’s own (p. 4).” |
| | [ | “Feeling what another person feels because something happens to them which does not require understanding another’s internal states (p. 411–412).” |
| Cognitive | [ | “The imaginative transposing of oneself into the thinking, feeling, and acting of another (p. 343).” |
| | [ | “A form of complex psychological inference in which observation, memory, knowledge, and reasoning are combined to yield insights into the thoughts and feelings of others (p. 2).” |
| | [ | “Ability to put yourself in the other person’s position, establish rapport, and anticipate his reaction, feelings, and behaviors (p. 269).” |
| Affective and Cognitive | [ | “The capacity to understand and enter into another person’s feelings and emotions or to experience something from the other person’s point of view (p. 248).” |
| | [ | “A set of constructs having to do with the responses of one individual to the experiences of another. These constructs include the processes taking place within the observer and the affective and non-affective outcome which result from those processes (p. 12).” |
| | [ | “The capacities to resonate with another person’s emotion, understand his/her thoughts and feelings, separate our own thoughts and emotions from those of the observed and responding with the appropriate prosocial and helpful behavior (p. 201).” |
Figure 1. Conceptual model of empathy for HRI.
Studies on empathic virtual agents.
| Author | Purpose | Observer | Target | Relationship | Situation | Results |
|---|---|---|---|---|---|---|
| [ | To increase the level of social engagement | Agent competitor with neutral, self-centered, or empathic condition | Participant | Competitive power relationship—fear and anger | Card game | Participants in the empathic condition felt less lonely and perceived the agent as more caring, attractive, and human-like, but also felt more stressed |
| [ | To improve long-term relationship quality | Exercise advisor with or without empathic relationship-building skills | Exercise client | Advisor–client | Daily conversation on the target’s physical activity for a month | Participants respected, liked, and trusted the empathic agent more and wished to continue interacting with it |
| [ | To understand factors modulating an agent’s empathic behavior | Agent EMMA | Agent MAX | | Three-way conversation among EMMA, MAX, and a participant | Participants liked the agent that empathized with the other agent more |
| [ | For the positive perception of agents | Agent (photographic human face) game player with self-oriented or empathy condition | Participant game player | Co-present gamer (not a competition) | Each plays blackjack with a dealer (split-screen) | Participants liked, trusted, and perceived caring, and felt more supported by the empathic agent |
| [ | To change health-related behaviors (alcohol consumption, exercising, drug use) | 3D personalized on-demand virtual counselor | Participant counselee | Counselor–counselee | Behavioral change in health interventions | Participants accepted and enjoyed the empathic agent more and showed an intention to use the system longer |
| [ | To investigate the effects of parallel and reactive virtual agent responses | Six agents with a | Member of a research team on an island | Inhabitant—researcher | Participants solve a mystery on an island while interacting with agents | A model was induced from positively perceived agent responses in terms of appropriateness and effectiveness |
| [ | To investigate the effects of a dialogue agent with beliefs, uncertainties, and intentions | Expressive 3D talking head | Email user | Assistant–user | Participants converse with an agent to find information in their email | Participants perceived the non-congruent agent more negatively |
| [ | To support job-seekers preparing for an interview by reducing their stress levels | Male companion agent in a suit, invisible to the interviewer agent | Participant | Companion | Job interview | Participants’ stress levels were reduced by the empathic feedback |
| [ | To establish a generic computational model of empathy | Four virtual agents interacting (each can be either an observer or a target) | Four virtual agents | Relationships among agents were varied in | A short narrative consisting of virtual agents interacting (complimenting, criticizing) at a schoolyard | Participants evaluated virtual agent–agent interactions from a video. They perceived virtual agents equipped with an empathy model more positively, especially an agent who carried out a prosocial behavior (comforting) |
Empathic recognition and responses of virtual agent studies.
| Author | Empathy Recognition | Process | Outcome | Empathy Responses |
|---|---|---|---|---|
| [ | Affective states | Cognitive | Affective | Facial expression and nonverbal voice (grunts, moans) |
| [ | Situation | Cognitive | Affective | TTS voice (“I’m sorry to hear that”), synchronized hand gestures, posture, gaze |
| [ | Affective states | Affective | Affective | Facial expression, speech prosody, verbal utterance |
| [ | Situation | Cognitive | Affective | Facial expression |
| [ | Affective states | Affective | Affective | TTS voice, nonverbal (head nod, direction) |
| [ | Affective states | Affective | Affective | One or two sentences of text responses |
| [ | Situation | Cognitive | Affective | Facial expression, text responses |
| [ | Affective states | Cognitive | Cognitive | Text responses (“It seems you did not like this question so much.”) |
| [ | Affective states | Affective | Affective | Facial expression, text responses |
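The recognition/process/outcome/response coding applied in the table above (and in the analogous social-robot coding table later) can be expressed as a small data scheme. The sketch below is a hypothetical illustration of that taxonomy, not code from the review; all names are my own:

```python
from dataclasses import dataclass
from enum import Enum


class Recognition(Enum):
    AFFECTIVE_STATES = "affective states"  # observer reads the target's emotions
    SITUATION = "situation"                # observer appraises the target's context


class Process(Enum):
    AFFECTIVE = "affective"   # low-level resonance/mimicry
    COGNITIVE = "cognitive"   # deliberate perspective-taking/appraisal


class Outcome(Enum):
    AFFECTIVE = "affective"
    COGNITIVE = "cognitive"


@dataclass
class EmpathyCoding:
    """One study's coding: what is recognized, how it is processed,
    what kind of empathic outcome results, and the expressive modalities used."""
    recognition: Recognition
    process: Process
    outcome: Outcome
    responses: list  # e.g. ["facial expression", "verbal responses"]


# Example: a situation-driven, cognitively processed, affectively expressed agent
row = EmpathyCoding(
    recognition=Recognition.SITUATION,
    process=Process.COGNITIVE,
    outcome=Outcome.AFFECTIVE,
    responses=["TTS voice", "synchronized hand gestures", "posture", "gaze"],
)
```

Encoding the rows this way makes it easy to aggregate the tables, for instance to count how often situational recognition co-occurs with cognitive processing.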
Studies on empathic robots.
| Author | Purpose | Observer | Target | Relationship | Situation | Measures and Results |
|---|---|---|---|---|---|---|
| [ | To investigate attitudes toward a robot with accurate or inaccurate empathy | A robot with a synthetic female voice | A male user | Collaborator | A male user and a robot played an online collaborative game. | Participants viewed a video of a robot empathizing with a user. Their trust decreased when the robot’s empathic responses were incongruent with the user. |
| [ | To evaluate the acceptance of mimicked emotion | A robot mimics the target’s voice and produces facial expressions with parallel emotion | Human participant | Not defined | Participants read an emotion-embedded story. | Participants perceived the robot’s mimicking response to be more adequate and human-like than the neutral response. |
| [ | To investigate the effects of robot’s empathic responses when the relationship was varied | A robot reacts to the player’s chess moves empathically to a player and neutrally to the other | Two participants | Relationship between the robot and each participant was varied | Two humans played chess. | Participants perceived the empathic robot as being friendlier than the non-empathic one. |
| [ | To evaluate an empathic model for social robots interacting with children | A robot reacts to the children’s chess move based on the empathic appraisal of children’s affect and the game’s state | Children | Not defined | A child played chess against the robot. | Participants responded positively in social presence, engagement, help, and self-validation when interacting with a robot and remained similar after five weeks. |
| [ | To understand humans’ perception of the robot’s imitation | A robot with full head-gesture mimicking, partial mimicking (nodding), or non-mimicking | Human participant | Not defined | Participants described non-emotional personal statements and a salient personal experience. | Male participants made more gestures than female participants did while interacting with the robot. |
| [ | To understand human’s perception of robot speech | A robot conversed with the participants in three situations (greeting, medicine reminder, guiding the user to use the touch interface) | Human | Not defined | Participants interacted with a Healthbot as a patient. | Participants were able to perceive empathy and emotions in robot speech. They preferred it over the standard robotic voice. |
| [ | To develop a deep learning model for a social robot that mirrors humans | A robot with a display that animates facial expressions | Human | Not defined | Participants conversed with the robot with various facial expressions. | Participants’ interaction data were used to train the model. |
| [ | To evaluate a robot with an empathy model that simulates advanced empathy (i.e., reactive emotions) | A robot (Pepper) embedded with the proposed Autonomous Cognitive Empathic Model that expresses parallel and reactive emotions | Human | Not defined | Participants watched emotion-eliciting videos on the robot’s tablet and interacted with the robot. | Participants’ responses were better to a robot embedded with the proposed model than the baseline model in terms of social and friendship constructs. |
| [ | To evaluate a robot with a deep hybrid neural model for multimodal affect recognition | A robot embedded with a model that simulates intrinsic emotions (i.e., mood) | Human | Not defined | Participants told a story portraying different emotional contexts to the robot. | Independent annotators rated the robot higher in performance (i.e., the accuracy of empathic emotion) than the participants did. |
| [ | To evaluate a robot with an empathy model that draws the participant’s attention when inattentive | A robot (Pepper) embedded with an attention-based empathic module to hold the participant’s attention | Human participant | Not defined | Participants responded to a quiz on the robot’s tablet. | Participants perceived the empathic robot as more engaging and empathic, and spent more time with it than with the non-empathic robot. |
Empathic recognition and responses in social robot studies.
| Author | Empathy Recognition | Process | Outcome | Empathy Responses |
|---|---|---|---|---|
| [ | Situation | Cognitive | Affective | Facial expression, verbal responses |
| [ | Affective states | Affective | Affective | Facial expression |
| [ | Situation | Cognitive | Affective | Facial expression, verbal responses |
| [ | Affective states | Cognitive | Affective | Facial expression, verbal responses |
| [ | Affective states | Affective | Affective | Facial expression |
| [ | Situation | Cognitive | Affective | Verbal responses |
| [ | Affective states | Affective | Affective | Facial expression |
| [ | Affective states | Affective | Affective | Facial expression, verbal responses, gestures |
| [ | Affective states | Affective | Affective | Facial expression |
| [ | Situation | Cognitive | Affective | Facial expression, verbal responses, gestures |
Figure 2. Empathic capability as a function of the complexity of empathic processes.