| Literature DB >> 28966875 |
Tian Linger Xu, Hui Zhang, Chen Yu.
Abstract
We focus on a fundamental looking behavior in human-robot interactions - gazing at each other's face. Eye contact and mutual gaze between two social partners are critical to smooth human-human interactions. Investigating at what moments and in what ways a robot should look at a human user's face in response to the human's gaze behavior is therefore an important topic. Toward this goal, we developed a gaze-contingent human-robot interaction system, which relied on momentary gaze behaviors from a human user to control an interacting robot in real time. Using this system, we conducted an experiment in which human participants interacted with the robot in a joint attention task. In the experiment, we systematically manipulated the robot's gaze toward the human partner's face in real time and then analyzed the human's gaze behavior as a response to the robot's gaze behavior. We found that more face looks from the robot led to more look-backs (to the robot's face) from human participants and consequently created more mutual gaze and eye contact between the two. Moreover, participants demonstrated more coordinated and synchronized multimodal behaviors between speech and gaze when more eye contact was successfully established and maintained.
Keywords: Gaze-Based Interaction; Human-Robot Interaction; Multimodal Interface
Year: 2016 PMID: 28966875 PMCID: PMC5618804 DOI: 10.1145/2882970
Source DB: PubMed Journal: ACM Trans Interact Intell Syst ISSN: 2160-6455