
Investigating joint attention mechanisms through spoken human-robot interaction.

Maria Staudte, Matthew W. Crocker

Abstract

Referential gaze during situated language production and comprehension is tightly coupled with the unfolding speech stream (Griffin, 2001; Meyer, Sleiderink, & Levelt, 1998; Tanenhaus, Spivey-Knowlton, Eberhard, & Sedivy, 1995). In a shared environment, utterance comprehension may further be facilitated when the listener can exploit the speaker's focus of (visual) attention to anticipate, ground, and disambiguate spoken references. To investigate the dynamics of such gaze-following and its influence on utterance comprehension in a controlled manner, we use a human-robot interaction setting. Specifically, we hypothesize that referential gaze is interpreted as a cue to the speaker's referential intentions, which facilitates or disrupts reference resolution. Moreover, the use of a dynamic and yet extremely controlled gaze cue enables us to shed light on the simultaneous and incremental integration of the unfolding speech and gaze movement. We report evidence from two eye-tracking experiments in which participants saw videos of a robot looking at and describing objects in a scene. The results reveal a quantified benefit-disruption spectrum of gaze on utterance comprehension and, further, show that gaze is used, even during the initial movement phase, to restrict the spatial domain of potential referents. These findings more broadly suggest that people treat artificial agents similarly to human agents and, thus, validate such a setting for further explorations of joint attention mechanisms.
Copyright © 2011 Elsevier B.V. All rights reserved.


Year:  2011        PMID: 21665198     DOI: 10.1016/j.cognition.2011.05.005

Source DB:  PubMed          Journal:  Cognition        ISSN: 0010-0277


Related articles: 14 in total

1.  Modeling Intensive Polytomous Time-Series Eye-Tracking Data: A Dynamic Tree-Based Item Response Model.

Authors:  Sun-Joo Cho; Sarah Brown-Schmidt; Paul De Boeck; Jianhong Shen
Journal:  Psychometrika       Date:  2020-02-21       Impact factor: 2.500

2.  Perception is Only Real When Shared: A Mathematical Model for Collaborative Shared Perception in Human-Robot Interaction.

Authors:  Marco Matarese; Francesco Rea; Alessandra Sciutti
Journal:  Front Robot AI       Date:  2022-06-15

3.  See You See Me: the Role of Eye Contact in Multimodal Human-Robot Interaction.

Authors:  Tian Linger Xu; Hui Zhang; Chen Yu
Journal:  ACM Trans Interact Intell Syst       Date:  2016-05

4.  Seeing minds in others: Mind perception modulates low-level social-cognitive performance and relates to ventromedial prefrontal structures.

Authors:  Eva Wiese; George A Buzzell; Abdulaziz Abubshait; Paul J Beatty
Journal:  Cogn Affect Behav Neurosci       Date:  2018-10       Impact factor: 3.282

5.  Can Speaker Gaze Modulate Syntactic Structuring and Thematic Role Assignment during Spoken Sentence Comprehension?

Authors:  Pia Knoeferle; Helene Kreysa
Journal:  Front Psychol       Date:  2012-12-05

6.  Perceiving where another person is looking: the integration of head and body information in estimating another person's gaze.

Authors:  Pieter Moors; Filip Germeys; Iwona Pomianowska; Karl Verfaillie
Journal:  Front Psychol       Date:  2015-06-30

7.  Eyes on the mind: investigating the influence of gaze dynamics on the perception of others in real-time social interaction.

Authors:  Ulrich J Pfeiffer; Leonhard Schilbach; Mathis Jording; Bert Timmermans; Gary Bente; Kai Vogeley
Journal:  Front Psychol       Date:  2012-12-03

8.  Spatial and temporal attention modulate the early stages of face processing: behavioural evidence from a reaching paradigm.

Authors:  Genevieve L Quek; Matthew Finkbeiner
Journal:  PLoS One       Date:  2013-02-28       Impact factor: 3.240

9.  Theory of mind: mechanisms, methods, and new directions.

Authors:  Lindsey J Byom; Bilge Mutlu
Journal:  Front Hum Neurosci       Date:  2013-08-08       Impact factor: 3.169

10.  Robot initiative in a team learning task increases the rhythm of interaction but not the perceived engagement.

Authors:  Serena Ivaldi; Salvatore M Anzalone; Woody Rousseau; Olivier Sigaud; Mohamed Chetouani
Journal:  Front Neurorobot       Date:  2014-02-17       Impact factor: 2.650

