
Interacting With Robots to Investigate the Bases of Social Interaction.

Alessandra Sciutti, Giulio Sandini.   

Abstract

Humans show a great natural ability to interact with each other. Such efficiency in joint actions depends on a synergy between planned collaboration and emergent coordination, a subconscious mechanism based on a tight link between action execution and perception. This link supports phenomena such as mutual adaptation, synchronization, and anticipation, which drastically reduce delays in the interaction and the need for complex verbal instructions, and result in the establishment of joint intentions, the backbone of social interaction. From a neurophysiological perspective, this is possible because the same neural system supporting action execution is responsible for understanding and anticipating the observed actions of others. Defining which human motion features allow for such emergent coordination with another agent would be crucial to establishing more natural and efficient interaction paradigms with artificial devices, ranging from assistive and rehabilitative technology to companion robots. However, investigating the behavioral and neural mechanisms supporting natural interaction poses substantial problems. In particular, the unconscious processes at the basis of emergent coordination (e.g., unintentional movements or gazing) are very difficult, if not impossible, to restrain or control in a quantitative way for a human agent. Moreover, during an interaction, participants influence each other continuously in complex ways, resulting in behaviors that go beyond experimental control. In this paper, we propose robotics technology as a potential solution to this methodological problem. Robots can indeed establish an interaction with a human partner, contingently reacting to their actions without losing the controllability of the experiment or the naturalness of the interactive scenario. A robot could represent an "interactive probe" for assessing the sensory and motor mechanisms underlying human-human interaction. We discuss this proposal with examples from our research with the humanoid robot iCub, showing how an interactive humanoid robot could be a key tool in the investigation of the psychological and neuroscientific bases of social interaction.


Year:  2017        PMID: 29035218     DOI: 10.1109/TNSRE.2017.2753879

Source DB:  PubMed          Journal:  IEEE Trans Neural Syst Rehabil Eng        ISSN: 1534-4320            Impact factor:   3.802


Related articles: 5 in total

1.  Resonance as a Design Strategy for AI and Social Robots.

Authors:  James Derek Lomas; Albert Lin; Suzanne Dikker; Deborah Forster; Maria Luce Lupetti; Gijs Huisman; Julika Habekost; Caiseal Beardow; Pankaj Pandey; Nashra Ahmad; Krishna Miyapuram; Tim Mullen; Patrick Cooper; Willem van der Maden; Emily S Cross
Journal:  Front Neurorobot       Date:  2022-04-27       Impact factor: 3.493

2.  The ANEMONE: Theoretical Foundations for UX Evaluation of Action and Intention Recognition in Human-Robot Interaction.

Authors:  Jessica Lindblom; Beatrice Alenljung
Journal:  Sensors (Basel)       Date:  2020-07-31       Impact factor: 3.576

3.  [Review] Social Cognition for Human-Robot Symbiosis-Challenges and Building Blocks.

Authors:  Giulio Sandini; Vishwanathan Mohan; Alessandra Sciutti; Pietro Morasso
Journal:  Front Neurorobot       Date:  2018-07-11       Impact factor: 2.650

4.  Human- or object-like? Cognitive anthropomorphism of humanoid robots.

Authors:  Alessandra Sacino; Francesca Cocchella; Giulia De Vita; Fabrizio Bracco; Francesco Rea; Alessandra Sciutti; Luca Andrighetto
Journal:  PLoS One       Date:  2022-07-26       Impact factor: 3.752

5.  Humans adjust their grip force when passing an object according to the observed speed of the partner's reaching out movement.

Authors:  Marco Controzzi; Harmeet Singh; Francesca Cini; Torquato Cecchini; Alan Wing; Christian Cipriani
Journal:  Exp Brain Res       Date:  2018-09-27       Impact factor: 1.972

