
Imitating by Generating: Deep Generative Models for Imitation of Interactive Tasks.

Judith Bütepage1, Ali Ghadirzadeh1,2, Özge Öztimur Karadaǧ1,3, Mårten Björkman1, Danica Kragic1.   

Abstract

Coordinating actions with an interaction partner requires a constant exchange of sensorimotor signals. Humans acquire these skills in infancy and early childhood mostly through imitation learning and active engagement with a skilled partner. These skills require the ability to predict and adapt to one's partner during an interaction. In this work, we explore these ideas in a human-robot interaction setting in which a robot must learn interactive tasks from a combination of observational and kinesthetic learning. To this end, we propose a deep learning framework consisting of components for (1) human and robot motion embedding, (2) motion prediction of the human partner, and (3) generation of robot joint trajectories matching the human motion. As long-term motion prediction methods often suffer from regression to the mean, our technical contribution is a novel probabilistic latent variable model that predicts not in joint space but in latent space. To test the proposed method, we collect human-human and human-robot interaction data for four interactive tasks: "hand-shake," "hand-wave," "parachute fist-bump," and "rocket fist-bump." We demonstrate experimentally the importance of predictive and adaptive components, as well as low-level abstractions, for successfully learning to imitate human behavior in interactive social tasks.
Copyright © 2020 Bütepage, Ghadirzadeh, Öztimur Karadaǧ, Björkman and Kragic.
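The abstract's core idea — embed motion, predict the partner's next state in latent space rather than joint space, then decode matching robot joints — can be illustrated with a minimal sketch. This is not the paper's model: the linear maps, dimensions, and function names below are illustrative stand-ins for the trained encoder, latent dynamics, and decoder networks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions, not taken from the paper.
POSE_DIM, LATENT_DIM, JOINT_DIM = 12, 4, 7

# Toy linear maps standing in for trained neural networks.
W_enc = rng.normal(size=(LATENT_DIM, POSE_DIM)) * 0.1
W_dyn = rng.normal(size=(LATENT_DIM, LATENT_DIM)) * 0.1
W_dec = rng.normal(size=(JOINT_DIM, LATENT_DIM)) * 0.1

def encode(pose):
    """Embed an observed human pose into a low-dimensional latent space."""
    return W_enc @ pose

def predict_latent(z):
    """Predict the partner's next latent state; predicting here instead of
    directly in joint space is the abstract's remedy for long-horizon
    predictions collapsing toward the mean pose."""
    return np.tanh(W_dyn @ z)

def decode_robot(z):
    """Generate robot joint angles matching the predicted human motion."""
    return W_dec @ z

human_pose = rng.normal(size=POSE_DIM)
z_t = encode(human_pose)
z_next = predict_latent(z_t)
robot_joints = decode_robot(z_next)
print(robot_joints.shape)  # (7,)
```

In the paper's probabilistic formulation the encoder and dynamics would output distributions (as in a variational autoencoder) rather than point estimates; the deterministic maps here only show the data flow.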


Keywords:  deep learning; generative models; human-robot interaction; imitation learning; sensorimotor coordination; variational autoencoders

Year:  2020        PMID: 33501215      PMCID: PMC7806025          DOI: 10.3389/frobt.2020.00047

Source DB:  PubMed          Journal:  Front Robot AI        ISSN: 2296-9144


  8 in total

1.  Joint action: bodies and minds moving together.

Authors:  Natalie Sebanz; Harold Bekkering; Günther Knoblich
Journal:  Trends Cogn Sci       Date:  2006-01-10       Impact factor: 20.229

2.  Correspondence mapping induced state and action metrics for robotic imitation.

Authors:  Aris Alissandrakis; Chrystopher L Nehaniv; Kerstin Dautenhahn
Journal:  IEEE Trans Syst Man Cybern B Cybern       Date:  2007-04

3.  Advances in Variational Inference.

Authors:  Cheng Zhang; Judith Bütepage; Hedvig Kjellström; Stephan Mandt
Journal:  IEEE Trans Pattern Anal Mach Intell       Date:  2018-12-25       Impact factor: 6.226

4.  [Review] Socially intelligent robots: dimensions of human-robot interaction.

Authors:  Kerstin Dautenhahn
Journal:  Philos Trans R Soc Lond B Biol Sci       Date:  2007-04-29       Impact factor: 6.237

5.  Anticipating Human Activities Using Object Affordances for Reactive Robotic Response.

Authors:  Hema S Koppula; Ashutosh Saxena
Journal:  IEEE Trans Pattern Anal Mach Intell       Date:  2016-01       Impact factor: 6.226

6.  Guided participation in cultural activity by toddlers and caregivers.

Authors:  B Rogoff; J Mistry; A Göncü; C Mosier
Journal:  Monogr Soc Res Child Dev       Date:  1993

7.  Early Developments in Joint Action.

Authors:  Celia A Brownell
Journal:  Rev Philos Psychol       Date:  2011-06

8.  [Review] Joint Action: Mental Representations, Shared Information and General Mechanisms for Coordinating with Others.

Authors:  Cordula Vesper; Ekaterina Abramova; Judith Bütepage; Francesca Ciardo; Benjamin Crossey; Alfred Effenberg; Dayana Hristova; April Karlinsky; Luke McEllin; Sari R R Nijssen; Laura Schmitz; Basil Wahn
Journal:  Front Psychol       Date:  2017-01-04
