
A Framework for Sensorimotor Cross-Perception and Cross-Behavior Knowledge Transfer for Object Categorization.

Gyan Tatiya1, Ramtin Hosseini1, Michael C Hughes1, Jivko Sinapov1.   

Abstract

From an early age, humans learn to develop an intuition for the physical nature of the objects around them by using exploratory behaviors. Such exploration provides observations of how objects feel, sound, look, and move as a result of actions applied to them. Previous works in robotics have shown that robots can also use such behaviors (e.g., lifting, pressing, shaking) to infer object properties that camera input alone cannot detect. Such learned representations are specific to each individual robot and cannot currently be transferred directly to another robot with different sensors and actions. Moreover, sensor failure can cause a robot to lose a specific sensory modality, which may prevent it from using perceptual models that require that modality as input. To address these limitations, we propose a framework for knowledge transfer across behaviors and sensory modalities such that: (1) knowledge can be transferred from one or more robots to another, and (2) knowledge can be transferred from one or more sensory modalities to another. We propose two different models for transfer based on variational auto-encoders and encoder-decoder networks. The main hypothesis behind our approach is that if two or more robots share multi-sensory observations of a shared set of objects, then those observations can be used to establish mappings between multiple feature spaces, each corresponding to a combination of an exploratory behavior and a sensory modality. We evaluate our approach on a category recognition task using a dataset in which a robot used 9 behaviors, coupled with 4 sensory modalities, performed multiple times on 100 objects. The results indicate that sensorimotor knowledge about objects can be transferred both across behaviors and across sensory modalities, such that a new robot (or the same robot, but with a different set of sensors) can bootstrap its category recognition models without having to exhaustively explore the full set of objects.
Copyright © 2020 Tatiya, Hosseini, Hughes and Sinapov.
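The core hypothesis above is that shared multi-sensory observations of a common object set can anchor a learned mapping between two feature spaces (e.g., one robot's audio features from shaking and another's haptic features from pressing). A minimal illustrative sketch, using a linear least-squares map in place of the paper's variational auto-encoder and encoder-decoder models (all dimensions, names, and data here are hypothetical, not from the paper's dataset):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: two robots observe the same 40 "shared" objects,
# each in its own behavior-modality feature space. Dimensions are
# illustrative only.
n_shared, d_src, d_tgt = 40, 16, 8
src = rng.normal(size=(n_shared, d_src))           # source robot's features
W_true = rng.normal(size=(d_src, d_tgt))           # unknown true relation
tgt = src @ W_true + 0.01 * rng.normal(size=(n_shared, d_tgt))  # target features

# Learn a mapping between the two feature spaces from the shared objects
# (least squares stands in for the neural transfer models).
W, *_ = np.linalg.lstsq(src, tgt, rcond=None)

# Project a novel object's source-space features into the target space,
# so the target robot can classify it without exploring it directly.
novel_src = rng.normal(size=(1, d_src))
predicted_tgt = novel_src @ W

# With low observation noise, the learned map closely recovers W_true.
rel_err = np.linalg.norm(W - W_true) / np.linalg.norm(W_true)
```

The same recipe, with nonlinear encoder-decoder networks instead of a linear map, is what allows the transferred features to feed an existing category recognition model.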

Keywords:  category learning and recognition; development of representations; grounding of knowledge; haptic and tactile perception; multimodal perception and integration

Year:  2020        PMID: 33501303      PMCID: PMC7805839          DOI: 10.3389/frobt.2020.522141

Source DB:  PubMed          Journal:  Front Robot AI        ISSN: 2296-9144


