
Live human-robot interactive public demonstrations with automatic emotion and personality prediction.

Hatice Gunes, Oya Celiktutan, Evangelos Sariyanidi

Abstract

Communication with humans is a multi-faceted phenomenon in which emotions, personality and non-verbal behaviours, as well as verbal behaviours, play a significant role; human-robot interaction (HRI) technologies should respect this complexity to achieve efficient and seamless communication. In this paper, we describe the design and execution of five public demonstrations with two HRI systems that automatically sensed and analysed human participants' non-verbal behaviour and predicted their facial action units, facial expressions and personality in real time while they interacted with a small humanoid robot. We provide an overview of the challenges faced and the lessons learned from these demonstrations, in order to better inform the science and engineering fields and to help design and build robots with more purposeful interaction capabilities. This article is part of the theme issue 'From social brains to social robots: applying neurocognitive insights to human-robot interaction'.

Keywords:  affect; facial action units; facial expressions; personality; public demonstration; real-time human–robot interaction

Year:  2019        PMID: 30853000      PMCID: PMC6452249          DOI: 10.1098/rstb.2018.0026

Source DB:  PubMed          Journal:  Philos Trans R Soc Lond B Biol Sci        ISSN: 0962-8436            Impact factor:   6.237


References:  5 in total

1.  Facial expressions of emotion are not culturally universal.

Authors:  Rachael E Jack; Oliver G B Garrod; Hui Yu; Roberto Caldara; Philippe G Schyns
Journal:  Proc Natl Acad Sci U S A       Date:  2012-04-16       Impact factor: 11.205

2.  Automatic Analysis of Facial Affect: A Survey of Registration, Representation, and Recognition.

Authors:  Evangelos Sariyanidi; Hatice Gunes; Andrea Cavallaro
Journal:  IEEE Trans Pattern Anal Mach Intell       Date:  2015-06       Impact factor: 6.226

3.  The resolution of facial expressions of emotion.

Authors:  Shichuan Du; Aleix M Martinez
Journal:  J Vis       Date:  2011-11-30       Impact factor: 2.240

4.  Muecas: a multi-sensor robotic head for affective human robot interaction and imitation.

Authors:  Felipe Cid; Jose Moreno; Pablo Bustos; Pedro Núñez
Journal:  Sensors (Basel)       Date:  2014-04-28       Impact factor: 3.576

5.  The Hawthorne Effect: a randomised, controlled trial.

Authors:  Rob McCarney; James Warner; Steve Iliffe; Robbert van Haselen; Mark Griffin; Peter Fisher
Journal:  BMC Med Res Methodol       Date:  2007-07-03       Impact factor: 4.615

Citing articles:  3 in total

1.  From social brains to social robots: applying neurocognitive insights to human-robot interaction.

Authors:  Emily S Cross; Ruud Hortensius; Agnieszka Wykowska
Journal:  Philos Trans R Soc Lond B Biol Sci       Date:  2019-04-29       Impact factor: 6.237

2.  The Facial Action Coding System for Characterization of Human Affective Response to Consumer Product-Based Stimuli: A Systematic Review.

Authors:  Elizabeth A Clark; J'Nai Kessinger; Susan E Duncan; Martha Ann Bell; Jacob Lahne; Daniel L Gallagher; Sean F O'Keefe
Journal:  Front Psychol       Date:  2020-05-26

3.  Mind Your Manners! A Dataset and a Continual Learning Approach for Assessing Social Appropriateness of Robot Actions.

Authors:  Jonas Tjomsland; Sinan Kalkan; Hatice Gunes
Journal:  Front Robot AI       Date:  2022-03-09
