
One dimensional Turing-like handshake test for motor intelligence.

Amir Karniel, Guy Avraham, Bat-Chen Peles, Shelly Levy-Tzedek, Ilana Nisky.

Abstract

In the Turing test, a computer model is deemed to "think intelligently" if it can generate answers that are not distinguishable from those of a human. However, this test is limited to the linguistic aspects of machine intelligence. A salient function of the brain is the control of movement, and the movement of the human hand is a sophisticated demonstration of this function. Therefore, we propose a Turing-like handshake test for machine motor intelligence. We administer the test through a telerobotic system in which the interrogator holds a robotic stylus and interacts with another party (human or artificial). Instead of asking the interrogator whether the other party is a person or a computer program, we employ a two-alternative forced choice (2AFC) method and ask which of two systems is more human-like. We extract a quantitative grade for each model according to its resemblance to the human handshake motion and name it the "Model Human-Likeness Grade" (MHLG). We present three methods to estimate the MHLG: (i) by calculating the proportion of trials in which subjects judged the model to be more human-like than the human; (ii) by comparing two weighted sums of human and model handshakes, fitting a psychometric curve, and extracting the point of subjective equality (PSE); (iii) by comparing a given model with a weighted sum of a human handshake and a random signal, fitting a psychometric curve to the answers of the interrogator, and extracting the PSE for the weight of the human in the weighted sum. Altogether, we provide a protocol to test computational models of the human handshake. We believe that building a model is a necessary step in understanding any phenomenon and, in this case, in understanding the neural mechanisms responsible for the generation of the human handshake.
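A minimal sketch of the PSE-extraction step described in methods (ii) and (iii): fit a logistic psychometric function to 2AFC answer proportions and read off the point of subjective equality. The data below are synthetic and illustrative only (not taken from the paper), and the function names are assumptions of this sketch.

```python
import numpy as np
from scipy.optimize import curve_fit

def psychometric(w, pse, slope):
    """Logistic psychometric function: probability of judging the
    stimulus 'more human-like' as a function of the human weight w
    in the weighted-sum stimulus."""
    return 1.0 / (1.0 + np.exp(-(w - pse) / slope))

# Synthetic 2AFC data (illustrative): human weights of the stimulus
# and the proportion of 'more human-like' answers at each weight.
weights = np.linspace(0.0, 1.0, 11)
true_pse, true_slope = 0.55, 0.1
rng = np.random.default_rng(0)
props = psychometric(weights, true_pse, true_slope) \
        + rng.normal(0.0, 0.02, weights.size)
props = np.clip(props, 0.0, 1.0)

# Fit the curve; the PSE is the weight at which the stimulus is
# judged human-like on 50% of trials. Under method (iii), this PSE
# serves as the model's MHLG.
(pse, slope), _ = curve_fit(psychometric, weights, props, p0=[0.5, 0.2])
print(f"estimated PSE (MHLG): {pse:.2f}")
```

The seeded noise keeps the example reproducible; with real interrogator data one would also report confidence intervals on the fitted PSE, e.g. by bootstrapping the answers.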


Year:  2010        PMID: 21206462      PMCID: PMC3537195          DOI: 10.3791/2492

Source DB:  PubMed          Journal:  J Vis Exp        ISSN: 1940-087X            Impact factor:   1.355


References:  8 in total

1.  The psychometric function: I. Fitting, sampling, and goodness of fit.

Authors:  F A Wichmann; N J Hill
Journal:  Percept Psychophys       Date:  2001-11

2.  Separate visual pathways for perception and action. [Review]

Authors:  M A Goodale; A D Milner
Journal:  Trends Neurosci       Date:  1992-01       Impact factor: 13.837

3.  Perception and Action in Simulated Telesurgery.

Authors:  Ilana Nisky; Assaf Pressman; Carla M Pugh; Ferdinando A Mussa-Ivaldi; Amir Karniel
Journal:  Haptics (2010)       Date:  2010-07

4.  Kinematic characteristics of reaching movements in preterm children with cerebral palsy.

Authors:  Jolanda C van der Heide; Johanna M Fock; Bert Otten; Elisabeth Stremmelaar; Mijna Hadders-Algra
Journal:  Pediatr Res       Date:  2005-03-17       Impact factor: 3.756

5.  Exploring the handshake in employment interviews.

Authors:  Greg L Stewart; Susan L Dustin; Murray R Barrick; Todd C Darnold
Journal:  J Appl Psychol       Date:  2008-09

6.  Handshaking, gender, personality, and first impressions.

Authors:  W F Chaplin; J B Phillips; J D Brown; N R Clanton; J L Stein
Journal:  J Pers Soc Psychol       Date:  2000-07

7.  Kinematic analysis of unimanual reaching and grasping movements in children with hemiplegic cerebral palsy.

Authors:  Louise Rönnqvist; Birgit Rösblad
Journal:  Clin Biomech (Bristol, Avon)       Date:  2006-10-27       Impact factor: 2.063

8.  Visual control of action but not perception requires analytical processing of object shape.

Authors:  Tzvi Ganel; Melvyn A Goodale
Journal:  Nature       Date:  2003-12-11       Impact factor: 49.962

Cited by:  1 in total

1.  For Motion Assistance Humans Prefer to Rely on a Robot Rather Than on an Unpredictable Human.

Authors:  Ekaterina Ivanova; Gerolamo Carboni; Jonathan Eden; Jörg Krüger; Etienne Burdet
Journal:  IEEE Open J Eng Med Biol       Date:  2020-04-16
