Yuichiro Yoshikawa, Hirokazu Kumazaki, Yoshio Matsumoto, Masutomo Miyao, Mitsuru Kikuchi, Hiroshi Ishiguro.
Abstract
Establishing a treatment method for individuals with autism spectrum disorder (ASD) that not only increases the frequency or duration of their eye contact but also maintains it after the intervention ends, and furthermore generalizes it across communication partners, is a formidable challenge. Android robots, a type of humanoid robot whose appearance closely resembles that of humans, are expected to serve as training partners for face-to-face communication for individuals with ASD and to provide easier experiences that transfer to interactions with humans. To evaluate this possibility, four male adolescents with ASD and six without ASD participated in a pilot experiment consisting of five consecutive sessions of semistructured conversation, in which they alternately faced either a human female or a female-type android robot interlocutor. Although limited by the small sample size, the preliminary analysis of their fixation patterns during the conversations showed positive signs: the subjects tended to look more at the face of the android robot than at that of the human interlocutor, regardless of whether they had ASD. Moreover, the individuals with ASD looked more at the area around the eyes of the android robot than at that of the human, and looked less at that area of the human than the individuals without ASD did. An increasing tendency to look at the area around the human's eyes, which could be a positive sign that experience with an android robot transfers to a human interlocutor, was only weakly observed as the sessions progressed.
Keywords: android robot; autism spectrum disorder; eye contact; eye-gaze tracking; treatment and education
Year: 2019 PMID: 31258488 PMCID: PMC6587013 DOI: 10.3389/fpsyt.2019.00370
Source DB: PubMed Journal: Front Psychiatry ISSN: 1664-0640 Impact factor: 4.157
Figure 1. Experimental setup: human (A) and android (B) rooms. In both rooms, a gaze detection device was placed on a table between the subject and the interlocutor (human or android robot). The computer interface used by the operator to control the android was placed behind the android room. Note that the person labeled as a subject is not a participant in this study, and that written informed consent to publish this figure was obtained from the persons who appear in it.
Figure 2. Example visualization of fixation points during conversation with the human (A) and android (B) interlocutors. The color map indicates where the subject was most likely looking. Written informed consent to publish this figure was obtained from the person who appears in it.
Duration in seconds of each session for each group. Each value is the mean across participants (standard deviations, given in brackets in the original table, were not preserved in this record). Column labels give the session order: H = human interlocutor, A = android interlocutor.
| Group | H1 | A2 | H3 | A4 | H5 |
|---|---|---|---|---|---|
| ASD | 140.2 | 169.9 | 122.4 | 147.1 | 113.6 |
| Non-ASD | 128.0 | 166.8 | 120.7 | 158.3 | 115.1 |
Figure 3. Looking-face ratio. Blue circular and black rectangular points indicate the average among participants in the autism spectrum disorder (ASD) and non-ASD groups, respectively; bars indicate standard deviations.
Figure 4. Looking-eye ratio. Blue circular and black rectangular points indicate the average among participants in the ASD and non-ASD groups, respectively; bars indicate standard deviations.
Figure 5. Transition of the looking-eye ratio across sessions. Blue circular and black rectangular points indicate the average among participants in the ASD and non-ASD groups, respectively; bars indicate standard deviations.
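The looking-face and looking-eye ratios are presumably computed as the fraction of gaze time whose fixation points fall inside an area of interest (AOI) covering the interlocutor's face or eye region. A minimal Python sketch under that assumption is shown below; the rectangular AOI, the `(x, y, duration)` sample format, and the function name `aoi_ratio` are illustrative choices, not the authors' actual pipeline.

```python
# Hypothetical sketch: a "looking-face" (or "looking-eye") ratio as the
# share of total fixation time spent inside a rectangular AOI.
# Sample format and AOI geometry are assumptions for illustration only.

def aoi_ratio(samples, aoi):
    """Fraction of total gaze time spent inside a rectangular AOI.

    samples: list of (x, y, duration_s) gaze fixations
    aoi: (x_min, y_min, x_max, y_max) in the same screen coordinates
    """
    x0, y0, x1, y1 = aoi
    total = sum(d for _, _, d in samples)
    inside = sum(d for x, y, d in samples
                 if x0 <= x <= x1 and y0 <= y <= y1)
    return inside / total if total > 0 else 0.0

# Toy example: three fixations; the face AOI covers mid-screen.
fixations = [(400, 300, 1.2), (50, 50, 0.4), (420, 310, 0.8)]
face_aoi = (300, 200, 500, 400)
print(round(aoi_ratio(fixations, face_aoi), 2))  # 2.0 s of 2.4 s inside
```

The same function would be applied per session with a smaller eye-region AOI to obtain the looking-eye ratio, and the per-group means and standard deviations plotted in Figures 3 to 5 would then be aggregated across participants.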