Sungwook Jung, Sung Hee Ahn, Jiwoong Ha, Sangwoo Bahn.
Abstract
Education using humanoid robots can have a positive impact in many fields, including medical and physical training. This study investigated the effects of a robot's interaction methods (facial expressions, gestures, voice, and their combinations) on information and communications technology (ICT) education for the elderly from functional and emotional perspectives. The robot's interaction methods were divided into four categories: (1) voice, (2) voice and facial expression, (3) voice and gesture, and (4) voice, facial expression, and gesture. An experiment involving an educational application with a humanoid robot was conducted with a total of 15 elderly people over the age of 60. The effect of the humanoid robot's interaction method on education was identified through a subjective survey evaluation and an analysis of practice performance data, including error rate, task success rate, and number of retrainings, capturing both functional and emotional aspects. The results showed that performance and perceived effectiveness were not significantly affected by the type of robot interaction, but the degree to which the robot was perceived to have emotions, the degree to which the robot felt like a human, and the degree to which the robot felt friendly differed significantly according to the interaction type employed by the humanoid robot. The best effect was achieved when voice and gesture were used together during tutoring. Because ICT education using humanoid robots increases interest and participation, such robots are concluded to be a suitable means of delivering ICT education. In addition, when designing robot interactions, using the robot's voice and gestures together is expected to lead to greater anthropomorphism, resulting in a stronger relationship with humanoid robots.
Keywords: education; elderly; human–robot interaction; interaction type; robot
Year: 2022 PMID: 36078697 PMCID: PMC9518593 DOI: 10.3390/ijerph191710988
Source DB: PubMed Journal: Int J Environ Res Public Health ISSN: 1660-4601 Impact factor: 4.614
Figure 1. Experimental environment.
Examples of voice feedback.
| Success | Failure |
|---|---|
| Well done. | Was it a little difficult? It’s okay. |
| It is correct. Will you be able to do well alone next time? | Can’t you remember? It’s okay. |
Examples of gestures and facial expressions of the humanoid robot.
| Category | Details |
|---|---|
| General gestures | (1) Raise and lower both arms diagonally |
| | (2) Raise and lower the left and right arms alternately |
| | (3) Raise and lower one bent arm |
| | (4) Raise and lower one arm |
| General facial expressions | (1) Smile |
| | (2) Wink |
| | (3) Blink |
| | (4) Concentrating |
| | (5) Sad |
| Feedback gestures | (1) Nod and clench a fist (in cases where tasks were successful) |
| | (2) Raise hands above head, shaking the body (in cases where tasks were successful) |
| | (3) Shake head from side to side, placing both hands on its chest (in cases where participants failed the task) |
| Feedback facial expressions | (1) Smile (in cases where tasks were successful) |
| | (2) Sad (in cases where participants failed the task) |
Figure 2. Facial expressions.
Selected functions and sub-functions of the KakaoTalk app.
| Task | Function (Sub-Function) |
|---|---|
| 1 | Creating a chat room (search a specific person/check profile image/create a new chat room) |
| 2 | Sending and saving pictures |
| 3 | Forwarding messages or pictures |
| 4 | Additional features of chat rooms (turn off notifications and invite another person to an existing chat room) |
| 5 | Pinning a specific chat room to the top |
| 6 | Deleting sent messages |
Figure 3. Training materials indicating how to find and share a picture.
Figure 4. Experimental environment.
Figure 5. Experimental procedure.
Descriptive statistics of anthropomorphism and tutoring effect of the humanoid robot (N = 60).
| Category | Item | Mean | SD |
|---|---|---|---|
| Anthropomorphism | Q1. Did you feel that the robot had emotions? | 5.25 | 1.772 |
| | Q2. Did the robot feel like a human? | 5.35 | 1.614 |
| | Q3. Did you feel familiar with the robot? | 5.80 | 1.338 |
| Satisfaction | Q4. Was the tutoring interesting? | 6.07 | 1.219 |
| | Q5. Were you generally satisfied with the tutoring? | 6.60 | 0.694 |
| Perceived effectiveness | Q6. Could you understand the content the robot provided well? | 6.33 | 0.857 |
| | Q7. Could you focus on the tutoring? | 6.20 | 1.038 |
| | Q8. Do you think that you can learn through the robot? | 6.55 | 0.675 |
| | Q9. Do you think robots can educate? | 6.38 | 0.922 |
Figure 6. Descriptive statistics of anthropomorphism and tutoring effect of the humanoid robot (N = 60).
The results of the Kruskal–Wallis test on the subjective evaluation according to interaction type.
| Category | Item | Kruskal–Wallis H | p-Value |
|---|---|---|---|
| Anthropomorphism | Did you feel that the robot had emotions? | 8.921 | 0.030 * |
| | Did the robot feel like a human? | 11.38 | 0.010 * |
| | Did you feel familiar with the robot? | 8.368 | 0.039 * |
| Satisfaction | Was the tutoring interesting? | 7.620 | 0.055 |
| | Were you generally satisfied with the tutoring? | 6.511 | 0.089 |
| Perceived effectiveness | Could you understand the content the robot provided well? | 0.902 | 0.825 |
| | Could you focus on the tutoring? | 1.833 | 0.608 |
| | Do you think that you can learn through the robot? | 2.129 | 0.546 |
| | Do you think robots can educate? | 3.886 | 0.274 |
* p-value ≤ 0.05.
Figure 7. The degree to which the robot was perceived to have emotions by interaction type.
Figure 8. The degree to which the robot felt like a human by interaction type.
Figure 9. The degree to which the participants felt familiar with the robot by interaction type.
Kruskal–Wallis test on the number of touch errors and retrainings by interaction type.
| Measurement | Interaction Method | N | Mean Rank | df | Kruskal–Wallis H | p-Value |
|---|---|---|---|---|---|---|
| Touch errors | Voice | 14 | 28.00 | 3 | 0.328 | 0.955 |
| | Voice + Facial expression | 14 | 30.54 | | | |
| | Voice + Gesture | 14 | 27.64 | | | |
| | Voice + Facial expression + Gesture | 14 | 27.82 | | | |
| Retraining | Voice | 14 | 27.07 | 3 | 0.709 | 0.871 |
| | Voice + Facial expression | 14 | 30.86 | | | |
| | Voice + Gesture | 14 | 28.79 | | | |
| | Voice + Facial expression + Gesture | 14 | 27.29 | | | |
Results of the cross-tabulation analysis on success rate according to interaction type.
| Interaction Type | Task Performance: Success | Task Performance: Failure | χ² | p-Value |
|---|---|---|---|---|
| Voice | 10 (71.4%) | 4 (28.6%) | 0.876 | 0.831 |
| Voice + Facial expression | 8 (57.1%) | 6 (42.9%) | | |
| Voice + Gesture | 9 (64.3%) | 5 (35.7%) | | |
| Voice + Facial expression + Gesture | 10 (71.4%) | 4 (28.6%) | | |
Descriptive statistics of the items on bridging the digital divide.
| Item | N | Mean | SD |
|---|---|---|---|
| Do you think training by a robot could be helpful for other IT devices? | 14 | 6.3 | 2.11 |
| Was the training by a robot burdensome or inconvenient? | 14 | 5.7 | 2.37 |
| Do you think that training by a robot can help to build social relationships? | 14 | 6.3 | 1.27 |