Laura Fiorini, Luigi Coviello, Alessandra Sorrentino, Daniele Sancarlo, Filomena Ciccone, Grazia D'Onofrio, Gianmaria Mancioppi, Erika Rovini, Filippo Cavallo.
Abstract
Socially Assistive Robots (SARs) are designed to support people in their daily lives as companions and assistants, but also to support the work of caregivers. SARs should show personalized, human-like behavior to improve their acceptance and, consequently, their use. Additionally, they should be trusted by caregivers and professionals so that they can support their work (e.g., as objective assessment and decision support tools). In this context, the aim of this paper is twofold. First, it presents and discusses a robot behavioral model based on sensing, perception, decision support, and interaction modules. The novel idea behind the proposed model is to extract and use the same multimodal feature set for two purposes: (i) to profile the user, so that the caregiver can use it as a decision support tool for the assessment and monitoring of the patient; and (ii) to fine-tune the human-robot interaction, where the features can be correlated to social cues. Second, this paper tests the proposed model in a real environment using a SAR, namely ASTRO. In particular, ASTRO measures body posture, the gait cycle, and handgrip strength during the walking support task. The collected data were analyzed to assess the clinical profile and to fine-tune the physical interaction. Ten older people (65.2 ± 15.6 years) were enrolled in this study and asked to walk with ASTRO at their normal speed for 10 m. The obtained results show a good estimation (p < 0.05) of gait parameters, handgrip strength, and angular excursion of the torso with respect to the most commonly used instruments. Additionally, the sensory outputs were combined in the perceptual model to profile the user using two non-classical, unsupervised dimensionality reduction techniques, namely T-distributed Stochastic Neighbor Embedding (t-SNE) and non-classical multidimensional scaling (nMDS). These methods can group the participants according to their residual walking abilities.
Keywords: Multimodal sensors; Robot behavioral model; Social assistive robot; User profiling
Year: 2022 PMID: 35846164 PMCID: PMC9266091 DOI: 10.1007/s12369-022-00901-1
Source DB: PubMed Journal: Int J Soc Robot ISSN: 1875-4791 Impact factor: 3.802
Fig. 1 The proposed robot model includes (i) the sensing module (purple box), which collects multimodal data from the robot; (ii) the perception module (yellow box), which processes and combines the data for the decision-making module (grey box), providing the decision support tool for the caregiver (blue box), and for the interaction module (green box)
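A minimal sketch of how the four modules in Fig. 1 could be wired together; the class and method names below are hypothetical, not taken from the paper.

```python
# Hypothetical skeleton of the four-module behavioral model in Fig. 1;
# all names are illustrative.

class Sensing:
    """Purple box: collects multimodal data from the robot's sensors."""
    def read(self) -> dict: ...

class Perception:
    """Yellow box: extracts and combines features from the raw data."""
    def features(self, raw: dict) -> dict: ...

class DecisionSupport:
    """Grey box: turns the features into a clinical profile for the caregiver."""
    def profile(self, feats: dict) -> dict: ...

class Interaction:
    """Green box: uses the same features to fine-tune the (physical) HRI."""
    def adapt(self, feats: dict) -> None: ...

def step(s: Sensing, p: Perception, d: DecisionSupport, i: Interaction):
    feats = p.features(s.read())  # one feature set ...
    clinical = d.profile(feats)   # ... feeds the decision support tool
    i.adapt(feats)                # ... and the interaction module
    return clinical
```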
The same features could be used to estimate the HRI and to assess the clinical status of the user
| Cues | Features | HRI relationship | Clinical assessment relationship |
|---|---|---|---|
| Body posture & movement | Body orientation | Engagement | Depression |
| | Interpersonal distance | Engagement | Age |
| | Arm movement | Engagement | Hints regarding the apathy-agitation axis |
| | Gestures | Engagement & Emotion | Apathy |
| | Gait parameters | Engagement & Emotion | Cognitive decline |
| Emotion | Facial expression | Emotion | Apathy |
| | Heart rate fluctuation | Emotion | Stress |
| Head orientation | Eye gazing | Engagement | Attentional fluctuation |
| Muscle strength | Handgrip strength | Physical HRI | Frailty |
| Voice quality | Tempo, energy, pitch | Emotion | Neurodegenerative diseases |
| Verbal message | Repetitions, incomplete words, amount of silence | Engagement & Emotion | Hints on mental flexibility and planning (repetitions), signs related to lexical problems (incomplete words), and hesitations (silence) |
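As a toy illustration of the dual-use idea in the table, the same feature key can be looked up by both the interaction side and the assessment side. The mapping below paraphrases a few rows of the table; the keys and values are illustrative, not an API from the paper.

```python
# Hypothetical lookup distilled from a few rows of the table above:
# one feature, two downstream interpretations.
FEATURE_MAP = {
    "gait_parameters":   {"hri": "engagement & emotion", "clinical": "cognitive decline"},
    "handgrip_strength": {"hri": "physical HRI",         "clinical": "frailty"},
    "facial_expression": {"hri": "emotion",              "clinical": "apathy"},
    "eye_gazing":        {"hri": "engagement",           "clinical": "attentional fluctuation"},
}

def consumers(feature: str) -> tuple[str, str]:
    """Return the (HRI, clinical) constructs a feature contributes to."""
    entry = FEATURE_MAP[feature]
    return entry["hri"], entry["clinical"]

print(consumers("handgrip_strength"))  # ('physical HRI', 'frailty')
```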
Fig. 2 a A subject during the 10 m walk with the ASTRO robot. The yellow label indicates the sensors embedded on the robot, whereas the green label and the green arrows indicate the wearable IMU sensors. b The OpenPose framework; the joints used are indicated, as well as the parameters extracted from the camera
Description, mean value, and standard deviation (SD) of the gait features computed from the laser and IMU data
| Acronym | Features | Data extracted from IMU | Data extracted from laser | p | R |
|---|---|---|---|---|---|
| GTR | Gait Time of right foot | 46.02 (9.52) s | 44.22 (9.76) s | 0.125 | 0.589a |
| GSTRDR | Number of Strides of right foot | 19.90 (4.89) | 19.6 (5.36) | 0.960 | 0.908a |
| GSTRDLR | Stride Length of right foot | 0.53 (0.14) m | 0.39 (0.06) m | 0.843 | 0.758a |
| GSTRDTR | Stride Time of right foot | 2.32 (0.57) s | 2.22 (0.67) s | 0.827 | 0.654a |
| GSTRDTR_SD | SD of Stride Time of right foot | 0.53 (0.43) s | 0.41 (0.25) s | 0.842 | 0.558a |
| GSWTR | Swing Time of right foot | 0.44 (0.060) s | 0.79 (0.14) s | 0.335 | 0.210 |
| GSWTR_SD | SD of Swing Time of right foot | 0.086 (0.054) s | 0.20 (0.07) s | 0.839 | 0.641a |
| GSTTR | Stance Time of right foot | 1.89 (0.54) s | 1.40 (0.52) s | 0.333 | 0.664a |
| GSTTR_SD | SD of Stance Time of right foot | 0.50 (0.41) s | 0.35 (0.25) s | 0.782 | 0.419a |
| GTL | Gait Time of left foot | 44.62 (10.18) s | 44.22 (9.76) s | 0.661 | 0.601a |
| GSTRDL | Number of Strides of left foot | 19.50 (5.28) | 19.8 (5.51) | 0.951 | 0.912a |
| GSTRDLL | Stride Length of left foot | 0.56 (0.19) m | 0.35 (0.06) m | 0.707 | 0.560a |
| GSTRDTL | Stride Time of left foot | 2.33 (0.61) s | 2.23 (0.69) s | 0.796 | 0.777a |
| GSTRDTL_SD | SD of Stride Time of left foot | 0.48 (0.40) s | 0.35 (0.16) s | 0.869 | 0.757a |
| GSWTL | Swing Time of left foot | 0.42 (0.079) s | 0.76 (0.18) s | 0.591 | 0.508a |
| GSWTL_SD | SD of Swing Time of left foot | 0.081 (0.052) s | 0.21 (0.10) s | 0.238 | 0.137 |
| GSTTL | Stance Time of left foot | 1.92 (0.55) s | 1.46 (0.56) s | 0.879 | 0.735a |
| GSTTL_SD | SD of Stance Time of left foot | 0.45 (0.37) s | 0.31 (0.14) s | 0.842 | 0.689a |
The results of the Mann-Whitney test (p) and the coefficient of the linear regression (R) are also reported
aThe linear regressions are significant (p < 0.05)
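A minimal sketch of the per-feature agreement analysis summarized in the table, assuming scipy's Mann-Whitney U test for p and Pearson's r as the regression coefficient R; the synthetic stride times below merely stand in for the ten participants' values.

```python
import numpy as np
from scipy import stats

def compare_feature(imu: np.ndarray, laser: np.ndarray):
    """Agreement between one gait feature estimated from IMU and laser data."""
    _, p = stats.mannwhitneyu(imu, laser)  # difference between the two estimates
    r, p_reg = stats.pearsonr(imu, laser)  # linear association (R)
    return p, r, p_reg

# Synthetic right-foot stride times (s) for ten participants
rng = np.random.default_rng(0)
imu = rng.normal(2.32, 0.57, 10)
laser = imu + rng.normal(-0.10, 0.10, 10)
p, r, p_reg = compare_feature(imu, laser)
print(f"p = {p:.3f}, R = {r:.3f} (regression p = {p_reg:.3f})")
```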
Fig. 3 Laser data acquired during the 10 m walking test. The highlighted points mark the Toe-Off (TO) and Heel-Strike (HS) instances, and the pink lines indicate the stride lengths. The GSTRDT, GSWT, and GSTT are also reported
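This record does not spell out how the TO and HS instances are detected from the laser scans; a plausible sketch, assuming a 1-D foot-position trace and a simple velocity threshold separating swing from stance, is:

```python
import numpy as np

def gait_events(foot_pos: np.ndarray, fs: float, v_thr: float = 0.1):
    """Label Toe-Off (TO) and Heel-Strike (HS) from a 1-D foot-position trace.

    foot_pos: foot position along the walking axis (m), sampled at fs (Hz).
    TO is taken as the onset of a high-velocity (swing) burst, HS as its end;
    v_thr (m/s) is an assumed swing/stance threshold.
    """
    vel = np.gradient(foot_pos) * fs            # forward velocity (m/s)
    swinging = np.abs(vel) > v_thr
    edges = np.diff(swinging.astype(int))
    to_idx = np.where(edges == 1)[0] + 1        # stance -> swing transitions
    hs_idx = np.where(edges == -1)[0] + 1       # swing -> stance transitions
    stride_t = np.diff(to_idx) / fs             # GSTRDT: TO-to-TO intervals (s)
    return to_idx, hs_idx, stride_t
```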
Description, mean value, and standard deviation (SD) of the features related to the selected body joints and the handgrip strength, extracted from the RGBD camera and the force sensors, respectively
| Acronym | Features | Source | Mean (SD) |
|---|---|---|---|
| m | Shoulder Slope | Camera | 0.043 (0.075) |
| m_SD | SD of the Shoulder Slope | Camera | 0.047 (0.016) |
| THETA | Shoulder Inclination | Camera | 2.46 (4.29)° |
| THETA_SD | SD of the Shoulder Inclination | Camera | 2.64 (0.86)° |
| DIS | Nose-Neck Distance | Camera | 144.64 (14.82) pixel |
| DIS_SD | SD of the Nose-Neck Distance | Camera | 17.29 (7.28) pixel |
| NOSE | Nose Displacement | Camera | 2.97 (0.55) pixel |
| NOSE_SD | SD of the Nose Displacement | Camera | 2.38 (0.35) pixel |
| NECK | Neck Displacement | Camera | 2.88 (0.43) pixel |
| NECK_SD | SD of the Neck Displacement | Camera | 2.05 (0.27) pixel |
| Zmin | Minimum Neck Variation along the z-axis | Camera | 575.30 (71.84) mm |
| Zmax | Maximum Neck Variation along the z-axis | Camera | 794.50 (41.96) mm |
| ZRMSE | Root mean square error of the Neck along the z-axis | Camera | 35.57 (11.47) mm |
| THETAI | Average value of the sternum angular excursion | IMU | 1.08 (1.88)° |
| THETAI_SD | SD of the sternum angular excursion | IMU | 1.81 (0.79)° |
| MFR | Maximum Force of Right Hand | Force | 76.52 (39.42) N |
| MFL | Maximum Force of Left Hand | Force | 78.01 (37.06) N |
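How the camera features above might be computed from OpenPose keypoints; the exact definitions of m, THETA, and DIS are not given in this record, so the geometry below is an assumption based on the feature names.

```python
import math

def shoulder_features(l_sh: tuple, r_sh: tuple):
    """m and THETA from the (x, y) pixel coordinates of the two shoulders.

    Assumed definitions: m is the slope of the line joining the shoulder
    keypoints; THETA is its inclination with respect to the horizontal.
    """
    dx = r_sh[0] - l_sh[0]
    dy = r_sh[1] - l_sh[1]
    m = dy / dx                               # shoulder slope (m)
    theta = math.degrees(math.atan2(dy, dx))  # shoulder inclination (THETA), deg
    return m, theta

def nose_neck_distance(nose: tuple, neck: tuple) -> float:
    """DIS: Euclidean pixel distance between the nose and neck keypoints."""
    return math.hypot(nose[0] - neck[0], nose[1] - neck[1])

print(shoulder_features((120, 300), (220, 305)))  # small rightward tilt
```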
Fig. 4 a Data visualization with the t-SNE method; b data visualization with the non-classical multidimensional scaling (nMDS)
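A sketch of the profiling step visualized in Fig. 4, using scikit-learn's TSNE and its non-metric MDS as a stand-in for the paper's nMDS; X is a (participants × features) matrix assembled from the feature tables above, here replaced by placeholder data.

```python
import numpy as np
from sklearn.manifold import TSNE, MDS
from sklearn.preprocessing import StandardScaler

# Placeholder (participants x features) matrix; in the study this would hold
# the gait, posture, and handgrip features of the ten enrolled subjects.
X = np.random.default_rng(1).normal(size=(10, 20))
Xz = StandardScaler().fit_transform(X)  # z-score each feature

emb_tsne = TSNE(n_components=2, perplexity=3).fit_transform(Xz)  # t-SNE
emb_nmds = MDS(n_components=2, metric=False).fit_transform(Xz)   # non-metric MDS

# Participants with similar residual walking abilities should appear close
# together in either 2-D embedding.
```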