Matteo Luperto1, Javier Monroy2, Jennifer Renoux3, Francesca Lunardini4, Nicola Basilico1, Maria Bulgheroni5, Angelo Cangelosi6, Matteo Cesari7, Manuel Cid8, Aladar Ianes9, Javier Gonzalez-Jimenez2, Anastasis Kounoudes10, David Mari11, Victor Prisacariu12, Arso Savanovic13, Simona Ferrante4, N Alberto Borghese1.
Abstract
The integration of Ambient Assisted Living (AAL) frameworks with Socially Assistive Robots (SARs) has proven useful for monitoring and assisting older adults in their own homes. However, the difficulties associated with long-term deployments in complex real-world environments are still largely under-explored. In this work, we first present the MoveCare system, an unobtrusive platform that, through the integration of a SAR into an AAL framework, aims to monitor, assist, and provide social, cognitive, and physical stimulation in the homes of older adults who live alone and are at risk of falling into frailty. We then focus on the evaluation and analysis of a long-term pilot campaign totalling more than 300 weeks of usage. We evaluated the system's acceptability and feasibility through various questionnaires and empirically assessed the impact of the presence of an assistive robot by deploying the system with and without it. Our results provide strong empirical evidence that Socially Assistive Robots integrated with monitoring and stimulation platforms can be successfully used for long-term support of older adults. We describe how the robot's presence significantly incentivised the use of the system, but slightly lowered the system's overall acceptability. Finally, we emphasise that real-world long-term deployment of SARs introduces a significant technical, organisational, and logistical overhead that should not be neglected nor underestimated in the pursuit of long-term robust systems. We hope that the findings and lessons learned from our work can bring value towards future long-term, real-world, and widespread use of SARs.
Keywords: Ambient Assisted Living; IoT network; Monitoring; Socially assistive robots; Virtual communities
Year: 2022 PMID: 35194482 PMCID: PMC8853423 DOI: 10.1007/s12369-021-00843-0
Source DB: PubMed Journal: Int J Soc Robot ISSN: 1875-4791 Impact factor: 5.126
Fig. 1High-level overview of the MoveCare system. The system is composed of three components installed in the user’s home (the smart objects, the environmental sensors, and the service robot) and two components deployed in the cloud (the Virtual Caregiver and the Community-Based Activity Centre (CBAC)). Most of the communication between the components is performed through an MQTT Gateway. In addition, some communication between components in the cloud is performed through RESTful APIs. The users and human caregivers interact with the system through various interfaces
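As a rough illustration of the MQTT-based exchange described in Fig. 1, the sketch below builds a topic and JSON payload for a smart-object reading destined for the MQTT Gateway. The `movecare` topic prefix, field names, and user/component identifiers are illustrative assumptions, not details taken from the paper.

```python
import json

# Hypothetical topic scheme and payload for publishing a smart-object
# reading through the MQTT Gateway; all names are illustrative.
def make_reading(user_id: str, component: str, metric: str, value: float):
    topic = f"movecare/{user_id}/{component}/{metric}"
    payload = json.dumps({"metric": metric, "value": value})
    return topic, payload

topic, payload = make_reading("user42", "scale", "weight_kg", 71.5)
# With an MQTT client such as paho-mqtt, this pair would then be sent
# via client.publish(topic, payload).
```

A topic hierarchy of this shape lets cloud components such as the Virtual Caregiver subscribe with wildcards (e.g., one subscription per user) rather than per-device endpoints.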
Fig. 2The functionalities implemented in the MoveCare platform with the components involved in their realisation
Detailed description of the MoveCare Scenarios
| Scenario | Description | Changes in without-robot |
|---|---|---|
| Call for Help | In case of an emergency, the user can ask for help through the distributed microphones; the message is forwarded to the robot, which tries to locate the user; when the user is found, the robot asks for confirmation of the emergency. If the situation is confirmed (or if there is no answer from the user and a timeout expires), the IoT concentrator calls the user’s caregivers by phone. When a caregiver answers the call and decides to handle the emergency, a link is sent to them to access, upon authentication, a teleoperation session on the robot. From a web interface, the caregivers can teleoperate the robot. The user can refuse or close the teleoperation session by pressing the red button on the robot or by voice commands through the microphones. When the emergency is closed, a report is created. | After the user asks for help, the IoT concentrator immediately calls the caregivers on their phones until one of them accepts the emergency. Once a caregiver does so, the scenario is closed. |
| Body Weight Measurement | Users are expected to weigh themselves regularly. After a few days in which no data are collected, the robot makes an intervention and asks the user to take a weight measurement. After the measurement is received, the robot says a ‘thank you’ message to the user. | If no data are collected after a certain amount of time, a notification to take a weight measurement is shown to the user on the CBAC (tablet or TV set-top box). |
| Neuropsychological Tests | The robot approaches the user and asks them to do a cognitive test on the tablet. When the user opens the tablet, a wait page is shown. The robot then re-identifies the user and, when found, moves close to them. The robot asks the user if they are ready. If so, it triggers the start of the cognitive tests that are performed on the tablet. The robot interacts with the user during the execution by mimicking the behaviour of a caregiver during such tests (e.g., by giving explanations). When the scenario is completed, the robot thanks the user. | The scenario is not performed in the without-robot setting. |
| Spot Questions | The robot starts an intervention and approaches the user. The robot asks the user if they have some time to answer a question. If so, the robot asks the user a question; questions have been developed to assess different cognitive abilities (e.g., ‘what day of the week is today?’). | The scenario is not performed in the without-robot setting. |
| Grip Force Measurements | The user should play an exergame with the smart ball regularly; if data are missing, the robot performs an intervention by asking the user to either play the smart ball exergames or check the battery level of the smart ball. | If no data are collected after a certain amount of time, a notification to play the smart ball exergames is shown to the user on the CBAC. |
| Use of Smart Objects | The user should use the smart pen or the smart ball (as a standalone object) regularly; if data are missing, the robot performs an intervention by asking the user to use a smart object or to check its battery level. | If no data are collected after a certain amount of time, a notification to use the smart object is shown to the user on the CBAC. |
| Gentle Exercises | Users should do a gentle exercise session regularly; if no sessions are recorded for a given user, the robot performs an intervention by reminding the user to use the Gentle Exercise activity. | A notification to do gentle exercises is shown on the CBAC. Also, the activity is prioritised in the CBAC interface. |
| Balance Exergames | Users should play balance exergames regularly; if no sessions are recorded for a given user, the robot performs an intervention by reminding the user to play them. | A notification to play exergames is shown on the CBAC. Also, the activity is prioritised in the CBAC interface. |
| Outdoor Suggestions | Users should do outdoor activity sessions regularly; if such sessions are not recorded with the insoles, the robot performs an intervention to remind the user to do such activities with the insoles, as well as to check those suggested on the CBAC. | A notification to do outdoor activities is shown on the CBAC. Also, the list of suggested activities is prioritised in the CBAC interface. |
| Finding Lost Objects | The user, on the CBAC, asks the robot to locate one of the RFID-tagged objects. The robot searches for the object inside the house. At the location where the object is found, the robot reports to the user by saying ‘I found the object’. Otherwise, the robot signals that the object was not found. | The scenario is not performed in the without-robot setting. |
A list of the scenarios proposed by the MoveCare framework, describing the functionality they address and the components involved. A full description of the scenarios is provided in Appendix 10
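The Call for Help escalation described above can be summarised as a small decision function: caregivers are phoned when the user confirms the emergency or when the confirmation timeout expires with no answer. The sketch below is our own illustrative encoding of that rule; the function and value names are assumptions, not taken from the paper.

```python
from typing import Optional

# Illustrative sketch of the Call for Help escalation rule:
# the IoT concentrator phones the caregivers if the user confirms
# the emergency, or if no answer arrives before the timeout expires.
def should_call_caregivers(user_response: Optional[str]) -> bool:
    if user_response == "confirm":
        return True   # user confirmed the emergency
    if user_response is None:
        return True   # timeout expired with no answer
    return False      # user explicitly refused or cancelled
```

Treating silence the same as confirmation is the conservative choice for an emergency system: a fall may leave the user unable to answer the robot at all.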
Fig. 3The Giraff-X mobile robot
Fig. 4Example of an annotated map of the working environment. Blue circles with a logo represent topological places such as rooms and hallways, yellow circles correspond to doorways, and the smaller light-blue circles are topological locations inside the rooms that the robot can reach during navigation. The robot’s charging station position is marked in green, while red and dark-blue marks indicate the position of the environmental sensors
Fig. 5The smart objects of MoveCare: (a) the anti-stress ball, (b) the ink pen, (c) the insoles, and (d) the Bluetooth scale
Fig. 6A participant in the pilot playing with the CBAC on the tablet setup
Fig. 7The architecture of the Virtual Caregiver. All the modules implementing the periodic workflows query data from the system’s central database. This connection has been omitted from the figure for the sake of readability
Rules and Constraints implemented in the Orchestrator
Fig. 8Giraff-X carrying out various tasks inside the apartment of the pilot experimental campaign
List of users divided by group and by pilot round. Distinct markers denote users from ES-HOME, users from ITA-HOME, and users from ITA-AL
Individual answers to the Satisfaction Questionnaire by all users
Satisfaction questionnaire. M is the median, IQ is the interquartile range, RO is with-robot and NO is without-robot
Overview of the pilot condition for each user and results of the SUS questionnaire. (Customarily, a SUS score above 68 is considered a positive evaluation)
Fig. 9Scatter plots of the SUS score, the MMSE, and age; users with MMSE below 28 are in red while others are in green
Fig. 10Reminders and usage of the system
SUS Questionnaire [6]. Note that in the SUS questionnaire the odd-numbered questions express positive attitudes, while the even-numbered ones express negative attitudes. RO is with-robot and NO is without-robot
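Because odd-numbered SUS items are positive and even-numbered items are negative, the standard SUS scoring rescales the two groups differently before summing. A minimal sketch of that standard formula (not code from the paper):

```python
def sus_score(answers):
    """Standard SUS scoring for ten 1-5 Likert answers: odd-numbered
    (positive) items contribute (answer - 1), even-numbered (negative)
    items contribute (5 - answer); the sum is scaled by 2.5 to 0-100."""
    assert len(answers) == 10
    total = sum(a - 1 if i % 2 == 0 else 5 - a  # index 0 = item 1 (odd)
                for i, a in enumerate(answers))
    return total * 2.5

print(sus_score([3] * 10))  # all-neutral answers → 50.0
```

Under this scaling a uniformly neutral respondent scores 50, which is why the customary acceptability threshold of 68 sits noticeably above the midpoint.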
System Validation Questionnaire. M is Median, IQ is InterQuartile range, N is the number of pilot participants who answered that question
| | Question | M | IQ |
|---|---|---|---|
| Q1 | Rate your satisfaction with MoveCare in relation to its contribution to the improvement of your everyday life. | 4 | 1.25 |
| Q2 | Rate your satisfaction with MoveCare in relation to the adaptability in the spaces you spend your everyday life. | 4 | 1.25 |
| Q3 | Rate your satisfaction with MoveCare in relation to how safe it is. | 4 | 1.25 |
| Q4 | Rate your satisfaction with MoveCare in relation to the degree to which the system meets your needs. | 3 | 2 |
| Q5 | Rate your satisfaction with MoveCare in relation to the responsiveness of the system to your inputs. | 3 | 2 |
| Q6 | Rate your satisfaction with MoveCare in relation to the reliability of the system. | 3 | 2 |
| Q7 | I will feel more confident when using MoveCare. | 4 | 1.5 |
| Q8 | I will feel more connected with the external world when using MoveCare. | 4 | 3 |
| Q9 | I will feel at ease when using MoveCare around friends and family. | 4 | 1 |
| Q10 | MoveCare allowed me to establish new social connections with the external world. | 3 | 3 |
| Q11 | Rate your satisfaction with MoveCare in relation to the ease of learning all individual functions. | 4 | 3 |
| Q12 | Rate your satisfaction with MoveCare in relation to the ease of interacting with the system. | 4 | 1.25 |
| Q13 | Rate your satisfaction with MoveCare in relation to the ease of use. | 4 | 1.25 |
| Q14 | I will feel more autonomous when using MoveCare. | 3 | 3 |
| Q15 | I will feel comfortable when using MoveCare. | 4 | 2.25 |
| Q16 | I will feel that I have control over the system when using MoveCare. | 3 | 2.5 |
Detailed list of all the components that compose the MoveCare framework and of their use within the system
| Component | Use |
|---|---|
| Giraff-X | Socially Assistive Robot |
| Giraff | Main robot platform; previously used as a telepresence robot. Redeveloped for the project to run ROS and operate autonomously. |
| RGBD camera (top) | Robot’s sensor. Used for obstacle avoidance, user detection, and navigation. |
| RGBD camera (lower) | Robot’s sensor. Used for people detection and as the main robot microphone. |
| Hokuyo URG lidar | Robot’s sensor. Used for SLAM, navigation. |
| NVIDIA Jetson | Robot’s component. Additional computation for the robot (user detection). |
| RFID reader | Robot’s sensor. Used for Search for Lost Objects scenario. |
| Docking station | Placed inside the house in an easily accessible position, it is the resting place for the robot when charging. When the robot has no active intervention to perform, it stays connected to the docking station. |
| RFID tags | Placed on a set of objects (keys, glasses, remote controller, wallet), the robot uses them to track the object upon user request. |