Daniel Perez-Marcos, Massimiliano Solazzi, William Steptoe, Oyewole Oyekoya, Antonio Frisoli, Tim Weyrich, Anthony Steed, Franco Tecchia, Mel Slater, Maria V Sanchez-Vives.
Abstract
Although telerehabilitation systems represent one of the most technologically appealing clinical solutions for the immediate future, they still present limitations that prevent their standardization. Here we propose an integrated approach that includes three key and novel factors: (a) fully immersive virtual environments, including virtual body representation and ownership; (b) multimodal interaction with remote people and virtual objects, including haptic interaction; and (c) a physical representation of the patient at the hospital through embodiment agents (e.g., a physical robot). We also stress the importance of secure and rapid communication between the nodes and describe an example implementation. Finally, we discuss the proposed approach with reference to the existing literature and systems.
Keywords: body ownership; haptics; multisensory correlations; neurorehabilitation; rubber hand illusion; telemedicine; teleneurology; virtual reality
Year: 2012 PMID: 22787454 PMCID: PMC3392697 DOI: 10.3389/fneur.2012.00110
Source DB: PubMed Journal: Front Neurol ISSN: 1664-2295 Impact factor: 4.003
Figure 1. Set-up at the patient’s home. The patient wears an HMD with head-tracking for immersion in the virtual environment from the first-person perspective of the avatar representing him. Wireless body tracking gives him control over the avatar’s movements. A haptic device with force-feedback is used for tactile interaction with the environment and/or remote persons. Finally, several sensors and electrodes are attached to the patient to monitor his physiological and emotional state.
Figure 2. Haptic interaction set-up in the mixed mode. Objects drawn with dotted lines represent virtual objects seen by the patient through the HMD; objects drawn with continuous lines represent local objects at the corresponding side.
Figure 3. Person-to-person haptic interaction with force-feedback. The remote doctor explores the patient’s arm mobility. The patient sees a virtual representation of the real doctor, who offers him a hand.
Figure 4. The ring task. A virtual ring is passed along a virtual wire without touching it. Whenever the wire is touched, force-feedback is triggered. The camera view is from the patient’s point of view (first-person perspective).
Figure 5. The doctor’s office. In the current set-up, the patient is displayed on a life-sized screen. A haptic device with force-feedback is used for tactile interaction with the patient. A PC screen is used for monitoring the patient’s physiological data.