Gaia Risso, Greta Preatoni, Giacomo Valle, Michele Marazzi, Noëlle Moana Bracher, Stanisa Raspopovic.
Abstract
The multisensory integration of signals from different senses is crucial to developing an unambiguous percept of the environment and of our own body. Losing a limb causes drastic bodily changes, sometimes resulting in pain and a distorted perception of the phantom limb. Although the origin of these phenomena is debated, some researchers have suggested that they may be linked to an impaired inflow and integration of multisensory signals. Re-establishing optimally integrated sensory feedback could therefore be crucial. The related benefits for sensory performance and body self-representation have yet to be demonstrated, particularly in lower-limb amputees. We present a multisensory framework combining virtual reality and electro-cutaneous stimulation that allows optimal integration of visuo-tactile stimuli in lower-limb amputees, even when the stimuli are not spatially matched. We also show that this multisensory stimulation enables faster sensory processing, stronger embodiment, and reduced phantom limb distortions. Our findings support the development of multisensory rehabilitation approaches that restore a correct body representation.
Keywords: Behavioral neuroscience; Bioelectronics; Clinical neuroscience; Techniques in neuroscience
Year: 2022 PMID: 35391829 PMCID: PMC8980810 DOI: 10.1016/j.isci.2022.104129
Source DB: PubMed Journal: iScience ISSN: 2589-0042
Figure 1Optimal multisensory integration in lower-limb amputees
(A) Experimental set-up. The subject sits on a chair with his prosthesis resting straight on a stool while performing the Two-Alternative Forced Choice (2-AFC) task. He wears a head-mounted display that immerses him in a virtual scenario. The electrical stimulator delivers tactile stimuli to his stump.
(B) Experimental protocol, consisting of five conditions: T = tactile; V = visual; VB = visual-blurred; VT = visuo-tactile; VBT = visuo-blurred-tactile.
(C) Psychometric curves: the x axis represents the bar’s vibrating period (in s); the y axis corresponds to the proportion of responses in which the comparison stimulus was judged larger than the standard. The likelihood-ratio statistics were significant for all psychometric functions (p < 0.001), indicating a good fit to the data. The vertical dashed line corresponding to 50% ‘faster’ responses indicates the Point of Subjective Equality (PSE). The distance between the two vertical dashed lines corresponds to the Just Noticeable Difference (JND), i.e., the difference between the PSE and the bar’s vibrating period perceived as faster than the standard in 84% of the trials. JND and PSE are reported at the bottom right of each plot. Upper row: subject 1; lower row: subject 2.
(D) The vertical bars represent single-subject unimodal (UM; mean of visual-blurred and tactile performance) and bimodal (BM; visual-blurred-tactile performance) JNDs, together with the JND predicted for the bimodal condition by maximum likelihood estimation (MLE) (mean ± MAD). Horizontal bars denote statistically significant differences (False Discovery Rate-adjusted bootstrap p values; ∗p < 0.05 and ∗∗∗p < 0.001). Upper row: subject 1; lower row: subject 2. Left column: no-blur condition; right column: blur condition.
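The quantities in panels (C) and (D) can be sketched numerically. Below is an illustrative Python sketch (not the authors' analysis code, which per the key-resources table was written in MATLAB and R): it fits a cumulative-Gaussian psychometric function to hypothetical 2-AFC data to extract the PSE and JND, then computes the MLE-predicted bimodal JND from two unimodal JNDs under the standard optimal-integration rule (inverse variances add). All numeric values and the helper names are assumptions for illustration, not data from the paper.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def psychometric(x, mu, sigma):
    """Cumulative Gaussian: P(comparison judged 'faster' than standard)."""
    return norm.cdf(x, loc=mu, scale=sigma)

# Hypothetical 2-AFC data: comparison vibrating periods (s) and the
# proportion of 'faster' responses at each level (not from the paper).
periods = np.array([0.6, 0.7, 0.8, 0.9, 1.0, 1.1, 1.2])
p_faster = np.array([0.05, 0.12, 0.30, 0.52, 0.75, 0.90, 0.97])

(mu, sigma), _ = curve_fit(psychometric, periods, p_faster, p0=[0.9, 0.2])

pse = mu                      # 50% point of the fitted curve (PSE)
jnd = norm.ppf(0.84) * sigma  # distance from the PSE to the 84% point (JND)

# MLE prediction for the bimodal JND: for independent Gaussian cues,
# 1/sigma_BM^2 = 1/sigma_V^2 + 1/sigma_T^2, and the JND scales with sigma.
def mle_bimodal_jnd(jnd_a, jnd_b):
    return np.sqrt((jnd_a**2 * jnd_b**2) / (jnd_a**2 + jnd_b**2))

jnd_v, jnd_t = 0.20, 0.15              # hypothetical unimodal JNDs (s)
jnd_bm = mle_bimodal_jnd(jnd_v, jnd_t)  # -> 0.120 s, below both unimodal JNDs
print(f"PSE = {pse:.3f} s, JND = {jnd:.3f} s, "
      f"predicted bimodal JND = {jnd_bm:.3f} s")
```

The property tested in panel (D) is exactly this: when integration is optimal, the measured bimodal JND should match the MLE prediction and fall below either unimodal JND.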
Figure 2Benefits of multisensory integration for sensory performance (A–D) and of multisensory stimulation for embodiment and phantom limb representation (E–I)
(A) Protocol. UM, unimodal; BM, bimodal.
(B–D) Results for the discrimination task. Upper row: subject 1. Lower row: subject 2. (B) Reaction times. Boxplots of reaction time (s) for UM (blue) and BM (purple) (mean ± std) (p value ∗∗∗p < 0.001, Wilcoxon signed-rank test). (C) Difference (s) of reaction times between UM and BM. (D) Accuracy (%) of UM (blue) and BM (purple) (mean ± CI) (Wilcoxon signed-rank test).
(E) VR Environment. The subject sees his legs (both intact) and one foot is intermittently touched by an incoming wave.
(F) Experimental conditions. Synchronous (blue, left) and Asynchronous (red, right).
(G) Embodiment questionnaire results. Blue bars show the synchronous condition and red bars show the asynchronous condition (mean ± std) (p value ∗∗p < 0.01, Wilcoxon signed-rank test).
(H) Results for the telescoping measurement. Blue bars show the synchronous condition and red bars show the asynchronous condition. The dashed line indicates the length of the intact leg minus the stump (i.e., the length of the phantom without telescoping) (mean ± std) (p value ∗∗p < 0.01, Wilcoxon signed-rank test).
(I) Results for the proprioceptive displacement measurement. Blue bars show the synchronous condition and red bars show the asynchronous condition. VR = virtual reality; Sync = synchronous; Async = asynchronous; Emb = embodiment; UM = unimodal; BM = bimodal.
| REAGENT or RESOURCE | SOURCE | IDENTIFIER |
|---|---|---|
| MATLAB R2016b | MathWorks | |
| R 3.5.1 | R foundation | |
| Unity | Unity Technologies | |
| Rehamove3 | Hasomed GmbH | |
| Circle Electrodes Pads (25 mm) | Tenscare | |
| HTC VIVE | VIVE | |