| Literature DB >> 29472888 |
Haizhen Luo1, Xiaoyun Wang2, Mengying Fan1, Lingyun Deng1, Chuyao Jian1, Miaoluan Wei1, Jie Luo1.
Abstract
Visual input can either benefit balance control or increase postural sway, and the effect of visual stimuli on postural stability, along with its underlying mechanism, is far from fully understood. In this study, the effect of different visual inputs on the stability and complexity of postural control was examined by analyzing the mean velocity (MV), standard deviation (SD), and fuzzy approximate entropy (fApEn) of the center of pressure (COP) signal during quiet upright standing. We designed five visual exposure conditions: eyes-closed, eyes-open (EO), and three virtual reality (VR) scenes (VR1-VR3). The VR scenes were a limited field of view of an optokinetic drum rotating around the yaw (VR1), pitch (VR2), and roll (VR3) axes, respectively. Sixteen healthy subjects participated in the experiment, and their COP trajectories were derived from the force plate data. MV, SD, and fApEn of the COP in the anterior-posterior (AP) and medial-lateral (ML) directions were calculated. Two-way analysis of variance with repeated measures was conducted to test for statistical significance. We found that all three parameters reached their lowest values in the EO condition and their highest in the VR3 condition. We also found that active neuromuscular intervention, indicated by fApEn, in response to the changing visual exposure conditions was more adaptive in the AP direction, whereas stability, indicated by SD, in the ML direction reflected the changes of the visual scenes. MV was found to capture both instability and active neuromuscular control dynamics. The three parameters thus appeared to provide complementary information about postural control in the immersive virtual environment.
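The three COP descriptors above can be sketched in a few lines of NumPy. This is a minimal, hypothetical illustration, not the authors' actual pipeline: `cop_measures` computes MV (total path length over trial duration) and SD of a one-dimensional displacement series, and `fuzzy_apen` follows the common fuzzy-membership formulation of approximate entropy (exponential similarity exp(-d^n/r) on baseline-removed templates). The defaults m = 2 and r = 0.2 × SD are conventional choices in the COP literature, not parameters taken from this paper.

```python
import numpy as np

def cop_measures(cop, fs):
    """Mean velocity (MV) and SD of a 1-D COP displacement series.

    cop : displacement samples (e.g., in mm); fs : sampling rate in Hz.
    MV is the total path length divided by the trial duration.
    """
    cop = np.asarray(cop, dtype=float)
    duration = len(cop) / fs
    mv = np.sum(np.abs(np.diff(cop))) / duration
    return mv, np.std(cop)

def fuzzy_apen(x, m=2, r=0.2, n=2):
    """Fuzzy approximate entropy of a 1-D signal.

    m : embedding dimension; r : tolerance as a fraction of the signal SD;
    n : exponent of the fuzzy (exponential) membership function.
    """
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)
    num = len(x) - m  # same number of templates for both lengths

    def phi(k):
        # length-k templates with their own means removed (baseline removal)
        templ = np.array([x[i:i + k] for i in range(num)])
        templ -= templ.mean(axis=1, keepdims=True)
        # Chebyshev distance between every pair of templates
        d = np.max(np.abs(templ[:, None, :] - templ[None, :, :]), axis=2)
        # fuzzy similarity: smooth exponential instead of a hard threshold
        return np.exp(-(d ** n) / tol).mean()

    return np.log(phi(m)) - np.log(phi(m + 1))
```

A regular signal such as a sine wave yields a low fApEn, while white noise yields a higher value, which is the sense in which fApEn indexes the irregularity, and hence the active control component, of postural sway.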
Keywords: balance control; center of pressure; entropy; head-mounted display; virtual reality
Year: 2018 PMID: 29472888 PMCID: PMC5809403 DOI: 10.3389/fneur.2018.00048
Source DB: PubMed Journal: Front Neurol ISSN: 1664-2295 Impact factor: 4.003
Figure 1. (A) Schematic diagram of the experimental setup. Scenes were generated on computer 1 and transmitted to a head-mounted display (HMD). Subjects viewed the scenes inside the HMD while standing on a force plate. The postural output was then amplified, collected by a multichannel data acquisition system (DAQ), and sent to computer 2 for subsequent analysis. (B) Screenshots of the virtual optokinetic drum scenes rotating around the three coordinate axes, from top to bottom: yaw (VR1), pitch (VR2), and roll (VR3).
Figure 2. An example of a center of pressure (COP) signal. (A) Planar COP trajectory. (B) Anterior-posterior and (C) medial-lateral displacement of the COP over time, respectively.
Results of two-way analysis of variance (MV, mean velocity; SD, standard deviation; fApEn, fuzzy approximate entropy).

| Parameter | Effect | F | p |
|---|---|---|---|
| MV | Visual condition | | |
| | Direction | 1.178 | 0.286 |
| | Direction × visual condition | 1.699 | 0.193 |
| SD | Visual condition | | |
| | Direction | 0.305 | 0.585 |
| | Direction × visual condition | 2.379 | 0.055 |
| fApEn | Visual condition | | |
| | Direction | 1.768 | 0.194 |
| | Direction × visual condition | 0.993 | 0.397 |

Bold font represents significance at the p < 0.05 level.
Figure 3. Comparisons of the five visual exposure conditions in terms of the three parameters in the anterior-posterior (AP) and medial-lateral (ML) directions: (A,B) mean velocity (MV) in the AP and ML directions, respectively; (C,D) SD in the AP and ML directions, respectively; (E,F) fuzzy approximate entropy (fApEn) in the AP and ML directions, respectively. Mean values of the parameters are marked with squares. Asterisks denote p < 0.05.