Nahal Norouzi, Luke Bölling, Gerd Bruder, Greg Welch.
Abstract
INTRODUCTION: A large body of research in the field of virtual reality is focused on making user interfaces more natural and intuitive by leveraging natural body movements to explore a virtual environment. For example, head-tracked user interfaces allow users to naturally look around a virtual space by moving their head. However, such approaches may not be appropriate for users with temporary or permanent limitations of their head movement.
Keywords: Virtual reality; augmented rotations; natural user interface
Year: 2019 PMID: 31245034 PMCID: PMC6582373 DOI: 10.1177/2055668319841309
Source DB: PubMed Journal: J Rehabil Assist Technol Eng ISSN: 2055-6683
Figure 1. (a) Illustration of the head-centered coordinate system with yaw, pitch, and roll rotations in the considered VR setup. (b) Inside view of the HTC VIVE with the integrated Pupil Labs eye tracker, consisting of infrared (IR) LEDs and binocular IR cameras.
Figure 2. Illustration of the head-tracked Continuous Rotation technique in an example sequence of head movements. (a) User is facing straight ahead. (b) User rotates their head toward the object on the left. (c) The virtual view is rotated a few degrees to the right each second. (d) After a few seconds, the view stops rotating with the object in front. The virtual environment is illustrated by the green checkerboard. Once the user’s head rotates to the left or right, the virtual view is rotated slowly in the opposite direction. If the user is facing toward a virtual object, the user compensates for the subtle virtual rotation with physical head rotations until the object ends up directly in front of them and stops moving. The user can repeat this as often as desired. The eye-tracking-based implementation uses the same approach, except that the angles are taken from the eye tracker rather than the head tracker.
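A minimal sketch of how this continuous counter-rotation might look as a per-frame update, in Python for illustration. The tracker API, the sign convention (positive yaw = head turned left), the dead zone, and the rotation speed are assumptions; the caption only specifies "a few degrees each second".

import math

DEAD_ZONE_DEG = 2.0    # assumed: no augmentation while the head is near center
SPEED_DEG_PER_S = 3.0  # assumed value for "a few degrees ... each second"

rig_yaw_deg = 0.0      # extra yaw applied to the virtual camera rig

def update_continuous(head_yaw_deg: float, dt: float) -> float:
    """One per-frame update of the Continuous Rotation sketch.

    head_yaw_deg: physical head yaw relative to straight ahead
                  (positive = left; convention assumed).
    dt: frame time in seconds.
    Returns the camera-rig yaw offset to add to the tracked head pose.
    """
    global rig_yaw_deg
    if abs(head_yaw_deg) > DEAD_ZONE_DEG:
        # Turn the rig in the same direction as the head deflection; on
        # screen the scene then appears to rotate the opposite way, so a
        # fixated object drifts toward the center of the view while the
        # user's head returns toward straight ahead, where rotation stops.
        rig_yaw_deg += math.copysign(SPEED_DEG_PER_S, head_yaw_deg) * dt
    return rig_yaw_deg

Turning the rig in the same direction as the head deflection is what makes the rendered view appear to rotate in the opposite direction, matching the caption's description.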
Figure 3. Illustration of the head-tracked Discrete Rotation technique in an example sequence of head movements. (a) User is facing straight ahead. (b) User rotates to the right beyond the threshold. (c) The virtual view is rotated by that angle. (d) User can freely look around the rotated view. The virtual environment is illustrated by the green checkerboard. Once the user’s physical head orientation crosses the threshold τ to the left or right, the virtual view is rotated instantaneously in the opposite direction by the angle corresponding to that threshold. The user can repeat this as often as desired to rotate the view. The eye-tracking-based implementation uses the same approach, except that the angles are taken from the eye tracker rather than the head tracker.
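Under the same assumed conventions, the discrete variant can be sketched as a simple threshold trigger. The value of τ and the re-arming (hysteresis) step, which prevents repeated jumps until the head returns inside the threshold, are illustrative assumptions, not details from the paper.

import math

THRESHOLD_DEG = 30.0  # assumed value for the threshold tau on either side

rig_yaw_deg = 0.0     # extra yaw applied to the virtual camera rig
armed = True          # assumed: block repeat jumps until the head re-enters tau

def update_discrete(head_yaw_deg: float) -> float:
    """Apply one instantaneous rig rotation when the head crosses tau."""
    global rig_yaw_deg, armed
    if armed and abs(head_yaw_deg) >= THRESHOLD_DEG:
        # Jump the rig by the threshold angle in the direction of the head
        # turn; on screen the view appears to snap the opposite way.
        rig_yaw_deg += math.copysign(THRESHOLD_DEG, head_yaw_deg)
        armed = False
    elif abs(head_yaw_deg) < THRESHOLD_DEG:
        armed = True  # head is back inside the threshold; allow the next jump
    return rig_yaw_deg

Repeated threshold crossings accumulate in rig_yaw_deg, which lets the user rotate the view as often as desired, as the caption describes.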