An augmented reality sign-reading assistant for users with reduced vision
Jonathan Huang, Max Kinateder, Matt J Dunn, Wojciech Jarosz, Xing-Dong Yang, Emily A Cooper.
Abstract
People typically rely heavily on visual information when finding their way to unfamiliar locations. For individuals with reduced vision, a variety of navigational tools is available to assist with this task. However, for wayfinding in unfamiliar indoor environments, the applicability of existing tools is limited. One potential approach to assist with this task is to enhance visual information about the location and content of existing signage in the environment. With this aim, we developed a prototype software application, which runs on a consumer head-mounted augmented reality (AR) device, to assist visually impaired users with sign-reading. The sign-reading assistant identifies real-world text (e.g., signs and room numbers) on command, highlights the text location, converts it to high-contrast AR lettering, and optionally reads the content aloud via text-to-speech. We assessed the usability of this application in a behavioral experiment. Participants with simulated visual impairment were asked to locate a particular office within a hallway, either with or without AR assistance (referred to as the AR group and control group, respectively). Subjective assessments indicated that participants in the AR group found the application helpful for this task, and an analysis of walking paths indicated that these participants took more direct routes compared to the control group. However, participants in the AR group also walked more slowly and took more time to complete the task than the control group. The results point to several specific future goals for usability and system performance in AR-based assistive tools.
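The pipeline described in the abstract (detect text on command, keep only high-certainty detections, re-render them as high-contrast AR signs that flash until selected) could be sketched roughly as follows. This is an illustrative assumption, not the authors' implementation: the `Detection` record, the confidence threshold, and the overlay fields are all hypothetical stand-ins for whatever the prototype's text detector and renderer actually use.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    text: str
    bbox: tuple        # (x, y, w, h) in screen coordinates (assumed format)
    confidence: float  # 0..1 score from a hypothetical text detector

def build_overlays(detections, min_confidence=0.8):
    """Keep only high-certainty detections and describe how each should be
    rendered as a high-contrast AR sign (threshold value is an assumption)."""
    overlays = []
    for d in detections:
        if d.confidence < min_confidence:
            continue  # low-certainty text is not shown to the user
        overlays.append({
            "text": d.text,
            "bbox": d.bbox,
            "foreground": "white",  # high-contrast lettering
            "background": "black",
            "flash_hz": 3.0,        # flash rate while unselected (per Fig 1)
        })
    return overlays

detections = [
    Detection("Room 214", (120, 80, 90, 30), 0.95),
    Detection("EXIT", (300, 40, 60, 25), 0.45),  # too uncertain to display
]
print([o["text"] for o in build_overlays(detections)])  # → ['Room 214']
```

Thresholding before display matches the caption of Fig 1, where only signs identified "with high certainty" are overlaid; the optional text-to-speech step would then be triggered when the user selects one of these overlays.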
Year: 2019 PMID: 30650159 PMCID: PMC6334915 DOI: 10.1371/journal.pone.0210630
Source DB: PubMed Journal: PLoS One ISSN: 1932-6203 Impact factor: 3.240
Fig 1. Application screenshots.
Left column: Original view of a user looking at two signs with normal vision (top row) and simulated visual impairment via Gaussian blur (bottom row). A white circle indicates the user’s virtual cursor. Note that the signs are no longer readable with visual impairment. Middle column: Green AR signs are shown placed over the physical signs, indicating that the application identified two signs with high certainty. In real time, these signs would flash at 3 Hz. Right column: Text is displayed with enhanced visibility after a user selects an AR sign.
Fig 2. Subjective responses.
Bars show median and interquartile range for ratings of ease (A), comfort (B), and confidence (C) in the two groups.
Fig 3. Path analysis.
(A) Lines show estimated paths taken by users in the control (blue) and AR (red) groups. The width of the hallway is illustrated for each group with two horizontal lines. Note that paths outside of the hallway suggest errors either in motion tracking or path alignment. (B, C) Bars show mean and standard error for walking speed and time to complete task. Note that walking speed excludes stationary periods.
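The caption's note that walking speed excludes stationary periods can be made concrete with a small sketch. The speed threshold used to classify a sample as stationary, and the 2D position format, are assumptions for illustration; the paper does not specify these details here.

```python
import math

def walking_speed(positions, timestamps, stationary_speed=0.1):
    """Mean walking speed (m/s) from tracked 2D positions, excluding
    segments whose instantaneous speed falls below `stationary_speed`
    (an assumed threshold, in m/s)."""
    moving_dist = 0.0
    moving_time = 0.0
    for (x0, y0), (x1, y1), t0, t1 in zip(
            positions, positions[1:], timestamps, timestamps[1:]):
        dt = t1 - t0
        if dt <= 0:
            continue  # skip malformed or duplicate timestamps
        d = math.hypot(x1 - x0, y1 - y0)
        if d / dt >= stationary_speed:  # count only moving segments
            moving_dist += d
            moving_time += dt
    return moving_dist / moving_time if moving_time else 0.0

# Walk 1 m/s for 2 s, then stand still for 3 s: excluding the pause
# leaves a mean walking speed of 1 m/s.
pos = [(0, 0), (1, 0), (2, 0), (2, 0), (2, 0), (2, 0)]
ts = [0, 1, 2, 3, 4, 5]
print(walking_speed(pos, ts))  # → 1.0
```

Computing speed this way separates "walked more slowly" from "paused more often", which matters for interpreting panels B and C: total task time includes stationary periods, while the reported walking speed does not.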