Andrew J King1, Harry Hochheiser1,2, Shyam Visweswaran1,2, Gilles Clermont3, Gregory F Cooper1,2.
Abstract
Eye-tracking is a valuable research tool that is used in laboratory and limited field environments. We take steps toward developing methods that enable widespread adoption of eye-tracking and its real-time application in clinical decision support. Eye-tracking will enhance awareness and enable intelligent views, more precise alerts, and other forms of decision support in the Electronic Medical Record (EMR). We evaluated a low-cost eye-tracking device and found the device's accuracy to be non-inferior to that of a more expensive device. We also developed and evaluated an automatic method for mapping eye-tracking data to interface elements in the EMR (e.g., a displayed laboratory test value). Mapping was 88% accurate across the six participants in our experiment. Finally, we piloted the use of the low-cost device and the automatic mapping method to label training data for a Learning EMR (LEMR), which is a system that highlights the EMR elements a physician is predicted to use.
Entities:
Year: 2017 PMID: 28815151 PMCID: PMC5543363
Source DB: PubMed Journal: AMIA Jt Summits Transl Sci Proc
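The abstract describes automatically mapping eye-tracking data to EMR interface elements. A minimal sketch of one such mapping, assuming simple bounding-box containment (all names and boxes below are illustrative; the paper's actual method is not reproduced here):

```python
# Minimal sketch: map gaze samples to interface elements by bounding-box
# containment. Element names and coordinates are hypothetical examples,
# not taken from the paper.

def map_gaze_to_elements(gaze_points, elements):
    """gaze_points: list of (x, y) pixel coordinates.
    elements: dict of element name -> (left, top, right, bottom) box.
    Returns the containing element name (or None) for each sample."""
    hits = []
    for x, y in gaze_points:
        hit = None
        for name, (left, top, right, bottom) in elements.items():
            if left <= x <= right and top <= y <= bottom:
                hit = name
                break
        hits.append(hit)
    return hits

elements = {"lab:creatinine": (100, 200, 300, 220),
            "lab:lactate": (100, 230, 300, 250)}
gaze = [(150, 210), (120, 240), (500, 500)]
print(map_gaze_to_elements(gaze, elements))
# → ['lab:creatinine', 'lab:lactate', None]
```

A real system would also need to handle scrolling and window layout changes, which is part of why automatic mapping is a research problem rather than a lookup.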
Studies that utilize remote eye-tracking technology in health information technology (HIT).
| Author, Year | Title | Objective | Results |
|---|---|---|---|
| Eghdam, 2011 | Combining usability testing with eye-tracking technology: Evaluation of a visualization support for antibiotic use in intensive care | Observe the visual attention and scan patterns of system users. | Navigation paths were close to expected. Eye-tracking is a useful addition to usability studies. |
| Forsman, 2013 | Integrated information visualization to support decision making for use of antibiotics in intensive care: Design and usability evaluation | Evaluate a prototype visualization tool that aids decision making in antibiotic use in the intensive care unit (ICU). | Visual attention when completing the tasks differed between specialists and residents, who focused on the tables and on exploring the graphical user interface, respectively. |
| Nielson, 2013 | In-situ eye-tracking of emergency physician result review | Determine the time spent by physicians looking at lab results and fixating on specific values in a live clinical setting. | Average time viewing lab results was 13.9 seconds, with an average fixation length of 9.9 seconds. |
| Barkana, 2014 | Improvement of design of a surgical interface using an eye-tracking device | Evaluate a proposed surgical interface in terms of gaze fixations. | Fixation counts showed that displaying 8 CT scans for one patient was redundant, so the number was reduced to 2, which reduced time to task completion. |
| Doberne, 2015 | Using high-fidelity simulation and eye-tracking to characterize EHR workflow patterns among hospital physicians | Characterize typical EMR usage by hospital physicians as they encounter a new patient. | Found two different information-gathering and documentation workflows among participants. |
| Gold, 2015 | Feasibility of utilizing a commercial eye tracker to assess electronic health record use during patient simulation | Understand factors associated with poor error recognition during an ICU-based EMR simulation. | Improved performance was associated with a pattern of rapid scanning of data, manifested by an increased number of screens visited, mouse clicks, and saccades. |
| Moacdieh, 2015 | Clutter in electronic medical records: Examining its performance and attentional costs using eye-tracking | Assess the effects of clutter, in combination with stress and task difficulty, on visual search and noticing performance. | Clutter degraded performance in terms of response time and case awareness, especially for high-stress and difficult tasks. |
| Rick, 2015 | Eyes on the clinic: Accelerating meaningful interface analysis through unobtrusive eye-tracking | Observe and report physician experiences using their EMRs. | Physician time was dominated by searching behavior, indicating that the organization of the EMR system was not conducive to physician workflow. |
Figure 1. Overlay of interface elements and eye gaze data.
Average errors (in pixels) of two eye-tracking devices. Each error cell is the average of absolute median errors across fifty gaze points for each participant. Columns alternate between the EyeX and X2-30 devices; the three paired column groups correspond to the three differences reported in the bottom row.
| Participant | EyeX | X2-30 | EyeX | X2-30 | EyeX | X2-30 |
|---|---|---|---|---|---|---|
| 1 | 8 | 9 | 21 | 10 | 23 | 15 |
| 2 | 13 | 17 | 32 | 17 | 36 | 27 |
| 3 | 9 | 10 | 16 | 17 | 19 | 22 |
| 4 | 10 | 21 | 21 | 19 | 24 | 30 |
| 5 | 16 | 15 | 22 | 12 | 30 | 21 |
| 6 | 5 | 10 | 14 | 11 | 16 | 16 |
| 7 | 9 | 14 | 12 | 14 | 16 | 22 |
| 8 | 8 | 14 | 16 | 22 | 19 | 29 |
| 9 | 11 | 16 | 14 | 21 | 20 | 28 |
| Difference (95% CI) | -4 (-6.8, -1.4) | | 2.9 (-3.2, 8.7) | | -0.7 (-6.6, 5.1) | |
Figure 2. Difference in error of two eye-tracking devices (EyeX minus X2-30). Error bars indicate two-sided 95% confidence intervals. The shaded area indicates error values below the non-inferiority margin (11 pixels). Since the upper limit of each error bar is below the non-inferiority margin, the data support that the EyeX device is non-inferior.
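The comparison in Figure 2 can be reproduced in outline: take per-participant error differences (EyeX minus X2-30), form a two-sided 95% t-interval, and check the upper limit against the 11-pixel margin. A sketch assuming a one-sample t-interval on the paired differences, using the first EyeX/X2-30 pair of error columns transcribed from the table above:

```python
# Sketch of the paired non-inferiority check behind Figure 2 (assumed
# procedure: one-sample t-interval on per-participant differences).
import math
import statistics

def noninferiority(eyex, x2_30, margin=11.0, t_crit=2.306):
    # t_crit = two-sided 95% critical value for n=9 pairs (df=8)
    diffs = [a - b for a, b in zip(eyex, x2_30)]
    mean = statistics.mean(diffs)
    half = t_crit * statistics.stdev(diffs) / math.sqrt(len(diffs))
    # Non-inferior if the entire CI lies below the margin
    return mean, (mean - half, mean + half), (mean + half) < margin

eyex = [8, 13, 9, 10, 16, 5, 9, 8, 11]
x2_30 = [9, 17, 10, 21, 15, 10, 14, 14, 16]
mean, ci, ok = noninferiority(eyex, x2_30)
print(round(mean, 1), ok)
# → -4.1 True  (matches the reported -4 with CI (-6.8, -1.4))
```

This recovers the first difference and confidence interval reported in the table, which supports the assumed pairing of columns.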
Performance of the eye-tracking system across six participants.
| Case | Participants | Correct mappings | Total mappings | Accuracy |
|---|---|---|---|---|
| 1 | 1 | 3 | 6 | 0.50 |
| 2 | 1 | 4 | 6 | 0.67 |
| 3 | 1 | 5 | 6 | 0.83 |
| 4 | 2 | 9 | 12 | 0.75 |
| 5 | 1 | 6 | 6 | 1.00 |
| 6 | 1 | 6 | 6 | 1.00 |
| 7 | 1 | 6 | 6 | 1.00 |
| 8 | 2 | 11 | 12 | 0.92 |
| 9 | 1 | 6 | 6 | 1.00 |
| 10 | 1 | 6 | 6 | 1.00 |
| 11 | 1 | 6 | 6 | 1.00 |
| 12 | 2 | 11 | 12 | 0.92 |
| Total | | 79 | 90 | 0.88 |
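The accuracy column and the overall 88% figure from the abstract are simple ratios of correct to total mappings, which a quick check confirms (per-case counts transcribed from the table above):

```python
# Recompute per-case and overall mapping accuracy from the table above.
cases = [(3, 6), (4, 6), (5, 6), (9, 12), (6, 6), (6, 6),
         (6, 6), (11, 12), (6, 6), (6, 6), (6, 6), (11, 12)]
correct = sum(c for c, _ in cases)
total = sum(t for _, t in cases)
print(correct, total, round(correct / total, 2))
# → 79 90 0.88
```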
Averages across all ten cases of each mapping method tested.
| DGP | 1 | 0.73 | 0.82 | 0.76 | 0.86 | 0.57 | 0.57 | 0.58 | 0.78 | 0.66 | 0.76 |
| GP | 1 | 0.72 | 0.82 | 0.74 | 0.85 | 0.56 | 0.58 | 0.57 | 0.77 | 0.65 | 0.75 |
| I-AOI | 2 | 0.66 | 0.78 | 0.72 | 0.83 | 0.55 | 0.55 | 0.54 | 0.76 | 0.62 | 0.73 |
| I-DT | 3 | 80 | 0.50 | 0.64 | 0.52 | 0.67 | 0.55 | 0.54 | 0.55 | 0.74 | 0.53 | 0.65 |
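One of the compared mapping methods, I-DT, is the classic dispersion-threshold fixation-identification algorithm: slide a minimum-duration window over the gaze stream, and whenever its spatial dispersion stays under a threshold, grow the window and emit the centroid as a fixation. A sketch with illustrative parameter values (the paper's actual settings are not given here):

```python
# Sketch of the I-DT (dispersion-threshold) fixation-identification
# algorithm named in the table. min_len and disp_thresh are
# illustrative parameters, not the paper's settings.

def idt(points, min_len=5, disp_thresh=80):
    """points: list of (x, y) gaze samples. Returns fixation centroids."""
    def dispersion(win):
        xs = [p[0] for p in win]
        ys = [p[1] for p in win]
        return (max(xs) - min(xs)) + (max(ys) - min(ys))

    fixations, i = [], 0
    while i + min_len <= len(points):
        win = points[i:i + min_len]
        if dispersion(win) <= disp_thresh:
            # Grow the window while dispersion stays under the threshold
            j = i + min_len
            while j < len(points) and dispersion(points[i:j + 1]) <= disp_thresh:
                j += 1
            win = points[i:j]
            cx = sum(p[0] for p in win) / len(win)
            cy = sum(p[1] for p in win) / len(win)
            fixations.append((cx, cy))
            i = j
        else:
            i += 1  # No fixation starting here; slide the window
    return fixations

# Two tight clusters of samples yield two fixations
print(idt([(100, 100)] * 6 + [(400, 400)] * 6))
# → [(100.0, 100.0), (400.0, 400.0)]
```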