Michael Juntao Yuan, George Mike Finley, Ju Long, Christy Mills, Ron Kim Johnson.
Abstract
BACKGROUND: Clinical decision support systems (CDSS) are important tools to improve health care outcomes and reduce preventable medical adverse events. However, the effectiveness and success of CDSS depend on their implementation context and usability in complex health care settings. As a result, usability design and validation, especially in real world clinical settings, are crucial aspects of successful CDSS implementations.
Keywords: clinical decision support systems; heuristic evaluations; human computer interaction; patient-centered care; software design; software performance; usability testing; user-computer interface
Year: 2013 PMID: 23612350 PMCID: PMC3628119 DOI: 10.2196/ijmr.2402
Source DB: PubMed Journal: Interact J Med Res ISSN: 1929-073X
Figure 1. The main split-screen user interface of the decision support system.
Figure 2. The action items for the nurse after a likely outcome is reached.
Figure 3. The clinical rule editor in the Web-based CMS.
Example heuristic violations.
| Heuristics violated | Place of occurrence | Severity | Usability problem description |
| Visibility of system status | Start | 3.8 | When syncing the application, there was no way to know whether it would take 15 seconds or 10 minutes. It would help to show an estimated time (eg, approximately 1 minute) or a percent-completion indicator. |
| Match between system and the real world | Outcome | 3.4 | List the outcomes as percentages rather than as bare numbers. |
| User control and freedom | Checklist | 4 | The user should be able to change an answer after it has moved down to the list of answered questions. Having to start over completely to change one answer would be frustrating. |
| Consistency and standards | Outcome | 1 | Color codes should be far apart along the visible spectrum so that outcomes can be clearly distinguished. |
| Error prevention | Checklist | 4 | Require user confirmation before backing out of a screen, as doing so forces the user to reenter all data. |
| Recognition rather than recall | Checklist | 2 | Abbreviations are used in the checklist; they should follow a single, simple convention. |
| Flexibility and efficiency of use | Checklist | 3 | If triggers are added in the future, the on-screen keyboard must not cover the last triggers when it appears. This is not a problem yet, but the system should be built to handle it now. |
| Aesthetic and minimalist design | Outcome | 3 | There are currently too many "start over" displays. It would be simpler to have one button with a drop-down listing the options: trigger, patient, or user. The questions also need review by Dr. Finley and the RRT, as a few ask the same thing in different wording; duplicating questions is unnecessary. |
| Help users recognize, diagnose, and recover from errors | Start | 4 | When a user accidentally hits the home button on the iPad, the system closes without warning. Restarting within 1 minute returns the user to where they were; otherwise the program closes and all data are lost. |
| Documentation and help | Outcome | 3 | The outcomes are shown in different colors, but staff may not know what the color coding means. Define the color scheme. |
Number of heuristic violations by heuristic.
| Heuristics violated | Count of usability problems |
| Aesthetic and minimalist design | 4 |
| Consistency and standards | 10 |
| Documentation and help | 13 |
| Error prevention | 6 |
| Flexibility and efficiency of use | 4 |
| Help users recognize, diagnose, and recover from errors | 12 |
| Match between system and the real world | 10 |
| Recognition rather than recall | 4 |
| User control and freedom | 8 |
| Visibility of system status | 12 |
| Grand total | 83 |
Places of occurrence of the heuristic violations.
| Place of occurrence | Count of heuristic violations |
| Action | 13 |
| Checklist | 33 |
| Outcome | 13 |
| Start | 24 |
| Grand total | 83 |
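As a quick sanity check, the per-heuristic and per-place tallies above should agree on the same grand total. The sketch below (illustrative only; the dictionaries simply transcribe the two tables, and the variable names are not from the paper) verifies that both breakdowns sum to 83 violations:

```python
# Illustrative cross-check of the two violation tallies reported above.
# Counts are transcribed from the tables; names are for illustration only.

violations_by_heuristic = {
    "Aesthetic and minimalist design": 4,
    "Consistency and standards": 10,
    "Documentation and help": 13,
    "Error prevention": 6,
    "Flexibility and efficiency of use": 4,
    "Help users recognize, diagnose, and recover from errors": 12,
    "Match between system and the real world": 10,
    "Recognition rather than recall": 4,
    "User control and freedom": 8,
    "Visibility of system status": 12,
}

violations_by_place = {"Action": 13, "Checklist": 33, "Outcome": 13, "Start": 24}

total_by_heuristic = sum(violations_by_heuristic.values())
total_by_place = sum(violations_by_place.values())

# Both breakdowns partition the same set of 83 usability problems.
assert total_by_heuristic == total_by_place == 83
print(total_by_heuristic)  # 83
```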
Task burden measured by the NASA Task Load Index (NASA-TLX).
| Task burden | Mean out of 100 (SD) |
| Mental demand | 10.0 (7.4) |
| Physical demand | 1.8 (2.1) |
| Temporal demand | 20.4 (24.8) |
| Performance | 10.7 (11.3) |
| Effort | 4.5 (4.9) |
| Frustration | 1.6 (2.5) |
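One common way to summarize NASA-TLX results is the raw (unweighted) overall score, the mean of the six subscale means. The paper does not report an overall score, and may have used the weighted variant, so the figure below is derived here purely for illustration from the table above:

```python
# Illustrative raw (unweighted) NASA-TLX overall score: the mean of the six
# subscale means reported in the table. Derived here for illustration only;
# the study itself does not report this figure.

subscale_means = {
    "Mental demand": 10.0,
    "Physical demand": 1.8,
    "Temporal demand": 20.4,
    "Performance": 10.7,
    "Effort": 4.5,
    "Frustration": 1.6,
}

raw_tlx = sum(subscale_means.values()) / len(subscale_means)
print(round(raw_tlx, 2))  # 8.17 out of 100
```

A raw score this low is consistent with the individually low subscale means: on a 0-100 scale, all six dimensions sit well below the midpoint.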
Severity of the heuristic violations.
| Heuristics violated | Mean severity |
| Aesthetic and minimalist design | 2.25 |
| Consistency and standards | 1.49 |
| Documentation and help | 3.01 |
| Error prevention | 3.88 |
| Flexibility and efficiency of use | 2.88 |
| Help users recognize, diagnose, and recover from errors | 2.48 |
| Match between system and the real world | 2.50 |
| Recognition rather than recall | 2.20 |
| User control and freedom | 3.13 |
| Visibility of system status | 2.93 |
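The per-heuristic counts and mean severities can also be combined into a single overall mean severity by weighting each heuristic's mean by its violation count. The paper does not report this figure; the sketch below derives it here purely for illustration, transcribing both tables:

```python
# Illustrative overall mean severity across all 83 violations, computed by
# weighting each heuristic's mean severity by its violation count. The paper
# does not report this figure; it is derived here for illustration only.

counts = {
    "Aesthetic and minimalist design": 4,
    "Consistency and standards": 10,
    "Documentation and help": 13,
    "Error prevention": 6,
    "Flexibility and efficiency of use": 4,
    "Help users recognize, diagnose, and recover from errors": 12,
    "Match between system and the real world": 10,
    "Recognition rather than recall": 4,
    "User control and freedom": 8,
    "Visibility of system status": 12,
}

mean_severity = {
    "Aesthetic and minimalist design": 2.25,
    "Consistency and standards": 1.49,
    "Documentation and help": 3.01,
    "Error prevention": 3.88,
    "Flexibility and efficiency of use": 2.88,
    "Help users recognize, diagnose, and recover from errors": 2.48,
    "Match between system and the real world": 2.50,
    "Recognition rather than recall": 2.20,
    "User control and freedom": 3.13,
    "Visibility of system status": 2.93,
}

total = sum(counts.values())  # 83 violations in all
overall = sum(counts[h] * mean_severity[h] for h in counts) / total
print(round(overall, 2))  # 2.67
```

On Nielsen's 0-4 severity scale, a count-weighted overall mean around 2.67 suggests the typical violation sits between a minor and a major usability problem.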