Mujeeb A Basit, Krystal L Baldwin, Vaishnavi Kannan, Emily L Flahaven, Cassandra J Parks, Jason M Ott, Duwayne L Willett.
Abstract
BACKGROUND: Moving to electronic health records (EHRs) confers substantial benefits but risks unintended consequences. Modern EHRs consist of complex software code with extensive local configurability options, which can introduce defects. Defects in clinical decision support (CDS) tools are surprisingly common. Feasible approaches to prevent and detect defects in EHR configuration, including CDS tools, are needed. In complex software systems, use of test-driven development and automated regression testing promotes reliability. Test-driven development encourages modular, testable design and expanding regression test coverage. Automated regression test suites improve software quality, providing a "safety net" for future software modifications. Each automated acceptance test serves multiple purposes, as requirements (prior to build), acceptance testing (on completion of build), regression testing (once live), and "living" design documentation. Rapid-cycle development or "agile" methods are being successfully applied to CDS development. The agile practice of automated test-driven development is not widely adopted, perhaps because most EHR software code is vendor-developed. However, key CDS advisory configuration design decisions and rules stored in the EHR may prove amenable to automated testing as "executable requirements."
Keywords: agile methods; clinical decision support systems; electronic health records; software validation; software verification; test driven development
Year: 2018 PMID: 29653922 PMCID: PMC5924365 DOI: 10.2196/medinform.9679
Source DB: PubMed Journal: JMIR Med Inform
Figure 1. Test-driven development cycle.
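The red-green cycle pictured above (write a failing acceptance test, build the configuration, watch the test pass) can be sketched in miniature. The field names and values below are illustrative placeholders, not the paper's actual EHR schema; the comparison mimics how dbFit checks each expected cell against the configured value.

```python
# Minimal sketch of the test-driven development cycle applied to a
# CDS advisory configuration record (field names are hypothetical).

def check_advisory(config, expected):
    """Compare each configured field against its expected value,
    returning one pass/fail result per assertion, dbFit-style."""
    return {field: config.get(field) == value
            for field, value in expected.items()}

# "Executable requirements": expectations written before the build.
expected = {
    "alert_name": "Stroke Suspected But No Swallow Screen",
    "trigger_action": "sign oral medication order",
    "provider_type": "Physician",
}

# Red: before the build the configuration is empty, so every assertion fails.
before_build = {}
assert not any(check_advisory(before_build, expected).values())

# Green: once the analyst builds the advisory, all assertions pass,
# and the same test is kept as a regression "safety net".
after_build = dict(expected)
assert all(check_advisory(after_build, expected).values())
```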
Configuration of FitNesse and dbFit: time and personnel requirements. EHR: electronic health record; IT: information technology; SQL: structured query language.
| Task category | Task | Frequency | Time (range) | Type of personnel |
| Initial set-up of FitNesse + dbFit testing framework | Download and install FitNesse to point of functioning FitNesse wiki | Once | 30 minutes | IT analyst |
| | Configure FitNesse to use Active Directory login permissions (if desired) | Once | 2 hours to 1 day | IT analyst knowledgeable about one’s local Active Directory |
| | Configure dbFit | Once | Few minutes to 2 hours | IT analyst |
| | Set up database connection for FitNesse/dbFit to query an EHR (or other) database | Once per database | 1 hour (first time); a few minutes per connection once experienced | IT analyst |
| Create a test “template” for a given type of test | Write SQL to serve as template for given type of test | Once per new type of test | 1 to 2 hours | EHR analyst; SQL writer (can be same person) |
| Configure an individual test instance | Create Microsoft Excel copy of test template and populate for given test instance, ready for vetting with clinician or other customer | Once per test instance | 15 to 60 minutes | EHR analyst |
| | Import Microsoft Excel test to FitNesse Test page, and test | Once per test instance | 10 to 15 minutes | EHR analyst or test team analyst |
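The workflow in the table rests on dbFit's core mechanism: run a SQL template against the EHR's database and compare each returned cell to an expected value supplied per test instance. A minimal sketch, assuming a hypothetical configuration table and column names (the real tests query the vendor's reporting database):

```python
import sqlite3

# Hypothetical in-memory stand-in for an EHR alert-configuration table;
# dbFit would instead connect to the EHR's reporting database.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE alert_config (
    alert_id INTEGER, dept_specialty TEXT, provider_type TEXT)""")
conn.execute("INSERT INTO alert_config VALUES (42, 'Neurology', 'Physician')")

# The reusable SQL "template": each test instance supplies only an
# alert_id plus expected values (the role of the Excel copies above).
row = conn.execute(
    "SELECT dept_specialty, provider_type FROM alert_config "
    "WHERE alert_id = ?", (42,),
).fetchone()

# dbFit-style assertion: each expected cell checked against the query result.
assert row == ("Neurology", "Physician")
```

Because the test is just a query plus expected values, rerunning the whole suite after any EHR configuration change is cheap, which is what makes it usable for regression testing.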
Figure 2. Screenshot of FitNesse test specifying Department Specialty and Provider Type restrictions. n/a: not applicable.
Figure 3. Screenshot of test specifying triggering action for this advisory.
Figure 4. Decision tree for the advisory. NIH: National Institutes of Health.
Figure 5. Screenshot of test specifying clinical decision support rule logic. CINN: Cincinnati; NIH: National Institutes of Health.
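The advisory's rule logic can be reconstructed, at least in outline, from the criteria names in the test suite (abnormal Cincinnati stroke scale, NIH stroke scale ordered, code stroke ordered, medication with oral route, dysphagia screen performed). The sketch below is an illustrative reading of that decision tree, not the paper's exact build:

```python
# Illustrative reconstruction of the stroke swallow-screen advisory logic;
# the actual decision tree in the paper may differ in detail.

def fire_advisory(abnormal_cincinnati_scale: bool,
                  nih_stroke_scale_ordered: bool,
                  code_stroke_ordered: bool,
                  oral_med_ordered: bool,
                  dysphagia_screen_performed: bool) -> bool:
    # Any one of these criteria suggests a suspected stroke.
    stroke_suspected = (abnormal_cincinnati_scale
                        or nih_stroke_scale_ordered
                        or code_stroke_ordered)
    # Alert only when stroke is suspected, an oral (PO) medication is
    # ordered, and no swallow screen has been documented yet.
    return (stroke_suspected
            and oral_med_ordered
            and not dysphagia_screen_performed)

assert fire_advisory(True, False, False, True, False)       # should alert
assert not fire_advisory(True, False, False, True, True)    # screen done
assert not fire_advisory(False, False, False, True, False)  # no stroke suspicion
```

Writing the logic as a pure function of its criteria is what makes each branch independently testable, mirroring the per-criterion test pages listed in the suite below.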
Figure 6. Screenshot of test specifying user interface actions for the advisory. BPA: Best Practice Advisory.
Figure 7. Screenshot of test specifying system actions following clinician response. BPA: Best Practice Advisory; PO: per os.
Figure 8. Screenshot of acceptance test: all assertions fail as expected prior to build.
Figure 9. Screenshot of a test table included in the acceptance test suite: acceptance test partially passes following initial build. GCS: Glasgow Coma Scale; PO: per os.
Figure 10. Screenshot of acceptance test assertions for "base" alert record, all passing following successful build. BPA: Best Practice Advisory; PO: per os.
Test suite: number of tests and individual assertions, with execution times. NIH: National Institutes of Health.
| Type | Test page name | Tests | Assertions | Time (s) |
| Base | Alert Stroke Suspected But No Swallow Screen | 8 | 48 | 0.003 |
| Criteria | Criteria Abnormal Cincinnati Stroke Scale | 3 | 14 | 0.002 |
| Criteria | Criteria NIH Stroke Scale Ordered | 3 | 19 | 0.001 |
| Criteria | Criteria Code Stroke Ordered | 4 | 24 | 0.002 |
| Criteria | Criteria Med With Oral Route | 3 | 19 | 0.002 |
| Criteria | Criteria Stroke Dysphagia Screen Performed | 3 | 14 | 0.002 |
| Suite | Suite Story Stroke Swallow Screen | 24 | 138 | 0.869 |