Shailaja Menon1, Michael W Smith1, Dean F Sittig1, Nancy J Petersen2, Sylvia J Hysong1, Donna Espadas1, Varsha Modi1, Hardeep Singh1.
Abstract
OBJECTIVES: Electronic health record (EHR)-based alerts can facilitate transmission of test results to healthcare providers, helping ensure timely and appropriate follow-up. However, failure to follow-up on abnormal test results (missed test results) persists in EHR-enabled healthcare settings. We aimed to identify contextual factors associated with facility-level variation in missed test results within the Veterans Affairs (VA) health system. DESIGN, SETTING AND PARTICIPANTS: Based on a previous survey, we categorised VA facilities according to primary care providers' (PCPs') perceptions of low (n=20) versus high (n=20) risk of missed test results. We interviewed facility representatives to collect data on several contextual factors derived from a sociotechnical conceptual model of safe and effective EHR use. We compared these factors between facilities categorised as low and high perceived risk, adjusting for structural characteristics.Entities:
Keywords: Diagnostic Errors; Elecronic Health Records; Missed Test Results; Patient Follow-up; Patient Safety; Social-Technical Model
Mesh:
Year: 2014 PMID: 25387758 PMCID: PMC4244393 DOI: 10.1136/bmjopen-2014-005985
Source DB: PubMed Journal: BMJ Open ISSN: 2044-6055 Impact factor: 2.692
Figure 1Eight-dimensional sociotechnical model of safe and effective electronic health record use.
Interview questions
| Sociotechnical dimension | Interview questions | Rationale |
|---|---|---|
| 1. Hardware and software |
Does your site have any modified software that impacts alert management? Do you generate any reports to monitor the changes made to the software? | The EHR allows facilities to make changes to the software to address local needs, which can affect how alerts are managed |
| 2. Clinical content |
The number of mandatory, enabled and disabled alerts? Are there any national or network level mandatory alerts? | The notification management options within CPRS can be used to turn specific notification on or off. Alerts can be enabled, disabled or set as mandatory. Some alerts are mandated centrally. The number of enabled and mandatory alerts can affect alert volume |
| 3. User interface |
In a typical workday, how many providers request support? How often do you get calls from providers about missed or lost alerts? | A poorly designed user interface can lead to difficulties in managing alerts, prompting the providers to seek support to manage alerts |
| 4. People |
How much time is spent on View Alert training? Does the site have specific training on View Alerts? | CPRS uses the ‘View Alert’ notification system to inform clinicians about critical test results. Providers should have necessary training to process view alerts |
| 5. Organisational policies |
Does your facility have a policy on test result communication? Do you have an EHR committee for oversight? | Having a test result communication policy is important to ensure that there is no ambiguity regarding acknowledgement and follow-up of alerts |
| 6. State and federal rules |
Are you aware of the VHA 2009 directive for communication of test results? | The VHA 2009-019 directives mandates that patients should be notified about all test results within 14 days |
| 7. Workflow and communication |
Do you have any mechanisms to prevent alerts from falling through the cracks/alerts being missed? Do you have a case manager that gets notified about certain abnormal results? Are alerts set to go to a team rather than specific provider? | Facilities should have mechanisms in place to make sure that critical alerts are not missed/lost. Back-up procedures to prevent alerts falling through cracks should be implemented |
| 8. Monitoring |
What monitoring practices do you have in place for follow-up of critical/abnormal diagnostic test results? Is acknowledgement and follow-up of alerts monitored at your facility? | In order to keep track of critical test result follow-up, good monitoring practices should be in place |
CPRS, computerised patient record system; EHR, electronic health record; HIT, health information technology; VHA, Veterans Healthcare Administration; VISN, Veterans Integrated Service Network.
Comparison of low and high perceived risk facilities on sociotechnical variables
| Sociotechnical variables | High perceived risk facilities | Low perceived risk facilities | Total | p Value |
|---|---|---|---|---|
| 1. Does the site have modified software? | ||||
| No | 20 (100.0) | 17 (85.0) | 37 (92.5) | 0.2308 |
| Yes | 0 (0.0) | 3 (15.0) | 3 (7.5) | |
| 2. Number of enabled alerts* | ||||
| 10 or less | 5 (27.8) | 6 (35.3) | 11 (31.4) | 0.6550 |
| 11 or more | 13 (72.2) | 11 (64.7) | 24 (68.6) | |
| 3. Number of mandatory alerts* | ||||
| 10 or less | 9 (45.0) | 5 (26.3) | 14 (35.9) | 0.4323 |
| 11 or more | 11 (55.0) | 14 (73.7) | 25 (64.1) | |
| 4. VISN level mandatory alerts | ||||
| No | 17 (85.0) | 18 (90.0) | 35 (87.5) | 1.00 |
| Yes | 3 (15.0) | 2 (10.0) | 5 (12.5) | |
| 5. How long alerts stay in the alert window?* | ||||
| >14 days | 15 (78.9) | 16 (88.9) | 31 (83.8) | 0.6599 |
| 14 days or less | 4 (21.1) | 2 (11.1) | 6 (16.2) | |
| 6. How often do you get calls from providers about missed/lost alerts* | ||||
| Every few months | 8 (40.0) | 10 (50.0) | 18 (45.0) | 0.8341 |
| At least once a month or more | 12 (60.0) | 10 (50.0) | 22 (55.0) | |
| 7. Time on EHR training* | ||||
| 2 h or less | 4 (21.1) | 1 (5.6) | 5 (13.5) | 0.3398 |
| More than 2 h | 15 (78.9) | 17 (94.4) | 32 (86.5) | |
| 8. Does the site have specific training on View Alerts? | ||||
| No | 11 (55.0) | 11 (55.0) | 22 (55.0) | 1.00 |
| Yes | 9 (45.0) | 9 (45.0) | 18 (45.0) | |
| 9. Time spent on View Alert training* | ||||
| 10 min or less | 5 (38.5) | 8 (50.0) | 13 (44.8) | 0.7107 |
| More than 10 min | 8 (61.5) | 8 (50.0) | 16 (55.2) | |
| 10. Does the site utilise super users | ||||
| No | 16 (80.0) | 15 (75.0) | 31 (77.5) | 1.00 |
| Yes | 4 (20.0) | 5 (25.0) | 9 (22.5) | |
| 11. Action taken for unacknowledged alerts* | ||||
| No action/do not know | 3 (15.0) | 1 (5.0) | 4 (10.0) | 0.605 |
| Some action taken | 17 (85.0) | 19 (95.0) | 36 (90.0) | |
| 12. Alerts go to team rather than providers | ||||
| No | 13 (65.0) | 12 (60.0) | 25 (62.5) | 1.00 |
| Yes | 7 (35.0) | 8 (40.0) | 15 (37.5) | |
| 13. Mandatory acknowledgment and follow-up of alerts | ||||
| No | 5 (25.0) | 9 (45.0) | 14 (35.0) | 0.3203 |
| Yes | 15 (75.0) | 11 (55.0) | 26 (65.0) | |
| 14. Programming, implementation and impact tracked? | ||||
| No | 14 (70.0) | 13 (65.0) | 27 (67.5) | 1.00 |
| Yes | 6 (30.0) | 7 (35.0) | 13 (32.5) | |
| 15. Monitoring practices for follow-up of critical tests? | ||||
| No | 4 (20.0) | 6 (30.0) | 10 (25.0) | 0.7164 |
| Yes | 16 (80.0) | 14 (70.0) | 30 (75.0) | |
| 16. Does test result alert policy address alert management? | ||||
| No | 6 (30.0) | 3 (15.0) | 9 (22.5) | 0.4506 |
| Yes | 14 (70.0) | 17 (85.0) | 31 (77.5) | |
| 17. Does the facility have an EHR committee? | ||||
| No | 11 (55.0) | 9 (45.0) | 20 (50.0) | 0.7524 |
| Yes | 9 (45.0) | 11 (55.0) | 20 (50.0) | |
| 18. Does your facility have a case manager who is notified of abnormal test? | ||||
| No | 15 (75.0) | 14 (70.0) | 29 (72.5) | 1.00 |
| Yes | 5 (25.0) | 6 (30.0) | 11 (27.5) | |
| 19. Mechanism to prevent alerts falling through the cracks | ||||
| No | 10 (50.0) | 2 (10.0) | 12 (30.0) | 0.0138 |
| Yes | 10 (50.0) | 18 (90.0) | 28 (70.0) | |
| 20. Awareness of VHA directive 2009-019? | ||||
| No | 4 (20.0) | 2 (10.0) | 6 (15.0) | 0.6614 |
| Yes | 16 (80.0) | 18 (90.0) | 34 (85.0) | |
*Categorisation of these variables was based on examination of the empirical distribution and clinical judgement of the research team regarding appropriate cut points. Wilcoxon rank-sum test did not reveal any differences between the high and low vulnerability facilities for these five variables, which we analysed both as continuous and as categorical. For ease of presentation, we have reported χ2 test statistics.
CPRS, computerised patient record system; EHR, electronic health record; HIT, health information technology; VHA, Veterans Healthcare Administration; VISN, Veterans Integrated Service Network.
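As an illustrative sketch (not the authors' code) of the χ2 comparison reported above, the one variable reaching significance, "Mechanism to prevent alerts falling through the cracks" (10 No/10 Yes in high-risk vs 2 No/18 Yes in low-risk facilities), can be checked in pure Python using the Yates-corrected Pearson χ2 formula for a 2×2 table:

```python
# 2x2 table for variable 19 (counts taken from the table above):
#                  No   Yes
# High risk        10    10
# Low risk          2    18
a, b, c, d = 10, 10, 2, 18
n = a + b + c + d  # 40 facilities

# Pearson chi-square statistic with Yates continuity correction (df = 1)
chi2 = n * (abs(a * d - b * c) - n / 2) ** 2 / (
    (a + b) * (c + d) * (a + c) * (b + d)
)

# 3.841 is the df = 1 critical value at alpha = 0.05, so the statistic
# indicates a significant association, consistent with the reported p = 0.0138
print(round(chi2, 3), chi2 > 3.841)  # → 5.833 True
```

The exact p value in the table may come from a different variant of the test (e.g. Fisher's exact test for small cells), but the direction and significance agree: low perceived-risk facilities were far more likely to report a back-up mechanism for preventing missed alerts.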