Kathryn Shaw-Saliba, Bhakti Hansoti, Howard Burkom, Diego A Martinez, Anna DuVal, Brian Lee, Phong Chau, Breanna McBride, Yu-Hsiang Hsieh, Vidiya Sathananthan, David Persing, Michael Turnlund, Roxanne Shively, Andrea Dugas, Richard E Rothman.
Abstract
INTRODUCTION: Electronic influenza surveillance systems aid in health surveillance and clinical decision-making within the emergency department (ED). While major advances have been made in integrating clinical decision-making tools within the electronic health record (EHR), tools for sharing surveillance data are often piecemeal, requiring data downloads and manual uploads to shared servers and delaying the time from data acquisition to end user. Real-time surveillance can help both clinicians and public health professionals recognize circulating influenza earlier in the season and maintain ongoing situational awareness.
Year: 2022 PMID: 35302441 PMCID: PMC8967469 DOI: 10.5811/westjem.2021.9.52741
Source DB: PubMed Journal: West J Emerg Med ISSN: 1936-900X
Comparison of demographics, EMR, testing strategy and reporting by study sites.
| Facility | Annual volume | Electronic health record | Influenza testing strategy pre-implementation | Data reporting pre-implementation |
|---|---|---|---|---|
| Site 1 | 66,000 | EPIC | Physician gestalt, ie, testing based on the clinical determination of the treating physician (resident or attending). | Weekly reporting to Maryland DOH |
| Site 2 | 65,000 | ORCHID | | Weekly reporting to LAC DPH |
| Site 3 | 60,000 | EPIC | | Weekly reporting to AZ DHS |
| Site 4 | 62,000 | Cerner | | Weekly reporting to KDHEKS |
DOH, Department of Health; LAC DPH, Los Angeles County Department of Public Health; AZ DHS, Arizona Department of Health Services; KDHEKS, Kansas Department of Health & Environment.
Figure 1. Preparatory steps for surveillance implementation.
Figure 2. Data flow for cloud-based data aggregation system.
Figure 3. Schematic of architecture and data flow for the operational pilot surveillance system (RemoteXpert).
Figure 4. Rendering of the key data elements available on the RemoteXpert dashboard.
Cumulative data from the pilot implementation of the integrated influenza surveillance system.
| | All sites | JHU | UCLA | MMC | TMC |
|---|---|---|---|---|---|
| Patients through department | 126,539 | 33,500 | 42,091 | 24,681 | 26,267 |
| Patients assessed by CDG N(%) | 118,916 (94%) | 30,516 (91%) | 38,741 (92%) | 23,603 (96%) | 26,056 (99%) |
| Patient who met criteria N(%) | 6,955 (6%) | 2,079 (7%) | 2,582 (7%) | 1,368 (6%) | 926 (4%) |
| Xpert Flu tests ordered N(%) | 6,601 (95%) | 2,000 (96%) | 2,362 (91%) | 1,313 (96%) | 926 (100%) |
| Specimens collected N(%) | 5,939 (90%) | 1,710 (86%) | 2,019 (85%) | 1,284 (98%) | 926 (100%) |
| Tests resulted N(%) | 5,937 (100%) | 1,710 (100%) | 2,017 (100%) | 1,284 (100%) | 926 (100%) |
| Test results appearing in EHR N(%) | 5,937 (100%) | 1,710 (100%) | 2,017 (100%) | 1,284 (100%) | 926 (100%) |
| Patients positive for influenza N(%) | 1,070 (18%) | 323 (19%) | 367 (18%) | 202 (16%) | 178 (19%) |
JHU, Johns Hopkins University; UCLA, University of California, Los Angeles; MMC, Maricopa Medical Center; TMC, Truman Medical Center; CDG, clinical decision guideline; EHR, electronic health record.
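The "All sites" column forms a testing funnel in which each row's percentage is computed relative to the row above it (eg, tests ordered as a share of patients who met criteria, specimens collected as a share of tests ordered). A minimal sketch that recomputes those rates, with counts copied from the table and the rounding convention inferred from the printed values:

```python
# Recompute the "All sites" percentages from the pilot implementation table.
# Each stage's percentage is taken relative to the preceding stage's count.
funnel = [
    ("Patients through department", 126_539),
    ("Patients assessed by CDG", 118_916),
    ("Patients who met criteria", 6_955),
    ("Xpert Flu tests ordered", 6_601),
    ("Specimens collected", 5_939),
    ("Tests resulted", 5_937),
    ("Test results appearing in EHR", 5_937),
    ("Patients positive for influenza", 1_070),
]

def funnel_rates(stages):
    """Return (label, count, percent-of-previous-stage) for each stage."""
    rows, prev = [], None
    for label, count in stages:
        pct = None if prev is None else round(100 * count / prev)
        rows.append((label, count, pct))
        prev = count
    return rows

for label, count, pct in funnel_rates(funnel):
    suffix = f" ({pct}%)" if pct is not None else ""
    print(f"{label}: {count:,}{suffix}")
```

Running this reproduces the published column exactly (94%, 6%, 95%, 90%, 100%, 100%, 18%), which confirms the stage-relative reading of the table's percentages.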
Management of interoperability issues based on guiding principles of the Office of the National Coordinator for Health Information Technology.
| ONC’s essential services for interoperability | Project-specific task requirement | Issue | Management of issue |
|---|---|---|---|
| Accurately match individuals, providers, and their information across data sources. | Bi-directional data transfer to combine influenza test result, time, the location from GeneXpert with demographics | Required demographic details were stored in different systems (eg, LIS, EHR) | Data were manually entered by research coordinators into a laptop containing the test result data, combined, and uploaded to a cloud-based interface by means of Cepheid software. |
| Directories of the technical and human-readable endpoints for data sources, so they and the respective data are discoverable. | Provide access to test results from cloud-based systems to distributed user locations | No natural interface because of differences among users’ proprietary systems. | Users accessed the system through a password-protected Cepheid site, where they could also designate upload destinations for further analysis. |
| Authorizing users to access data from the data sources | Provide end-users with a hierarchy of access privileges | None | A logical hierarchy of access privileges was implemented. |
| Authenticating users when they want to access data from data sources | Control access with individual user accounts | A mass invitation provided by Cepheid was rejected by some site firewalls | Access control and definitions were handled via the CanCan Ruby library, which provides a declarative authorization DSL for specifying model permissions and enforcing them at the controller level. |
| Securing the data when it is stored or maintained in the data sources and in transit, ie, when it moves between source and user | De-identified data must remain secure during transmission between local testing site and cloud-based system. | None | Test results entering the Cepheid cloud portal via a secured transport layer security channel were processed by a test results processor service to generate non-sensitive aggregations that the cloud software could leverage for analysis, visualization, technical support, and other administrative functions. |
| Representing data at a granular level to enable reuse | Transmit to CDC database using HL7 code and be accessible to other end-users via dashboards and comma-delimited files | Visualization and detailed data needs varied among users. | Dashboard designs for visualization were adapted from previous Cepheid systems and made available through a secure website in the Cepheid cloud. Designs were adapted to the needs of influenza surveillance users; for example, a “medical dashboard” allows inspection of data aggregated by location and laboratory. |
| Handling information from varied information sources in both structured and unstructured formats | The cloud-based system had to receive both test results and demographic data in available formats. | Demographic data were represented in different formats across user sites. | The Cepheid software for uploading and managing data accepted only structured data records; GeneXpert test results arrived in the correct structure automatically, but site coordinators had to ensure the formatting and completeness of demographic details merged from the LIS. |
ONC, Office of the National Coordinator for Health Information Technology; LIS, laboratory information system software; EHR, electronic health record; DSL, domain-specific language.
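The table describes a "logical hierarchy of access privileges" with permissions declared per action and resource and enforced centrally (the pilot used the CanCan Ruby library for this). A minimal Python sketch of the same idea, assuming illustrative role names and rules that are not the project's actual configuration:

```python
# Hypothetical role-based permission model in the spirit of the access
# hierarchy described above. The pilot itself used the CanCan Ruby library;
# the roles, resources, and rules here are illustrative only.
from dataclasses import dataclass

# Roles ordered from least to most privileged; a higher role inherits
# everything granted to the roles below it.
ROLE_ORDER = ["site_coordinator", "clinician", "public_health", "admin"]

# Declarative rules: the minimum role required for each (action, resource).
RULES = {
    ("read", "aggregate_counts"): "site_coordinator",
    ("read", "test_results"): "clinician",
    ("export", "csv"): "public_health",
    ("manage", "user_accounts"): "admin",
}

@dataclass
class User:
    name: str
    role: str

def can(user: User, action: str, resource: str) -> bool:
    """True if the user's role is at or above the rule's minimum role."""
    required = RULES.get((action, resource))
    if required is None:
        return False  # no rule declared -> deny by default
    return ROLE_ORDER.index(user.role) >= ROLE_ORDER.index(required)

epi = User("epi-on-call", "public_health")
print(can(epi, "export", "csv"))            # prints True
print(can(epi, "manage", "user_accounts"))  # prints False
```

Centralizing the rules in one declarative mapping, rather than scattering checks through the application, is what lets a single enforcement point (the controller layer, in the CanCan case) mediate every access.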