
Classification of emergency department chief complaints into 7 syndromes: a retrospective analysis of 527,228 patients.

Wendy W Chapman, John N Dowling, Michael M Wagner.

Abstract

STUDY OBJECTIVE: Electronic surveillance systems often monitor triage chief complaints in hopes of detecting an outbreak earlier than can be accomplished with traditional reporting methods. We measured the accuracy of a Bayesian chief complaint classifier called CoCo that assigns patients to 1 of 7 syndromic categories (respiratory, botulinic, gastrointestinal, neurologic, rash, constitutional, or hemorrhagic) based on free-text triage chief complaints.
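This record does not describe CoCo's internal model or features, only that it is Bayesian. As an illustrative sketch of the general approach (a toy naive Bayes classifier over chief-complaint tokens; the class name, training strings, and tokenization here are hypothetical, not CoCo's):

```python
# Illustrative naive Bayes chief-complaint classifier (NOT CoCo itself;
# CoCo's features and training data are not described in this record).
from collections import Counter, defaultdict
import math

class ToyChiefComplaintNB:
    def __init__(self):
        self.word_counts = defaultdict(Counter)  # syndrome -> token counts
        self.class_counts = Counter()            # syndrome -> training cases
        self.vocab = set()

    def train(self, complaint: str, syndrome: str) -> None:
        self.class_counts[syndrome] += 1
        for tok in complaint.lower().split():
            self.word_counts[syndrome][tok] += 1
            self.vocab.add(tok)

    def classify(self, complaint: str) -> str:
        tokens = complaint.lower().split()
        total = sum(self.class_counts.values())
        best, best_logp = None, float("-inf")
        for syn in self.class_counts:
            logp = math.log(self.class_counts[syn] / total)  # class prior
            denom = sum(self.word_counts[syn].values()) + len(self.vocab)
            for tok in tokens:
                # Laplace smoothing so unseen tokens don't zero out the score
                logp += math.log((self.word_counts[syn][tok] + 1) / denom)
            if logp > best_logp:
                best, best_logp = syn, logp
        return best

clf = ToyChiefComplaintNB()
clf.train("cough shortness of breath", "respiratory")
clf.train("nausea vomiting diarrhea", "gastrointestinal")
print(clf.classify("cough and fever"))  # → respiratory
```

A production classifier would be trained on many thousands of labeled complaints and would handle abbreviations and misspellings common in triage text; this sketch only shows the Bayesian scoring mechanics.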
METHODS: We compared CoCo's classifications with criterion syndromic classification based on International Classification of Diseases, Ninth Revision (ICD-9) discharge diagnoses. We assigned the criterion classification to a patient based on whether the patient's primary diagnosis was a member of a set of ICD-9 codes associated with CoCo's 7 syndromes. We tested CoCo's performance on a set of 527,228 chief complaints from patients registered at the University of Pittsburgh Medical Center emergency department (ED) between 1990 and 2003. We performed a sensitivity analysis by varying the ICD-9 codes in the criterion standard. We also tested CoCo on chief complaints from EDs in a second location (Utah).
RESULTS: Approximately 16% (85,569/527,228) of the patients were classified according to the criterion standard into 1 of the 7 syndromes. CoCo's classification performance (number of cases by criterion standard, sensitivity [95% confidence interval (CI)], and specificity [95% CI]) was respiratory (34,916, 63.1 [62.6 to 63.6], 94.3 [94.3 to 94.4]); botulinic (1,961, 30.1 [28.2 to 32.2], 99.3 [99.3 to 99.3]); gastrointestinal (20,431, 69.0 [68.4 to 69.6], 95.6 [95.6 to 95.7]); neurologic (7,393, 67.6 [66.6 to 68.7], 92.7 [92.6 to 92.8]); rash (2,232, 46.8 [44.8 to 48.9], 99.3 [99.3 to 99.3]); constitutional (10,603, 45.8 [44.9 to 46.8], 96.6 [96.6 to 96.7]); and hemorrhagic (8,033, 75.2 [74.3 to 76.2], 98.5 [98.4 to 98.5]). The sensitivity analysis showed that the results were not affected by the choice of ICD-9 codes in the criterion standard. Classification accuracy did not differ on chief complaints from the second location.
CONCLUSION: Our results suggest that, for most syndromes, our chief complaint classification system can identify about half of the patients with relevant syndromic presentations, with specificities higher than 90% and positive predictive values ranging from 12% to 44%.
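The interval widths reported in the results are consistent with a simple normal-approximation (Wald) confidence interval for a proportion; for example, the respiratory sensitivity row can be reproduced from the reported point estimate and case count. A sketch (assuming a Wald CI, which the paper may or may not have used):

```python
import math

def wald_ci(p: float, n: int, z: float = 1.96) -> tuple[float, float]:
    """Normal-approximation (Wald) 95% CI for a proportion p over n cases."""
    half = z * math.sqrt(p * (1.0 - p) / n)
    return p - half, p + half

# Respiratory syndrome from the abstract: 34,916 criterion-standard cases,
# sensitivity point estimate 63.1%.
lo, hi = wald_ci(0.631, 34916)
print(f"63.1 [{lo * 100:.1f} to {hi * 100:.1f}]")  # → 63.1 [62.6 to 63.6]
```

With n in the tens of thousands the interval is very narrow, which is why most of the reported CIs span only about one percentage point.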


Year:  2005        PMID: 16271676     DOI: 10.1016/j.annemergmed.2005.04.012

Source DB:  PubMed          Journal:  Ann Emerg Med        ISSN: 0196-0644            Impact factor:   5.721


Related articles: 27 in total

1.  Supporting patient care in the emergency department with a computerized whiteboard system.

Authors:  Dominik Aronsky; Ian Jones; Kevin Lanaghan; Corey M Slovis
Journal:  J Am Med Inform Assoc       Date:  2007-12-20       Impact factor: 4.497

2.  Optimizing a syndromic surveillance text classifier for influenza-like illness: Does document source matter?

Authors:  Brett R South; Wendy W Chapman; Sylvain Delisle; Shuying Shen; Ericka Kalp; Trish Perl; Matthew H Samore; Adi V Gundlapalli
Journal:  AMIA Annu Symp Proc       Date:  2008-11-06

3.  Detecting Suicide-Related Emergency Department Visits Among Adults Using the District of Columbia Syndromic Surveillance System.

Authors:  S Janet Kuramoto-Crawford; Erica L Spies; John Davies-Cole
Journal:  Public Health Rep       Date:  2017 Jul/Aug       Impact factor: 2.792

4.  Generating a reliable reference standard set for syndromic case classification.

Authors:  Wendy W Chapman; John N Dowling; Michael M Wagner
Journal:  J Am Med Inform Assoc       Date:  2005-07-27       Impact factor: 4.497

5.  Consensus Development of a Modern Ontology of Emergency Department Presenting Problems-The Hierarchical Presenting Problem Ontology (HaPPy).

Authors:  Steven Horng; Nathaniel R Greenbaum; Larry A Nathanson; James C McClay; Foster R Goss; Jeffrey A Nielson
Journal:  Appl Clin Inform       Date:  2019-06-12       Impact factor: 2.342

6.  Comparison of machine learning classifiers for influenza detection from emergency department free-text reports.

Authors:  Arturo López Pineda; Ye Ye; Shyam Visweswaran; Gregory F Cooper; Michael M Wagner; Fuchiang Rich Tsui
Journal:  J Biomed Inform       Date:  2015-09-16       Impact factor: 6.317

7.  The value of patient self-report for disease surveillance.

Authors:  Florence T Bourgeois; Stephen C Porter; Clarissa Valim; Tiffany Jackson; E Francis Cook; Kenneth D Mandl
Journal:  J Am Med Inform Assoc       Date:  2007-08-21       Impact factor: 4.497

8.  Emergency department chief complaint and diagnosis data to detect influenza-like illness with an electronic medical record.

Authors:  Larissa S May; Beth Ann Griffin; Nicole Maier Bauers; Arvind Jain; Marsha Mitchum; Neal Sikka; Marianne Carim; Michael A Stoto
Journal:  West J Emerg Med       Date:  2010-02

9.  Summary of data reported to CDC's national automated biosurveillance system, 2008.

Authors:  Jerome I Tokars; Roseanne English; Paul McMurray; Barry Rhodes
Journal:  BMC Med Inform Decis Mak       Date:  2010-05-25       Impact factor: 2.796

10.  Under-documentation of chronic kidney disease in the electronic health record in outpatients.

Authors:  Herbert S Chase; Jai Radhakrishnan; Shayan Shirazian; Maya K Rao; David K Vawdrey
Journal:  J Am Med Inform Assoc       Date:  2010 Sep-Oct       Impact factor: 4.497

