OBJECTIVE: To determine the accuracy of self-reported information from patients and families for use in a disease surveillance system. DESIGN: Patients and their parents presenting to the emergency department (ED) waiting room of an urban, tertiary care children's hospital were asked to complete a Self-Report Tool, a questionnaire about the subject's current illness. MEASUREMENTS: The sensitivity and specificity of three data sources for assigning patients to disease categories were measured: the ED chief complaint, physician diagnostic coding, and the completed Self-Report Tool. The gold standard for comparison was medical record abstraction. RESULTS: A total of 936 subjects were enrolled. Compared with ED chief complaints, the Self-Report Tool was more than twice as sensitive in identifying respiratory illnesses (rate ratio [RR]: 2.10, 95% confidence interval [CI] 1.81-2.44) and dermatological problems (RR: 2.23, 95% CI 1.56-3.17), and significantly more sensitive in detecting fever (RR: 1.90, 95% CI 1.67-2.17), gastrointestinal problems (RR: 1.10, 95% CI 1.00-1.20), and injuries (RR: 1.16, 95% CI 1.08-1.24). Sensitivities were also significantly higher when the Self-Report Tool was compared with diagnostic codes, with sensitivity rate ratios of 4.42 (95% CI 3.45-5.68) for fever, 1.70 (95% CI 1.49-1.93) for respiratory problems, 1.15 (95% CI 1.04-1.27) for gastrointestinal problems, 2.02 (95% CI 1.42-2.87) for dermatologic problems, and 1.06 (95% CI 1.01-1.11) for injuries. CONCLUSIONS: Disease category assignment based on patient-reported information was significantly more sensitive in correctly identifying a disease category than the data currently used by national and regional disease surveillance systems.
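The sensitivity rate ratios reported above compare two proportions (each data source's true positives over all gold-standard positives). A minimal sketch of this arithmetic, using the Katz log-based confidence interval for a ratio of proportions; the counts are hypothetical illustrations, not the study's data, and an independent-samples CI is only an approximation when both sources are scored on the same subjects:

```python
import math

def sensitivity(tp: int, fn: int) -> float:
    """Sensitivity = TP / (TP + FN), with the gold standard defining positives."""
    return tp / (tp + fn)

def sensitivity_rate_ratio(tp1: int, fn1: int, tp2: int, fn2: int, z: float = 1.96):
    """Ratio of two sensitivities with a Katz log-based 95% CI.

    Source 1 (e.g. Self-Report Tool) over source 2 (e.g. chief complaint),
    treating the two 2x2 margins as independent samples (an approximation).
    """
    rr = sensitivity(tp1, fn1) / sensitivity(tp2, fn2)
    # Standard error of log(RR) for a ratio of two binomial proportions
    se = math.sqrt(1 / tp1 - 1 / (tp1 + fn1) + 1 / tp2 - 1 / (tp2 + fn2))
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical counts: source 1 finds 180 of 200 true cases, source 2 finds 90.
rr, lo, hi = sensitivity_rate_ratio(tp1=180, fn1=20, tp2=90, fn2=110)
print(f"RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A CI lying entirely above 1.0 (as for fever, respiratory, and dermatologic categories above) is what licenses the abstract's "significantly more sensitive" claim.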