
Interrater reliability and accuracy of clinicians and trained research assistants performing prospective data collection in emergency department patients with potential acute coronary syndrome.

Carlos O Cruz, Emily B Meshberg, Frances S Shofer, Christine M McCusker, Anna Marie Chang, Judd E Hollander.

Abstract

STUDY OBJECTIVE: Clinical research requires high-quality data collection. Data collected during the emergency department evaluation are generally considered more accurate than data obtained through chart abstraction, but prospective collection is cumbersome and time-consuming. We test whether trained research assistants without a medical background can obtain clinical research data as accurately as physicians. We hypothesize that they would be at least as accurate because they would not be distracted by clinical duties.
METHODS: We conducted a prospective comparative study of 33 trained research assistants and 39 physicians (35 residents) to assess interrater reliability with respect to guideline-recommended clinical research data. Immediately after the research assistant and clinician evaluations, the data were compared; when responses were discordant, a third person acted as tiebreaker, asking the patient to choose which of the 2 answers was correct. Crude percentage agreement and interrater reliability (kappa statistic) were assessed.
RESULTS: One hundred forty-three patients were recruited (mean age 50.7 years; 47% female patients). Overall, the median agreement was 81% (interquartile range [IQR] 73% to 92%) and interrater reliability was fair (kappa value 0.36 [IQR 0.26 to 0.52]) but varied across categories of data: cardiac risk factors (median 86% [IQR 81% to 93%]; median 0.69 [IQR 0.62 to 0.83]), other cardiac history (median 93% [IQR 79% to 95%]; median 0.56 [IQR 0.29 to 0.77]), pain location (median 92% [IQR 86% to 94%]; median 0.37 [IQR 0.25 to 0.29]), radiation (median 86% [IQR 85% to 87%]; median 0.37 [IQR 0.26 to 0.42]), quality (median 85% [IQR 75% to 94%]; median 0.29 [IQR 0.23 to 0.40]), and associated symptoms (median 74% [IQR 65% to 78%]; median 0.28 [IQR 0.20 to 0.40]). When discordant information was obtained, the research assistant was more often correct (median 64% [IQR 53% to 72%]).
CONCLUSION: The relatively fair interrater reliability observed in our study is consistent with previous inpatient studies of interrater reliability for cardiovascular disease. With respect to research data, we found that prospective ascertainment of clinical data is more often correct when performed by research assistants than by clinicians simultaneously evaluating patients.
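The two agreement measures used in the abstract (crude percentage agreement and the kappa statistic) can be illustrated with a minimal sketch. The rater labels below are hypothetical and are not data from the study; the code implements standard Cohen's kappa for two raters.

```python
from collections import Counter

def percent_agreement(a, b):
    """Crude percentage agreement: fraction of items where raters concur."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Cohen's kappa: agreement corrected for chance, (po - pe) / (1 - pe)."""
    n = len(a)
    po = percent_agreement(a, b)                 # observed agreement
    ca, cb = Counter(a), Counter(b)              # marginal label counts
    pe = sum(ca[k] * cb[k] for k in ca) / n**2   # agreement expected by chance
    return (po - pe) / (1 - pe)

# Hypothetical yes/no responses from a research assistant and a physician
ra = ["yes", "yes", "no", "no", "yes", "no"]
md = ["yes", "no",  "no", "no", "yes", "yes"]
print(round(percent_agreement(ra, md), 2))  # 0.67
print(round(cohens_kappa(ra, md), 2))       # 0.33
```

Note how kappa is far lower than raw agreement once chance concordance is removed; a kappa near 0.3 to 0.4 falls in the "fair" range reported in the results.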


Year:  2009        PMID: 19185392     DOI: 10.1016/j.annemergmed.2008.11.023

Source DB:  PubMed          Journal:  Ann Emerg Med        ISSN: 0196-0644            Impact factor:   5.721


  5 in total

1.  Patient reported outcomes in routine care: advancing data capture for HIV cohort research.

Authors:  Michael S Kozak; Michael J Mugavero; Jiatao Ye; Inmaculada Aban; Sarah T Lawrence; Christa R Nevin; James L Raper; Cheryl McCullumsmith; Joseph E Schumacher; Heidi M Crane; Mari M Kitahata; Michael S Saag; James H Willig
Journal:  Clin Infect Dis       Date:  2011-10-31       Impact factor: 9.079

2.  Cellular technology improves transmission success of pre-hospital electrocardiograms.

Authors:  Nicholas Larochelle; Michael O'Keefe; Daniel Wolfson; Kalev Freeman
Journal:  Am J Emerg Med       Date:  2013-09-25       Impact factor: 2.469

3.  Natural Language Processing for Automated Quantification of Brain Metastases Reported in Free-Text Radiology Reports.

Authors:  Joeky T Senders; Aditya V Karhade; David J Cote; Alireza Mehrtash; Nayan Lamba; Aislyn DiRisio; Ivo S Muskens; William B Gormley; Timothy R Smith; Marike L D Broekman; Omar Arnaout
Journal:  JCO Clin Cancer Inform       Date:  2019-04

4.  Comparative prospective study of the performance of chest pain scores and clinical assessment in an emergency department cohort in Singapore.

Authors:  Mingwei Ng; Hong Jie Gabriel Tan; Fei Gao; Jack Wei Chieh Tan; Swee Han Lim; Marcus Eng Hock Ong; R Ponampalam
Journal:  J Am Coll Emerg Physicians Open       Date:  2020-09-05

5.  Reliability of medical record abstraction by non-physicians for orthopedic research.

Authors:  Michael Y Mi; Jamie E Collins; Vladislav Lerner; Elena Losina; Jeffrey N Katz
Journal:  BMC Musculoskelet Disord       Date:  2013-06-09       Impact factor: 2.362


Beijing Coyote Bioscience Co., Ltd. © 2022-2023.