Seong Hoon Lee1, Kah Long Aw1, Ferghal McVerry1, Mark O McCarron1. 1. School of Medicine, Dentistry and Biomedical Sciences (SHL, KLA), Queen's University Belfast, Belfast; and Department of Neurology (FM, MOM), Altnagelvin Hospital, Derry, United Kingdom.
Abstract
OBJECTIVE: To determine interrater agreement for TIA diagnosis among expert clinicians (neurologists/stroke physicians), administrative data, and nonspecialists.
METHODS: We performed a meta-analysis of studies published from January 1984 to January 2019, identified through MEDLINE, EMBASE, and PubMed. Two reviewers independently screened for eligible studies and extracted interrater agreement measurements, using Cohen's kappa (κ) to assess diagnostic agreement.
RESULTS: Nineteen original studies comprising 19,421 patients were included. Expert clinicians demonstrated good agreement for TIA diagnosis (κ = 0.71, 95% confidence interval [CI] = 0.62-0.81). Agreement between clinicians' TIA diagnoses and administrative data was also good (κ = 0.68, 95% CI = 0.62-0.74). There was moderate agreement (κ = 0.41, 95% CI = 0.22-0.61) between referring clinicians and the clinicians at TIA clinics receiving the referrals. Sixty percent of 748 patient referrals to TIA clinics were TIA mimics.
CONCLUSIONS: Overall agreement between expert clinicians was good for TIA diagnosis, although variation still existed in a sizeable proportion of cases. Diagnostic agreement for TIA decreased among nonspecialists. The number of patients referred to TIA clinics with other (often neurologic) diagnoses was substantial, suggesting that TIA clinics should be run by clinicians proficient in managing both TIAs and their mimics.