
Interrater agreement in the evaluation of discrepant imaging findings with the Radpeer system.

Leila C Bender, Ken F Linnau, Eric N Meier, Yoshimi Anzai, Martin L Gunn.

Abstract

OBJECTIVE: The Radpeer system is central to the quality assurance process in many radiology practices. Previous studies have shown poor agreement between physicians in the evaluation of their peers. The purpose of this study was to assess the reliability of the Radpeer scoring system.
MATERIALS AND METHODS: A sample of 25 discrepant cases was extracted from our quality assurance database. Images were made anonymous; associated reports and identities of interpreting radiologists were removed. Indications for the studies and descriptions of the discrepancies were provided. Twenty-one subspecialist attending radiologists rated the cases using the Radpeer scoring system. Multirater kappa statistics were used to assess interrater agreement, both with the standard scoring system and with dichotomized scores to reflect the practice of further review for cases rated 3 and 4. Subgroup analyses were conducted to assess subspecialist evaluation of cases.
RESULTS: Interrater agreement was slight to fair compared with that expected by chance. For the group of 21 raters, the kappa values were 0.11 (95% CI, 0.06-0.16) with the standard scoring system and 0.20 (95% CI, 0.13-0.27) with dichotomized scores. There was disagreement about whether a discrepancy had occurred in 20 cases. Subgroup analyses did not reveal significant differences in the degree of interrater agreement.
CONCLUSION: The identification of discrepant interpretations is valuable for the education of individual radiologists and for larger-scale quality assurance and quality improvement efforts. Our results show that a ratings-based peer review system is unreliable and subjective for the evaluation of discrepant interpretations. Resources should be devoted to developing more robust and objective assessment procedures, particularly those with clear quality improvement goals.
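The abstract reports multirater kappa statistics for 21 raters, both on the standard four-point Radpeer scale and on dichotomized scores. As a minimal sketch of how such a statistic is computed, the following assumes Fleiss' kappa (a common multirater generalization; the abstract does not name the exact statistic used) applied to invented dichotomized ratings, where column 0 counts raters scoring a case 1-2 and column 1 counts raters scoring it 3-4:

```python
def fleiss_kappa(counts):
    """Fleiss' kappa for a ratings table.

    counts: list of rows, one per case; counts[i][j] is the number of
    raters who assigned case i to category j. Every row must sum to the
    same number of raters.
    """
    n_items = len(counts)
    n_raters = sum(counts[0])
    n_total = n_items * n_raters

    # Mean observed pairwise agreement across cases.
    p_bar = sum(
        sum(c * (c - 1) for c in row) / (n_raters * (n_raters - 1))
        for row in counts
    ) / n_items

    # Expected agreement from the marginal category proportions.
    p_e = sum(
        (sum(row[j] for row in counts) / n_total) ** 2
        for j in range(len(counts[0]))
    )
    return (p_bar - p_e) / (1 - p_e)

# Hypothetical dichotomized ratings for 5 cases by 21 raters
# (illustrative values only, not the study's data):
ratings = [
    [15, 6],
    [4, 17],
    [11, 10],
    [20, 1],
    [9, 12],
]
print(round(fleiss_kappa(ratings), 3))  # -> 0.234
```

A kappa near 0 indicates agreement barely above chance; values around 0.11-0.20, as reported in the study, fall in the "slight to fair" range.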


Year:  2012        PMID: 23169725     DOI: 10.2214/AJR.12.8972

Source DB:  PubMed          Journal:  AJR Am J Roentgenol        ISSN: 0361-803X            Impact factor:   3.959


Related articles: 13 in total

1.  The quality movement or making radiology fun again.

Authors:  C Craig Blackmore
Journal:  Emerg Radiol       Date:  2015-02-12

2.  The "Reading the Mind in the Eyes" Test: Investigation of Psychometric Properties and Test-Retest Reliability of the Persian Version.

Authors:  Behzad S Khorashad; Simon Baron-Cohen; Ghasem M Roshan; Mojtaba Kazemian; Ladan Khazai; Zahra Aghili; Ali Talaei; Mozhgan Afkhamizadeh
Journal:  J Autism Dev Disord       Date:  2015-09

3.  Survey of peer review programs among pediatric radiologists: report from the SPR Quality and Safety Committee.

Authors:  Ramesh S Iyer; David W Swenson; Neil Anand; Einat Blumfield; Tushar Chandra; Govind B Chavhan; Thomas R Goodman; Naeem Khan; Michael M Moore; Thang D Ngo; Christina L Sammet; Raymond W Sze; Chido D Vera; A Luana Stanescu
Journal:  Pediatr Radiol       Date:  2019-03-29

4.  PathBot: A Radiology-Pathology Correlation Dashboard.

Authors:  Linda C Kelahan; Amit D Kalaria; Ross W Filice
Journal:  J Digit Imaging       Date:  2017-12       Impact factor: 4.056

5.  Implementation and Validation of PACS Integrated Peer Review for Discrepancy Recording of Radiology Reporting.

Authors:  A W Olthof; P M A van Ooijen
Journal:  J Med Syst       Date:  2016-07-21       Impact factor: 4.460

6.  Interobserver agreement in the interpretation of outpatient head CT scans in an academic neuroradiology practice.

Authors:  G Guérin; S Jamali; C A Soto; F Guilbert; J Raymond
Journal:  AJNR Am J Neuroradiol       Date:  2014-07-24       Impact factor: 3.825

7.  A workstation-integrated peer review quality assurance program: pilot study.

Authors:  Margaret M O'Keeffe; Todd M Davis; Kerry Siminoski
Journal:  BMC Med Imaging       Date:  2013-07-04       Impact factor: 1.930

8.  Radiologist-initiated double reading of abdominal CT: retrospective analysis of the clinical importance of changes to radiology reports.

Authors:  Peter Mæhre Lauritzen; Jack Gunnar Andersen; Mali Victoria Stokke; Anne Lise Tennstrand; Rolf Aamodt; Thomas Heggelund; Fredrik A Dahl; Gunnar Sandbæk; Petter Hurlen; Pål Gulbrandsen
Journal:  BMJ Qual Saf       Date:  2016-03-24       Impact factor: 7.035

Review 9.  Added value of double reading in diagnostic radiology: a systematic review.

Authors:  Håkan Geijer; Mats Geijer
Journal:  Insights Imaging       Date:  2018-03-28

Review 10.  Error and discrepancy in radiology: inevitable or avoidable?

Authors:  Adrian P Brady
Journal:  Insights Imaging       Date:  2016-12-07
