
A reference standard-based quality assurance program for radiology.

Patrick T Liu, C Daniel Johnson, Rafael Miranda, Maitray D Patel, Carrie J Phillips.

Abstract

The authors have developed a comprehensive radiology quality assurance (QA) program that evaluates radiology interpretations and procedures by comparing them with reference standards. Performance metrics are calculated and then compared with benchmarks or goals on the basis of published multicenter data and meta-analyses. Additional workload for physicians is kept to a minimum by having trained allied health staff members perform the comparisons of radiology reports with the reference standards. The performance metrics tracked by the QA program include the accuracy of CT colonography for detecting polyps, the false-negative rate for mammographic detection of breast cancer, the accuracy of CT angiography detection of coronary artery stenosis, the accuracy of meniscal tear detection on MRI, the accuracy of carotid artery stenosis detection on MR angiography, the accuracy of parathyroid adenoma detection by parathyroid scintigraphy, the success rate for obtaining cortical tissue on ultrasound-guided core biopsies of pelvic renal transplants, and the technical success rate for peripheral arterial angioplasty procedures. In contrast with peer-review programs, this reference standard-based QA program minimizes the possibilities of reviewer bias and erroneous second reviewer interpretations. The more objective assessment of performance afforded by the QA program will provide data that can easily be used for education and management conferences, research projects, and multicenter evaluations. Additionally, such performance data could be used by radiology departments to demonstrate their value over nonradiology competitors to referring clinicians, hospitals, patients, and third-party payers. Copyright 2010 American College of Radiology. Published by Elsevier Inc. All rights reserved.


Year:  2010        PMID: 20129274     DOI: 10.1016/j.jacr.2009.08.016

Source DB:  PubMed          Journal:  J Am Coll Radiol        ISSN: 1546-1440            Impact factor:   5.532


Related articles:  5 in total

1.  Interrater variation in scoring radiological discrepancies.

Authors:  B Mucci; H Murray; A Downie; K Osborne
Journal:  Br J Radiol       Date:  2013-07-05       Impact factor: 3.039

2.  Developing an automated database for monitoring ultrasound- and computed tomography-guided procedure complications and diagnostic yield.

Authors:  Jason N Itri; Lisa P Jones; Woojin Kim; William W Boonn; Ana S Kolansky; Susan Hilton; Hanna M Zafar
Journal:  J Digit Imaging       Date:  2014-04       Impact factor: 4.056

3.  Unbiased review of digital diagnostic images in practice: informatics prototype and pilot study.

Authors:  Anthony F Fotenos; Nabile M Safdar; Paul G Nagy; Reuben Mezrich; Jonathan S Lewin
Journal:  Acad Radiol       Date:  2012-10-26       Impact factor: 3.173

4.  A computer-assisted systematic quality monitoring method for cervical hip fracture radiography.

Authors:  Mats Geijer; Olof Laurin; Ragnar Johnsson; Sven Laurin
Journal:  Acta Radiol Open       Date:  2016-12-05

5.  Added value of double reading in diagnostic radiology: a systematic review. [Review]

Authors:  Håkan Geijer; Mats Geijer
Journal:  Insights Imaging       Date:  2018-03-28
