| Literature DB >> 30387908 |
Sherry G Mansour1,2, Isaac E Hall3, Peter P Reese4, Yaqi Jia1,5, Heather Thiessen-Philbrook1,5, Gilbert Moeckel6, Francis L Weng7, Monica P Revelo8, Mazdak A Khalighi8, Anshu Trivedi9, Mona D Doshi10, Bernd Schröppel11, Chirag R Parikh5.
Abstract
Prior studies demonstrate poor agreement among pathologists' interpretations of kidney biopsy slides. The reliability of representative images of these slides uploaded to the United Network for Organ Sharing (UNOS) web portal for clinician review has not been studied. We hypothesized high agreement among pathologists' image interpretations, since static images eliminate the variation induced by viewing different areas of movable slides. To test our hypothesis, we compared the assessments of UNOS-uploaded images recorded on standardized forms by three pathologists. We selected 100 image sets, each having at least two images from kidneys of deceased donors. Weighted Cohen's kappa was used to assess inter-rater agreement. Mean (SD) donor age was 50 (13) years. Acute tubular injury had kappas of 0.12, 0.14, and 0.19; arteriolar hyalinosis 0.16, 0.27, and 0.38; interstitial inflammation 0.30, 0.33, and 0.49; interstitial fibrosis 0.28, 0.32, and 0.67; arterial intimal fibrosis 0.34, 0.42, and 0.59; tubular atrophy 0.35, 0.41, and 0.52; glomerular thrombi 0.32, 0.53, and 0.85; and global glomerulosclerosis 0.68, 0.70, and 0.77. Pathologists' agreement thus demonstrated kappas of 0.12 to 0.77. The lower values raise concern about the reliability of using images. Although further research is needed to understand how uploaded images are used clinically, the field may consider higher-quality standards for biopsy photomicrographs.
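For context, weighted Cohen's kappa extends ordinary kappa to ordinal grading scales by penalizing disagreements in proportion to their distance (a one-grade disagreement counts less than a two-grade one). The abstract does not specify the weighting scheme, so the sketch below implements both linear and quadratic weights as an illustration; the rater data are hypothetical:

```python
def weighted_kappa(r1, r2, categories, weights="linear"):
    """Weighted Cohen's kappa for two raters grading the same items
    on an ordinal scale. `categories` lists the grades in order."""
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    n = len(r1)

    # Observed joint proportions: fraction of items rated (i, j).
    obs = [[0.0] * k for _ in range(k)]
    for a, b in zip(r1, r2):
        obs[idx[a]][idx[b]] += 1.0 / n

    # Marginal proportions for each rater.
    p1 = [sum(obs[i][j] for j in range(k)) for i in range(k)]
    p2 = [sum(obs[i][j] for i in range(k)) for j in range(k)]

    # Disagreement weight grows with distance between grades.
    def w(i, j):
        d = abs(i - j) / (k - 1)
        return d if weights == "linear" else d * d

    # Weighted observed vs. chance-expected disagreement.
    po = sum(w(i, j) * obs[i][j] for i in range(k) for j in range(k))
    pe = sum(w(i, j) * p1[i] * p2[j] for i in range(k) for j in range(k))
    return 1.0 - po / pe


# Hypothetical grades (0-3) from two raters on ten biopsies.
rater1 = [0, 1, 2, 3, 1, 2, 0, 1, 3, 2]
rater2 = [0, 1, 2, 2, 1, 3, 1, 1, 3, 2]
print(round(weighted_kappa(rater1, rater2, [0, 1, 2, 3]), 3))
```

Perfect agreement yields kappa = 1, and agreement no better than chance yields roughly 0; values such as 0.12 therefore indicate near-chance agreement.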
Keywords: agreement; biopsy; deceased donor; images; kappa; pathologists; renal; transplant
Year: 2018 PMID: 30387908 PMCID: PMC6317379 DOI: 10.1111/ctr.13441
Source DB: PubMed Journal: Clin Transplant ISSN: 0902-0063 Impact factor: 2.863