PURPOSE: To measure agreement and accuracy of plus disease diagnosis among retinopathy of prematurity (ROP) experts, and to compare expert performance with that of a computer-based image analysis system, Retinal Image multiScale Analysis (RISA). METHODS: Twenty-two recognized ROP experts independently interpreted a set of 34 wide-angle retinal photographs for the presence of plus disease. Diagnostic agreement was analyzed. A reference standard was defined based on the majority vote of experts. Images were analyzed using individual computer-based system parameters for arterioles and venules (integrated curvature [IC], diameter, and tortuosity index [TI]) and linear combinations of these parameters. Sensitivity, specificity, and receiver operating characteristic area under the curve (AUC) for plus disease diagnosis were determined for each expert and for the computer-based system. RESULTS: The mean kappa statistic for each expert compared with all others was between 0 and 0.20 (slight agreement) in 1 expert (4.5%), 0.21 and 0.40 (fair agreement) in 3 experts (13.6%), 0.41 and 0.60 (moderate agreement) in 12 experts (54.5%), and 0.61 and 0.80 (substantial agreement) in 6 experts (27.3%). For the 22 experts, sensitivity compared with the reference standard ranged from 0.308 to 1.000, specificity from 0.571 to 1.000, and AUC from 0.784 to 1.000. Among individual computer-based system parameters compared with the reference standard, venular IC had the highest AUC (0.853). Among linear combinations of parameters, the combination of arteriolar IC, arteriolar TI, venular IC, venular diameter, and venular TI had the highest AUC (0.967). CONCLUSION: Agreement and accuracy of plus disease diagnosis among ROP experts are imperfect. A computer-based system has the potential to diagnose with accuracy comparable to, or better than, that of human experts, but further validation is required.
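The kappa bands cited in the Results (slight, fair, moderate, substantial) follow the standard Landis–Koch interpretation. As an illustrative sketch only (not the study's actual analysis; the expert ratings below are hypothetical), Cohen's kappa between two raters over the same image set can be computed as:

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters grading the same set of images."""
    n = len(rater_a)
    # Observed agreement: fraction of images with identical labels
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's marginal label frequencies
    labels = set(rater_a) | set(rater_b)
    p_e = sum((rater_a.count(lab) / n) * (rater_b.count(lab) / n)
              for lab in labels)
    return (p_o - p_e) / (1 - p_e)

def landis_koch_band(kappa):
    """Interpretation bands used in the abstract (Landis & Koch, 1977)."""
    for upper, label in [(0.20, "slight"), (0.40, "fair"),
                         (0.60, "moderate"), (0.80, "substantial")]:
        if kappa <= upper:
            return label
    return "almost perfect"

# Hypothetical plus disease gradings of 6 images by two experts
expert_1 = ["plus", "plus", "no", "no", "plus", "no"]
expert_2 = ["plus", "no", "no", "no", "plus", "no"]
kappa = cohens_kappa(expert_1, expert_2)
print(round(kappa, 3), landis_koch_band(kappa))  # 0.667 substantial
```

The study's per-expert figure is the mean of such pairwise kappas against all 21 other experts; this sketch shows only a single pair.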
Authors: Michael F Chiang; Rony Gelman; Steven L Williams; Joo-Yeon Lee; Daniel S Casper; M Elena Martinez-Perez; John T Flynn Journal: Invest Ophthalmol Vis Sci Date: 2008-04-11 Impact factor: 4.799