PURPOSE: To compare interobserver agreement in detecting glaucomatous optic disc change on serial stereophotographs between a large group of non-expert ophthalmologists and glaucoma specialists; to assess the accuracy of the non-experts; and to investigate whether the non-experts' interobserver agreement and accuracy changed after a training session.
DESIGN: Masked interobserver agreement study.
PARTICIPANTS: Serial optic disc stereophotographs from 40 patients with glaucoma.
METHODS: Three independent, experienced glaucoma specialists (readers of the European Glaucoma Prevention Study) evaluated pairs of serial optic disc color stereo slides, obtained 2 to 7 years apart from 40 patients, for glaucomatous change, masked to the temporal sequence of the slides. Each patient was graded as changed or stable by agreement of at least 2 of the 3 experts (the reference standard). Thirty-seven non-expert ophthalmologists independently evaluated the same set of serial stereo slides twice on the same day; the second evaluation followed a training session on a separate slide set and was masked to the results of the first.
MAIN OUTCOME MEASURES: Interobserver agreement of non-experts and experts in detecting glaucomatous optic disc change (expressed as the kappa coefficient), and agreement of the non-experts with the reference standard (accuracy) before and after the training session.
RESULTS: The interobserver kappa coefficient (κ) was 0.20 (95% confidence interval [CI], 0.19-0.21) for the non-experts and 0.51 (95% CI, 0.33-0.69) for the experts (P<0.0001). The mean κ of the non-experts against the reference standard was 0.33 (95% CI, 0.27-0.39). After the training session, the interobserver agreement of the non-experts increased from 0.20 to 0.27 (95% CI, 0.26-0.28; P<0.0001), and their percentage agreement with the reference standard improved from 68.5% to 71.4% (Wilcoxon signed-rank test, P=0.034).
CONCLUSIONS: The interobserver agreement of non-expert ophthalmologists in detecting glaucomatous optic disc change on serial stereophotographs was significantly lower than that of the experts, whose agreement was moderate. After a training session, both the interobserver agreement and the accuracy of the non-experts showed a small but statistically significant improvement.
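The main outcome measure above is the kappa coefficient, which corrects observed agreement for the agreement expected by chance. As a minimal illustrative sketch (not the study's actual multi-rater analysis, which would also require confidence intervals), Cohen's kappa for two raters grading each patient as changed or stable can be computed as follows; the gradings shown are hypothetical:

```python
def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters on categorical gradings."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: proportion of patients graded identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's marginal frequencies,
    # summed over the grading categories.
    cats = set(rater_a) | set(rater_b)
    p_e = sum((rater_a.count(c) / n) * (rater_b.count(c) / n) for c in cats)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical gradings for 10 patients (1 = changed, 0 = stable).
a = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
b = [1, 0, 0, 1, 0, 1, 1, 0, 1, 0]
print(round(cohens_kappa(a, b), 2))  # 8/10 observed vs 0.5 by chance -> 0.6
```

Note that percentage agreement alone (68.5% and 71.4% in the results) overstates reliability for a binary grading, which is why the study reports kappa as the primary agreement measure.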