PURPOSE: To compare diabetic retinopathy (DR) referral recommendations made by viewing fundus images using a tablet computer with those made using a standard desktop display. METHODS: A tablet computer (iPad) and a desktop computer with a high-definition color display were compared. For each platform, 2 retinal specialists independently rated 1,200 color fundus images from patients at risk for DR using the annotation program Truthseeker. The specialists determined whether each image had referable DR and also how urgently each patient should be referred for medical examination. Graders viewed and rated the randomly presented images independently and were masked to their ratings on the alternative platform. Tablet-based and desktop display-based referral ratings were compared using cross-platform intraobserver kappa as the primary outcome measure. Additionally, interobserver kappa, sensitivity, specificity, and area under the receiver operating characteristic curve (AUC) were determined. RESULTS: A high level of cross-platform intraobserver agreement was found for the DR referral ratings between the platforms (κ = 0.778) and for the 2 graders (κ = 0.812). Interobserver agreement was similar for the 2 platforms (κ = 0.544 and κ = 0.625 for tablet and desktop, respectively). The tablet-based ratings achieved a sensitivity of 0.848, a specificity of 0.987, and an AUC of 0.950 compared with desktop display-based ratings. CONCLUSION: In this pilot study, tablet-based rating of color fundus images for subjects at risk for DR was consistent with desktop display-based rating. These results indicate that tablet computers can be reliably used for clinical evaluation of fundus images for DR.
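The agreement and screening statistics reported above (Cohen's kappa, sensitivity, specificity) can be computed directly from paired referable/non-referable labels. The sketch below is illustrative only, assuming a binary encoding (1 = referable DR, 0 = not referable); the rating lists are hypothetical and not drawn from the study data.

```python
def cohen_kappa(a, b):
    """Cohen's kappa: chance-corrected agreement between two raters.

    a, b: equal-length lists of category labels from each rater.
    """
    n = len(a)
    labels = set(a) | set(b)
    # Observed proportion of items on which the raters agree
    po = sum(x == y for x, y in zip(a, b)) / n
    # Expected chance agreement from each rater's marginal label frequencies
    pe = sum((a.count(lbl) / n) * (b.count(lbl) / n) for lbl in labels)
    return (po - pe) / (1 - pe)

def sensitivity_specificity(truth, pred, positive=1):
    """Sensitivity and specificity of `pred` against `truth` labels."""
    tp = sum(t == positive and p == positive for t, p in zip(truth, pred))
    fn = sum(t == positive and p != positive for t, p in zip(truth, pred))
    tn = sum(t != positive and p != positive for t, p in zip(truth, pred))
    fp = sum(t != positive and p == positive for t, p in zip(truth, pred))
    return tp / (tp + fn), tn / (tn + fp)
```

For example, treating the desktop ratings as the reference standard, `sensitivity_specificity(desktop_labels, tablet_labels)` yields the tablet's sensitivity and specificity against that reference, while `cohen_kappa(ratings_tablet, ratings_desktop)` for one grader gives a cross-platform intraobserver kappa in the spirit of the primary outcome measure.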