| Literature DB >> 36245758 |
Alec Bernard1, Shang Zhou Xia2, Sahal Saleh1, Tochukwu Ndukwe1, Joshua Meyer2, Elliot Soloway2, Mandefro Sintayehu3, Blen Teshome Ramet4, Bezawit Tadegegne3, Christine Nelson5, Hakan Demirci5.
Abstract
Purpose: Early diagnosis and treatment of retinoblastoma are of paramount importance for a positive clinical outcome. The most common sign of retinoblastoma is leukocoria, or white pupil. Effective, easy-to-perform, community-based screening is needed to improve outcomes in lower-income regions. The EyeScreen Android (Google LLC) smartphone application, developed by Joshua Meyer at the University of Michigan, is an important step toward addressing this need. The purpose of this study was to examine the potential of a novel combination of low-cost technologies, a cell phone application and machine learning, to identify leukocoria.
Design: A cell phone application was developed and refined with feedback from on-site, single-population use in Ethiopia. Application performance was evaluated in this technology validation study.
Participants: One thousand four hundred fifty-seven participants were recruited from ophthalmology and pediatric clinics in Addis Ababa, Ethiopia.
Keywords: Application; Machine learning; ROC (receiver operating characteristic); Retinoblastoma; Screening
Year: 2022 PMID: 36245758 PMCID: PMC9560653 DOI: 10.1016/j.xops.2022.100158
Source DB: PubMed Journal: Ophthalmol Sci ISSN: 2666-9145
Figure 1. Diagram showing the basic screening steps in use of the EyeScreen application.
Figure 2. Screenshots obtained by the authors showing the logo and user interface of the Android EyeScreen app (developed by Joshua Meyer at the University of Michigan) before photographs are taken.
Figure 3. Ophthalmology-facing user interface used to label images for network training.
Demographics of Participants by Testing and Training Dataset and Total Image Count
| | Normal Red Reflex: Training | Normal Red Reflex: Testing | Abnormal Red Reflex: Training | Abnormal Red Reflex: Testing | Total Images Captured (Eye Pairs) |
|---|---|---|---|---|---|
| Total no. | 944 | 236 | 222 | 55 | 4356 |
| Age (mos) | |||||
| Mean | 12.2 | 11.5 | 13.1 | 9.9 | |
| 0–2 (count) | 135 | 44 | 30 | 7 | 1300 |
| 3–12 (count) | 460 | 103 | 119 | 30 | 2300 |
| 13–19 (count) | 74 | 28 | 24 | 10 | 416 |
| 20–29 (count) | 26 | 5 | 7 | 1 | 244 |
| 30–89 (count) | 123 | 28 | 29 | 3 | 640 |
| Sex, no. (%) | |||||
| Female | 332 (35) | 79 (33) | 72 (32) | 25 (45) | |
| Male | 612 (65) | 157 (67) | 150 (68) | 30 (55) | |
Figure 4. Sample images of (A–E) normal red reflex and (F–J) abnormal pupil reflex in model training and testing.
Figure 5. Receiver operating characteristic (ROC) curve with area under the ROC curve (AUROC) for the EyeScreen testing dataset.
Figure 6. Precision-recall (PR) curve with area under the PR curve (AUPRC) for the EyeScreen testing dataset.
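The AUROC reported in Figure 5 can be interpreted as the probability that a randomly chosen abnormal eye receives a higher classifier score than a randomly chosen normal eye. A minimal, self-contained sketch of that computation via the Mann-Whitney rank statistic, on toy labels and scores (illustrative only, not the study's data):

```python
def auroc(labels, scores):
    """Area under the ROC curve via the Mann-Whitney statistic:
    the probability that a random positive case outscores a random
    negative case, with ties counted as half a win."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example (1 = abnormal red reflex); values are hypothetical.
y_true = [0, 0, 0, 1, 0, 1, 1]
y_score = [0.2, 0.4, 0.1, 0.8, 0.3, 0.6, 0.35]
print(f"AUROC = {auroc(y_true, y_score):.3f}")  # 11/12 ≈ 0.917
```

An AUROC of 0.5 corresponds to random guessing and 1.0 to perfect separation of normal from abnormal eyes.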
Confusion Matrix for Test Dataset
| Actual | Predicted Negative | Predicted Positive | Total |
|---|---|---|---|
| Negative | TN = 173 | FP = 63 | 236 |
| Positive | FN = 7 | TP = 48 | 55 |
| Total | 180 | 111 | 291 |
FN = false-negative; FP = false-positive; TN = true-negative; TP = true-positive.
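Standard screening metrics can be derived from the confusion-matrix counts above. A short sketch (derived values only; any metrics reported in the paper itself may differ depending on rounding or the operating threshold chosen):

```python
# Test-set confusion-matrix counts from the table above.
TN, FP, FN, TP = 173, 63, 7, 48

sensitivity = TP / (TP + FN)  # abnormal reflexes correctly flagged
specificity = TN / (TN + FP)  # normal reflexes correctly passed
precision = TP / (TP + FP)    # flagged eyes that are truly abnormal
accuracy = (TP + TN) / (TP + TN + FP + FN)

print(f"sensitivity={sensitivity:.3f} specificity={specificity:.3f} "
      f"precision={precision:.3f} accuracy={accuracy:.3f}")
# sensitivity=0.873 specificity=0.733 precision=0.432 accuracy=0.759
```

For a community screening tool, the relatively high sensitivity matters most: few abnormal eyes are missed, while false positives can be resolved at referral.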
Figure 7. Examples of incorrectly identified eyes in the EyeScreen test dataset.