Huazhu Fu1, Fei Li2, Yanwu Xu3, Jingan Liao4, Jian Xiong2, Jianbing Shen1, Jiang Liu5,6, Xiulan Zhang2.
Abstract
Purpose: Optic disc (OD) and optic cup (OC) segmentation are fundamental for fundus image analysis. Manual annotation is time-consuming, expensive, and highly subjective, whereas an automated system is invaluable to the medical community. The aim of this study was to develop a deep learning system to segment the OD and OC in fundus photographs, and to evaluate how the algorithm compares against manual annotations.
Keywords: artificial intelligence; optic cup; optic disc; segmentation
Year: 2020 PMID: 32832206 PMCID: PMC7414704 DOI: 10.1167/tvst.9.2.33
Source DB: PubMed Journal: Transl Vis Sci Technol ISSN: 2164-2591 Impact factor: 3.283
Figure 1. Deep learning system for automated segmentation in fundus images. The algorithm included two stages: optic disc (OD) region detection, and OD and optic cup (OC) segmentation. For a given fundus image (a), the U-Net network (b) was used to detect the OD region (c). With the cropped OD region (d), a polar transformation mapped the image into polar coordinates (e). A multilabel network (f) segmented the OD and OC jointly, and an inverse transformation returned the output map (g) to the original coordinates (h).
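The pipeline in Figure 1 hinges on the polar transformation applied to the cropped OD region (step e) and its inverse (step g). As a minimal sketch, assuming nearest-neighbour sampling about the crop centre (the paper's grid sizes and interpolation scheme are not restated in this entry, so `n_r` and `n_theta` here are illustrative assumptions):

```python
import numpy as np

def to_polar(img, n_r=None, n_theta=360):
    """Map a square image to polar coordinates (rows = radius, cols = angle)
    about the image centre, using nearest-neighbour sampling."""
    h, w = img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    r_max = min(cy, cx)
    n_r = n_r or int(r_max) + 1
    rs = np.linspace(0.0, r_max, n_r)
    thetas = np.linspace(0.0, 2 * np.pi, n_theta, endpoint=False)
    R, T = np.meshgrid(rs, thetas, indexing="ij")
    ys = np.clip(np.round(cy + R * np.sin(T)).astype(int), 0, h - 1)
    xs = np.clip(np.round(cx + R * np.cos(T)).astype(int), 0, w - 1)
    return img[ys, xs]

def to_cartesian(polar, size):
    """Inverse nearest-neighbour mapping back to a size x size image."""
    n_r, n_theta = polar.shape
    cy = cx = (size - 1) / 2.0
    r_max = min(cy, cx)
    ys, xs = np.indices((size, size))
    dy, dx = ys - cy, xs - cx
    r = np.sqrt(dy ** 2 + dx ** 2)
    theta = np.mod(np.arctan2(dy, dx), 2 * np.pi)
    ri = np.clip(np.round(r / r_max * (n_r - 1)).astype(int), 0, n_r - 1)
    ti = np.round(theta / (2 * np.pi) * n_theta).astype(int) % n_theta
    return polar[ri, ti]
```

In polar coordinates the roughly circular OD/OC boundaries become near-horizontal contours, which is the usual motivation for this representation before segmentation.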
Segmentation Performance of Ophthalmologists and the Algorithm on the Test Set
| | OD Dice (SD, 95% CI) | OC Dice (SD, 95% CI) | CDR MAE (SD, 95% CI) |
|---|---|---|---|
| Glaucoma data (40 images) | |||
| Ophthalmologist 1 | 0.963 (0.026, 0.957-0.970) | 0.880 (0.090, 0.857-0.904) | 0.057 (0.055, 0.043-0.071) |
| Ophthalmologist 2 | 0.945 (0.034, 0.936-0.954) | 0.867 (0.090, 0.843-0.890) | 0.057 (0.045, 0.045-0.069) |
| Ophthalmologist 3 | 0.944 (0.031, 0.936-0.952) | 0.888 (0.072, 0.869-0.907) | 0.039 (0.035, 0.030-0.048) |
| Ophthalmologist 4 | 0.953 (0.027, 0.946-0.960) | 0.884 (0.071, 0.865-0.902) | 0.052 (0.040, 0.042-0.063) |
| Ophthalmologist 5 | 0.947 (0.030, 0.939-0.955) | 0.865 (0.089, 0.842-0.889) | 0.055 (0.054, 0.041-0.069) |
| Ophthalmologist 6 | 0.954 (0.030, 0.946-0.962) | 0.904 (0.064, 0.887-0.921) | 0.048 (0.042, 0.037-0.059) |
| Ophthalmologist 7 | 0.954 (0.030, 0.947-0.962) | 0.719 (0.138, 0.683-0.755) | 0.149 (0.091, 0.125-0.173) |
| Algorithm | 0.941 (0.057, 0.926-0.956) | 0.864 (0.089, 0.841-0.887) | 0.065 (0.056, 0.051-0.080) |
| Nonglaucoma data (360 images) | |||
| Ophthalmologist 1 | 0.956 (0.028, 0.953-0.958) | 0.686 (0.107, 0.677-0.695) | 0.157 (0.071, 0.151-0.163) |
| Ophthalmologist 2 | 0.926 (0.047, 0.922-0.931) | 0.842 (0.086, 0.834-0.849) | 0.053 (0.040, 0.049-0.056) |
| Ophthalmologist 3 | 0.922 (0.040, 0.918-0.925) | 0.837 (0.078, 0.831-0.844) | 0.041 (0.032, 0.038-0.044) |
| Ophthalmologist 4 | 0.949 (0.033, 0.946-0.952) | 0.801 (0.117, 0.791-0.811) | 0.056 (0.043, 0.053-0.060) |
| Ophthalmologist 5 | 0.945 (0.035, 0.942-0.948) | 0.870 (0.082, 0.863-0.877) | 0.041 (0.036, 0.038-0.044) |
| Ophthalmologist 6 | 0.953 (0.034, 0.950-0.956) | 0.903 (0.064, 0.898-0.909) | 0.034 (0.030, 0.031-0.037) |
| Ophthalmologist 7 | 0.955 (0.032, 0.952-0.958) | 0.664 (0.138, 0.652-0.676) | 0.130 (0.063, 0.125-0.136) |
| Algorithm | 0.937 (0.040, 0.934-0.941) | 0.794 (0.096, 0.786-0.803) | 0.079 (0.050, 0.074-0.083) |
| All data (400 images) | |||
| Ophthalmologist 1 | 0.956 (0.028, 0.954-0.959) | 0.705 (0.121, 0.695-0.715) | 0.147 (0.075, 0.141-0.153) |
| Ophthalmologist 2 | 0.928 (0.046, 0.925-0.932) | 0.844 (0.086, 0.837-0.851) | 0.053 (0.040, 0.050-0.056) |
| Ophthalmologist 3 | 0.924 (0.039, 0.921-0.927) | 0.843 (0.079, 0.836-0.849) | 0.041 (0.032, 0.038-0.044) |
| Ophthalmologist 4 | 0.949 (0.032, 0.947-0.952) | 0.809 (0.116, 0.800-0.819) | 0.056 (0.042, 0.052-0.059) |
| Ophthalmologist 5 | 0.945 (0.035, 0.942-0.948) | 0.870 (0.082, 0.863-0.876) | 0.043 (0.038, 0.040-0.046) |
| Ophthalmologist 6 | 0.953 (0.034, 0.950-0.956) | 0.903 (0.064, 0.898-0.909) | 0.035 (0.032, 0.033-0.038) |
| Ophthalmologist 7 | 0.955 (0.031, 0.952-0.957) | 0.670 (0.138, 0.658-0.681) | 0.132 (0.066, 0.127-0.138) |
| Algorithm | 0.938 (0.041, 0.934-0.941) | 0.801 (0.097, 0.793-0.809) | 0.077 (0.051, 0.073-0.082) |
OD, optic disc; OC, optic cup; CDR, cup-to-disc ratio; MAE, mean absolute error; CI, confidence interval.
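The Dice scores and CDR mean absolute errors in the table above are computable from binary masks. A minimal sketch, assuming the common vertical-diameter convention for CDR (the paper's exact measurement protocol is not restated in this entry):

```python
import numpy as np

def dice(a, b):
    """Dice overlap between two binary masks: 2|A∩B| / (|A|+|B|)."""
    a, b = a.astype(bool), b.astype(bool)
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

def vertical_cdr(cup, disc):
    """Cup-to-disc ratio as the ratio of vertical diameters
    (rows = vertical axis) of the binary cup and disc masks."""
    def vdiam(mask):
        rows = np.where(mask.any(axis=1))[0]
        return 0 if rows.size == 0 else rows[-1] - rows[0] + 1
    return vdiam(cup) / vdiam(disc)
```

The CDR MAE column would then be the mean of `abs(vertical_cdr(pred_cup, pred_disc) - vertical_cdr(ref_cup, ref_disc))` over the test images.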
Figure 2. Dice scores of inter-rater agreement for seven ophthalmologists on the test set for (A) the optic disc and (B) the optic cup.
Figure 3. Boxplots of the cup-to-disc ratio (CDR) calculated from segmentation results on the test set.
Figure 4. Visual segmentation results. The segmented optic disc and optic cup regions are labeled in red and yellow, respectively. (a, b) Glaucoma cases, (c, d) nonglaucoma cases, and (e, f) failure cases.
Figure 5. (A) The average receiver operating characteristic (ROC) curves for glaucoma diagnosis based on the cup-to-disc ratio (CDR) on the test set. (B) The precision-recall curves for glaucoma diagnosis based on the CDR on the test set.
Diagnostic Performance of Experts and the Algorithm on the Test Set
| | AUC (95% CI) | P Value | Sensitivity (95% CI) | Specificity (95% CI) | Precision (95% CI) |
|---|---|---|---|---|---|
| Reference label | 0.946 (0.911-0.974) | < 0.0001 | 0.875 (0.811-0.927) | 0.867 (0.819-0.923) | 0.700 (0.588-0.791) |
| Ophthalmologist 1 | 0.922 (0.884-0.955) | < 0.0001 | 0.875 (0.800-0.897) | 0.858 (0.812-0.887) | 0.585 (0.464-0.694) |
| Ophthalmologist 2 | 0.956 (0.933-0.975) | < 0.0001 | 0.875 (0.826-0.930) | 0.869 (0.834-0.928) | 0.700 (0.577-0.786) |
| Ophthalmologist 3 | 0.935 (0.902-0.964) | < 0.0001 | 0.850 (0.789-0.913) | 0.850 (0.795-0.908) | 0.650 (0.556-0.771) |
| Ophthalmologist 4 | 0.916 (0.871-0.954) | < 0.0001 | 0.850 (0.791-0.903) | 0.847 (0.796-0.898) | 0.575 (0.438-0.667) |
| Ophthalmologist 5 | 0.911 (0.866-0.951) | < 0.0001 | 0.850 (0.756-0.892) | 0.856 (0.755-0.886) | 0.625 (0.525-0.755) |
| Ophthalmologist 6 | 0.947 (0.916-0.972) | < 0.0001 | 0.900 (0.824-0.930) | 0.881 (0.832-0.922) | 0.659 (0.556-0.773) |
| Ophthalmologist 7 | 0.916 (0.876-0.951) | < 0.0001 | 0.825 (0.765-0.882) | 0.825 (0.772-0.878) | 0.625 (0.469-0.720) |
| Algorithm | 0.948 (0.920-0.973) | – | 0.850 (0.794-0.923) | 0.853 (0.798-0.918) | 0.700 (0.600-0.800) |
AUC, area under the receiver operating characteristic curve; CI, confidence interval.
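The diagnostic metrics above follow from thresholding a CDR-based score. A minimal sketch of AUC (via the pairwise Mann-Whitney formulation, equivalent to the area under the ROC curve) and sensitivity/specificity at a fixed cutoff; the cutoff value and toy data here are illustrative assumptions, not the paper's operating point:

```python
def roc_auc(labels, scores):
    """AUC as the probability that a random positive outscores a
    random negative (ties count half) -- the Mann-Whitney statistic."""
    pos = [s for l, s in zip(labels, scores) if l]
    neg = [s for l, s in zip(labels, scores) if not l]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def sens_spec(labels, scores, thresh):
    """Sensitivity and specificity when predicting positive for score >= thresh."""
    tp = sum(l and s >= thresh for l, s in zip(labels, scores))
    fn = sum(l and s < thresh for l, s in zip(labels, scores))
    tn = sum((not l) and s < thresh for l, s in zip(labels, scores))
    fp = sum((not l) and s >= thresh for l, s in zip(labels, scores))
    return tp / (tp + fn), tn / (tn + fp)
```

Sweeping `thresh` over the observed CDR values and plotting sensitivity against 1 - specificity yields the ROC curves of Figure 5A.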