| Literature DB >> 34945039 |
Alexandra Miere1, Olivia Zambrowski1, Arthur Kessler2, Carl-Joe Mehanna1, Carlotta Pallone1, Daniel Seknazi1, Paul Denys1, Francesca Amoroso1, Eric Petit3, Eric H Souied1.
Abstract
(1) Background: Recessive Stargardt disease (STGD1) and multifocal pattern dystrophy simulating Stargardt disease ("pseudo-Stargardt pattern dystrophy", PSPD) share phenotypic similarities, making clinical diagnosis difficult. Our aim was to assess whether a deep learning classifier pretrained on fundus autofluorescence (FAF) images can assist in distinguishing ABCA4-related STGD1 from PRPH2/RDS-related PSPD, and to compare its performance with that of retinal specialists.
Keywords: deep learning; fundus autofluorescence; inherited retinal diseases; retinal imaging
Year: 2021 PMID: 34945039 PMCID: PMC8708395 DOI: 10.3390/jcm10245742
Source DB: PubMed Journal: J Clin Med ISSN: 2077-0383 Impact factor: 4.241
The split of the dataset for each class consisted of a training set (60%), a validation set (10%) and a test set (30%). (n)—number of fundus autofluorescence (FAF) images in each dataset.
| | Training Set | Validation Set | Test Set | Total |
|---|---|---|---|---|
| Stargardt disease (STGD1) | 183 | 30 | 91 | 304 |
| Pseudo-Stargardt pattern dystrophy (PSPD) | 40 | 6 | 20 | 66 |
| Total | 223 | 36 | 111 | 370 |
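The 60%/10%/30% partition described above can be sketched in Python. The helper below (`split_dataset` is an illustrative name, not from the paper) rounds the test and validation sizes and assigns the remainder to training; under this convention the STGD1 row (183/30/91 of 304) is reproduced, while the exact PSPD counts depend on the rounding rule the authors used.

```python
import random

def split_dataset(items, seed=0):
    """Shuffle, then split into 60% train / 10% validation / 30% test.

    Test and validation sizes are rounded; training takes the remainder,
    so every item lands in exactly one set.
    """
    items = list(items)
    random.Random(seed).shuffle(items)
    n = len(items)
    n_test = round(0.3 * n)
    n_val = round(0.1 * n)
    test = items[:n_test]
    val = items[n_test:n_test + n_val]
    train = items[n_test + n_val:]
    return train, val, test

# Split each class separately, as in the paper (304 STGD1 images).
stgd1_train, stgd1_val, stgd1_test = split_dataset(range(304))
print(len(stgd1_train), len(stgd1_val), len(stgd1_test))  # 183 30 91
```

Splitting per class (rather than pooling both classes before splitting) keeps the class ratio roughly constant across the three sets.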
Figure 1. Illustration of the development of the deep learning model used. Fundus autofluorescence images were extracted from the Créteil database and used to train ResNet50V2. The pretrained network was used to classify genetically confirmed STGD1 and PSPD FAF images. The images were randomly partitioned into three sets: the training set (60% of the images), the validation set (10%), and the test set (30%). Data augmentation was applied to the training set to enlarge the original dataset and to reduce overfitting of the final model. The model was optimized with the Adam optimization algorithm and then evaluated on the test set of 111 images. The outputs were the performance metrics of the model (accuracy, sensitivity, specificity, precision, recall, F1-score) and integrated gradient visualizations.
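The evaluation metrics named in the caption all follow from the test-set confusion matrix. A minimal sketch (function name and the example counts are illustrative, not the paper's results), treating STGD1 as the positive class:

```python
def classification_metrics(tp, fp, tn, fn):
    """Standard binary-classification metrics from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    sensitivity = tp / (tp + fn)      # recall for the positive class
    specificity = tn / (tn + fp)
    precision = tp / (tp + fp)
    f1 = 2 * precision * sensitivity / (precision + sensitivity)
    return {"accuracy": accuracy, "sensitivity": sensitivity,
            "specificity": specificity, "precision": precision, "f1": f1}

# Hypothetical counts for a 111-image test set (illustration only).
m = classification_metrics(tp=80, fp=5, tn=15, fn=11)
```

Note that with an imbalanced test set (91 STGD1 vs. 20 PSPD images), accuracy alone can look high even if most PSPD images are misclassified, which is why sensitivity and specificity are reported separately.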
Demographic and genetic data.
| Patient | Age | Sex | Mutation |
|---|---|---|---|
| PSPD (PRPH2) | | | |
| #1 | 49 | M | c.639C > G (p.Cys213Trp) |
| #2 | 50 | F | c.639C > G (p.Cys213Trp) |
| #3 | 51 | F | c.623G > A (p.Gly208Asp) |
| #4 | 83 | M | c.461del (p.Lys154Argfs*102) |
| #5 | 54 | M | c.461del (p.Lys154Argfs*102) |
| #6 | 49 | M | c.628C > G (p.Pro210Ala) |
| #7 | 43 | M | NA |
| #8 | 39 | F | NA |
| #9 | 43 | F | NA |
| STGD1 (ABCA4) | | | |
| #1 | 50 | M | c.3259G > A (p.Glu1087Lys) |
| #2 | 36 | M | c.1749G > C (p.Lys583Asn) |
| #3 | 36 | F | c.1222C > T (p.Arg408*) |
| #4 | 30 | M | c.2966T > C (p.Val989Ala) |
| #5 | 71 | M | c.1648G > A (p.Gly550Arg) |
| #6 | 14 | M | c.4918C > T (p.Arg1640Trp) |
| #7 | 39 | F | c.2123T > C (p.Met708Thr) |
| #8 | 41 | F | c.3322C > T (p.Arg1108Cys) |
| #9 | 68 | F | c.1015T > G (p.Trp339Gly) |
| #10 | 56 | M | c.2966T > C (p.Val989Ala) |
| #11 | 25 | F | c.1018T > C (p.Tyr340His) |
| #12 | 25 | F | c.5018 + 2T > C (IVS35 + 2T > C) |
| #13 | 66 | F | c.4685T > C (p.Ile1562Thr) |
| #14 | 44 | M | c.452T > C (p.Ile151Thr) ‡ |
| #15 | 71 | M | c.1671T > A (p.Tyr557*) |
| #16 | 17 | M | c.3813G > C (p.Glu1271Asp) ‡ |
| #17 | 37 | M | c.5363C > T (p.Pro1788Leu) |
| #18 | 45 | F | c.5885G > A (p.Gly1961Glu) |
| #19 | 17 | M | c.1015T > G (p.Trp339Gly) |
| #20 | 63 | F | c.5603A > T (p.Asn1868Ile) |
| #21 | 50 | M | c.3113C > T (p.Ala1038Val) |
| #22 | 75 | F | c.455G > A (p.Arg152Gln) ‡ |
| #23 | 43 | M | c.514G > A (p.Gly172Ser) ‡ |
| #24 | 64 | F | c.1749G > C (p.Lys583Asn) |
| #25 | 73 | M | c.3916delinsGT (p.Pro1306Valfs*116) |
| #26 | 25 | M | c.1749G > C (p.Lys583Asn) |
| #27 | 43 | M | c.1749G > C (p.Lys583Asn) |
| #28 | 87 | F | c.735T > G (p.Tyr245*) |
| #29 | 38 | F | c.3813G > C (p.Glu1271Asp) |
| #30 | 70 | F | c.5363C > T (p.Pro1788Leu) |
| #31 | 37 | M | c.1749G > C (p.Lys583Asn) |
| #32 | 67 | M | c.5885G > A (p.Gly1961Glu) |
| #33 | 68 | F | c.769-784C > T (p.Leu257Aspfs*3) ‡ |
| #34 | 43 | M | c.4070C > T (p.Ala1357Val) |
| #35 | 51 | F | c.1804C > T (p.Arg602Trp) |
| #36 | 49 | F | c.1804C > T (p.Arg602Trp) |
| #37 | 25 | F | c.634C > T (p.Arg212Cys) |
| #38 | 55 | M | c.5315G > A (p.Trp1772*) |
| #39 | 20 | F | c.1018T > C (p.Tyr340His) |
| #40 | 54 | M | c.1018T > C (p.Tyr340His) |
NA: not available; ‡: variant.
Loss, accuracy, and AUROC on the training, validation, and test sets.
| ResNet50V2 | Loss | Accuracy | AUROC |
|---|---|---|---|
| Training set | 0.342 | 0.869 | 0.925 |
| Validation set | 0.6383 | 0.769 | 0.837 |
| Test set | 0.413 | 0.882 | 0.892 |
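The AUROC reported above is the probability that a randomly chosen positive image receives a higher classifier score than a randomly chosen negative one, so it can be computed from raw scores without fixing a decision threshold. A minimal pure-Python sketch (the function name and toy data are illustrative, not from the paper):

```python
def auroc(labels, scores):
    """Area under the ROC curve via the pairwise-ranking
    (Mann-Whitney) formulation.

    labels: 1 for the positive class, 0 for the negative class.
    scores: classifier scores, higher meaning "more likely positive".
    """
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    # Each positive/negative pair counts 1 if ranked correctly, 0.5 on ties.
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example: two positives, two negatives; 3 of 4 pairs ranked correctly.
print(auroc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8]))  # 0.75
```

Because AUROC is threshold-free, it is a useful complement to accuracy on this imbalanced test set (91 STGD1 vs. 20 PSPD images).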
Figure 2. Example of correct attribution and integrated gradient visualization. Upper and middle panels: Stargardt disease (STGD1) correctly classified. Lower panels: Pseudo-Stargardt pattern dystrophy (PSPD) correctly classified.
Figure 3. Example of incorrect attribution and integrated gradient visualization. Upper panels: Pseudo-Stargardt pattern dystrophy (PSPD), classified as Stargardt disease by the CNN model. Lower panels: Pseudo-Stargardt pattern dystrophy (PSPD), classified as Stargardt disease by the CNN model.
Retina specialists’ performance in distinguishing Stargardt disease (STGD1) from pseudo-Stargardt pattern dystrophy (PSPD) using FAF imaging alone.
| | Accuracy | Sensitivity (Recall) | Specificity |
|---|---|---|---|
| Retina expert | 0.816 | 0.790 | 0.801 |
| Retina fellow | 0.724 | 0.595 | 0.590 |