| Literature DB >> 35683590 |
Yibiao Rong ¹,², Zehua Jiang ³,⁴, Weihang Wu ¹,², Qifeng Chen ¹,², Chuliang Wei ¹,², Zhun Fan ¹,², Haoyu Chen ³,⁴.
Abstract
Automatic and accurate estimation of choroidal thickness plays an important role in computer-aided systems for eye diseases. The most common approaches to automatic estimation of choroidal thickness are segmentation-based methods, in which the boundaries of the choroid are first detected from optical coherence tomography (OCT) images and the choroidal thickness is then computed from the detected boundaries. A shortcoming of segmentation-based methods is that the estimation precision depends heavily on the segmentation results. To avoid this dependence on the segmentation step, in this paper we propose a direct method based on convolutional neural networks (CNNs) for estimating choroidal thickness without segmentation. Concretely, a B-scan image is first cropped into several patches. A trained CNN model is then used to estimate the choroidal thickness for each patch, and the mean choroidal thickness of the B-scan is obtained by averaging the per-patch estimates. A total of 150 OCT volumes were collected to evaluate the proposed method. The experiments show that the results obtained by the proposed method are competitive with those obtained by segmentation-based methods, which indicates that direct estimation of choroidal thickness is very promising.
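The direct pipeline described in the abstract (crop a B-scan into patches, predict a thickness per patch, then average) can be sketched as follows; the patch geometry, stride, and `model_predict` callable are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def estimate_mean_thickness(bscan, model_predict, patch_w=200, patch_h=200, stride=200):
    """Crop a B-scan into patches, predict choroidal thickness for each patch,
    and return the mean thickness over the whole B-scan (in pixels).

    bscan: 2D array (height x width).
    model_predict: callable mapping one patch to a scalar thickness estimate.
    """
    h, w = bscan.shape
    preds = []
    for x in range(0, w - patch_w + 1, stride):
        patch = bscan[:patch_h, x:x + patch_w]  # one strip of horizontally tiled patches
        preds.append(model_predict(patch))
    return float(np.mean(preds))
```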
Keywords: choroidal thickness; convolutional neural networks; direct estimation; optical coherence tomography
Year: 2022 PMID: 35683590 PMCID: PMC9181751 DOI: 10.3390/jcm11113203
Source DB: PubMed Journal: J Clin Med ISSN: 2077-0383 Impact factor: 4.964
Figure 1. The flowchart of the proposed method for estimating choroidal thickness. A B-scan image was first cropped into several patches. A trained CNN model was then used to estimate the choroidal thickness for each patch. The mean choroidal thickness of the B-scan was obtained by averaging the per-patch estimates. m and n are the width and height of a patch p, respectively.
Figure 2. The process for cropping images. (a) Sliding in the vertical direction; (b) sliding in the horizontal direction. m and n are the width and height of a patch p, respectively.
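The two-direction sliding-window cropping shown in Figure 2 can be sketched as below; the window size and strides here are illustrative placeholders, not the paper's settings:

```python
import numpy as np

def crop_patches(image, m, n, stride_x, stride_y):
    """Extract n x m (height x width) patches by sliding the window
    vertically and horizontally across a 2D image."""
    h, w = image.shape
    return [image[y:y + n, x:x + m]
            for y in range(0, h - n + 1, stride_y)   # vertical sliding
            for x in range(0, w - m + 1, stride_x)]  # horizontal sliding
```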
Figure 3. The network architecture applied in this work.
Figure 4. Examples of the predicted results obtained by the proposed method. Yellow parts are the results obtained by the proposed method; white parts are the ground truth; green curves are the boundaries of the choroid. (a) MAE = 0.5884, (b) MAE = 4.0118, (c) MAE = 17.1611, (d) MAE = 0.0307, (e) MAE = 8.4348, (f) MAE = 0.9919.
The obtained results at patch and B-scan levels with different patch sizes.
| Patch Size | Patch-Level MAE (Pixels) | Patch-Level ME (Pixels) | Patch-Level PCC | B-Scan-Level MAE (Pixels) | B-Scan-Level ME (Pixels) | B-Scan-Level PCC |
|---|---|---|---|---|---|---|
| 200 × 50 | 7.1197 ± 9.1915 | −1.7032 ± 11.5010 | 0.8644 | 4.5871 ± 6.4419 | −1.7032 ± 7.7228 | 0.9193 |
| 200 × 100 | 8.1371 ± 10.1308 | −1.3165 ± 12.9272 | 0.8225 | 5.6696 ± 7.4609 | −1.3165 ± 9.2779 | 0.8801 |
| 200 × 200 | 6.0435 ± 7.5231 | −1.3090 ± 9.5608 | 0.9041 | 4.3001 ± 5.8070 | −1.3090 ± 7.1064 | 0.9274 |
| 400 × 400 | 5.8868 ± 7.5539 | 0.5584 ± 9.5607 | 0.8901 | 4.6766 ± 6.3843 | 0.5584 ± 7.8944 | 0.8969 |
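The metrics reported in the table above (MAE, mean error, and Pearson correlation coefficient) can be computed from arrays of predicted and ground-truth thickness values, for example:

```python
import numpy as np

def thickness_metrics(pred, gt):
    """Return MAE, ME (signed mean error), and PCC, in pixels,
    between predicted and ground-truth choroidal thickness."""
    pred = np.asarray(pred, dtype=float)
    gt = np.asarray(gt, dtype=float)
    err = pred - gt
    mae = np.abs(err).mean()          # mean absolute error
    me = err.mean()                   # mean (signed) error
    pcc = np.corrcoef(pred, gt)[0, 1] # Pearson correlation coefficient
    return mae, me, pcc
```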
Figure 5. The scatterplots at the patch level: (a) the patch size is 200 × 50; (b) 200 × 100; (c) 200 × 200; (d) 400 × 400.
The percentage of patches in different MAE intervals.
| Patch Size | MAE ≤ 5 | 5 < MAE ≤ 10 | 10 < MAE ≤ 15 | MAE > 15 |
|---|---|---|---|---|
| 200 × 50 | 58.08% | 21.59% | 8.15% | 12.18% |
| 200 × 100 | 53.03% | 22.78% | 9.31% | 14.88% |
| 200 × 200 | 61.53% | 22.40% | 7.76% | 8.31% |
| 400 × 400 | 57.94% | 28.02% | 8.74% | 5.29% |
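The interval percentages in the table above can be reproduced from per-patch MAE values by simple bucketing, e.g.:

```python
import numpy as np

def mae_interval_percentages(maes):
    """Percentage of samples with MAE in (-inf, 5], (5, 10], (10, 15], (15, inf)."""
    maes = np.asarray(maes, dtype=float)
    masks = [maes <= 5,
             (maes > 5) & (maes <= 10),
             (maes > 10) & (maes <= 15),
             maes > 15]
    return [100.0 * m.mean() for m in masks]
```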
Figure 6. The scatterplots at the B-scan level: (a) the patch size is 200 × 50; (b) 200 × 100; (c) 200 × 200; (d) 400 × 400.
The percentage of B-scans in different MAE intervals.
| Patch Size | MAE ≤ 5 | 5 < MAE ≤ 10 | 10 < MAE ≤ 15 | MAE > 15 |
|---|---|---|---|---|
| 200 × 50 | 74.01% | 15.74% | 4.45% | 5.80% |
| 200 × 100 | 66.06% | 18.71% | 6.58% | 8.65% |
| 200 × 200 | 75.30% | 15.56% | 4.83% | 4.77% |
| 400 × 400 | 69.68% | 23.10% | 4.01% | 3.20% |
Comparison with segmentation-based methods.
| Methods | BM (Pixels) | CSI (Pixels) | MAE (Pixels) |
|---|---|---|---|
| Graph Cut | 2.4800 | 9.7900 | - |
| SCA-CENet | 1.8640 | 8.5310 | - |
| Bio-Net | 0.7700 | 4.3100 | 4.3000 |
| U-Net | 1.4500 | 5.8700 | 5.4800 |
| Proposed | - | - | 4.3001 |