| Literature DB >> 31452990 |
Yanan Ruan1,2, Jie Xue1,3,2,4, Tianlai Li1, Danhua Liu1, Hua Lu1, Meirong Chen5, Tingting Liu6, Sijie Niu7,8, Dengwang Li1.
Abstract
As a function of spatial position in the optical coherence tomography (OCT) image, retinal layer thickness is an important diagnostic indicator for many retinal diseases. Reliable segmentation of the retinal layers is necessary for extracting useful clinical information. However, manual segmentation of these layers is time-consuming and prone to bias. Furthermore, speckle noise, low image contrast, retinal detachment, and irregular morphological features make automatic segmentation challenging. To alleviate these challenges, in this paper we propose a new coarse-to-fine framework combining a fully convolutional network (FCN) with a multiphase level set (named FCN-MLS) for automatic segmentation of nine boundaries in retinal spectral-domain OCT images. In the coarse stage, the FCN learns the characteristics of specific retinal layer boundaries and classifies four retinal layers. These boundaries are then extracted, and the remaining boundaries are initialized from a priori information about retinal layer thickness. To prevent the segmented interfaces from overlapping, a region restriction technique is applied in the multiphase level set as the boundaries evolve, yielding fine segmentation of the nine retinal layer boundaries. Experimental results on 1280 B-scans show that the proposed method segments the nine retinal boundaries accurately. Compared with manual delineation, the overall mean absolute boundary location difference and the overall mean absolute thickness difference were 5.88 ± 2.38 μm and 5.81 ± 2.19 μm, respectively, showing good consistency with the physicians' manual segmentation. The proposed method also outperforms state-of-the-art methods.
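The refinement stage described in the abstract — evolving initialized boundary curves under an image force while a region restriction keeps adjacent interfaces from crossing — can be illustrated with a minimal sketch. This is not the authors' FCN-MLS implementation; it is a simplified hypothetical version in which each boundary is a per-column curve attracted to strong vertical intensity gradients, and the ordering constraint stands in for the region restriction. The function name `evolve_boundaries` and all parameters are assumptions for illustration.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def evolve_boundaries(image, init_rows, n_iter=100, step=0.5):
    """Hypothetical sketch of ordered boundary evolution.

    image     : 2-D B-scan (rows = depth, columns = A-scans)
    init_rows : (n_boundaries, width) initial row position per column,
                e.g. from a prior on retinal layer thickness
    Returns the evolved row positions, top-to-bottom ordered.
    """
    # Smoothed vertical gradient magnitude acts as the external force field.
    grad = gaussian_filter(
        np.abs(np.gradient(image.astype(float), axis=0)), sigma=3.0)
    n_bounds, width = init_rows.shape
    cols = np.arange(width)
    rows = init_rows.astype(float).copy()
    for _ in range(n_iter):
        for b in range(n_bounds):
            r = np.clip(rows[b].round().astype(int), 1, image.shape[0] - 2)
            # Move each column toward the locally stronger gradient response.
            force = grad[r + 1, cols] - grad[r - 1, cols]
            rows[b] += step * np.sign(force)
        # Region restriction stand-in: enforce strict top-to-bottom
        # ordering so adjacent boundaries never overlap.
        for b in range(1, n_bounds):
            rows[b] = np.maximum(rows[b], rows[b - 1] + 1.0)
    return rows
```

In the paper's pipeline the coarse FCN output would supply four of the initial curves and thickness priors the rest; here both are folded into `init_rows`. The per-iteration ordering projection is the simplest way to realize the non-overlap constraint that the abstract attributes to the region restriction technique.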
Year: 2019 PMID: 31452990 PMCID: PMC6701532 DOI: 10.1364/BOE.10.003987
Source DB: PubMed Journal: Biomed Opt Express ISSN: 2156-7085 Impact factor: 3.732