| Literature DB >> 35512608 |
Song Zhang1, Yangfan Zhou2, Dehua Tang3, Muhan Ni3, Jinyu Zheng4, Guifang Xu1, Chunyan Peng1, Shanshan Shen1, Qiang Zhan5, Xiaoyun Wang6, Duanmin Hu7, Wu-Jun Li8, Lei Wang9, Ying Lv10, Xiaoping Zou11.
Abstract
BACKGROUND: We aimed to develop a deep learning-based segmentation system for rapid on-site cytopathology evaluation (ROSE) to improve the diagnostic efficiency of endoscopic ultrasound-guided fine-needle aspiration (EUS-FNA) biopsy.
Keywords: Deep convolutional neural network; EUS-FNA; Pancreatic mass; Rapid on-site cytopathology evaluation
Year: 2022 PMID: 35512608 PMCID: PMC9079232 DOI: 10.1016/j.ebiom.2022.104022
Source DB: PubMed Journal: EBioMedicine ISSN: 2352-3964 Impact factor: 11.205
Baseline characteristics of the training, validation, and testing datasets.
| | Total | Training dataset | Validation dataset | Internal testing dataset | External testing dataset (WXPH) | External testing dataset (WXSPH) | External testing dataset (SAHSC) |
|---|---|---|---|---|---|---|---|
| Age (year), mean±sd | 62·1 ± 11·6 | 63·8 ± 10·2 | 59·5 ± 14·4 | 62·1 ± 12·5 | 60·0 ± 11·2 | 64·6 ± 10·5 | 59·9 ± 13·2 |
| Sex | | | | | | | |
| Male | 116 (59·8%) | 34 (51·5%) | 12 (75·0%) | 14 (51·9%) | 21 (70·0%) | 17 (68·0%) | 18 (60·0%) |
| Female | 78 (40·2%) | 32 (48·5%) | 4 (25·0%) | 13 (48·1%) | 9 (30·0%) | 8 (32·0%) | 12 (40·0%) |
| Size (cm), mean±sd | 3·3 ± 1·1 | 3·2 ± 1·1 | 3·3 ± 0·9 | 3·2 ± 1·1 | 3·3 ± 1·0 | 3·7 ± 1·7 | 3·2 ± 1·0 |
| Location | | | | | | | |
| Head | 114 (58·8%) | 41 (62·1%) | 6 (37·5%) | 16 (59·3%) | 19 (63·3%) | 13 (52·0%) | 19 (63·3%) |
| Body | 45 (23·2%) | 14 (21·2%) | 7 (43·8%) | 7 (25·9%) | 7 (23·3%) | 5 (20·0%) | 5 (16·7%) |
| Tail | 35 (18·0%) | 11 (16·7%) | 3 (18·8%) | 4 (14·8%) | 4 (13·3%) | 7 (28·0%) | 6 (20·0%) |
| Cytopathological diagnosis | | | | | | | |
| No tumors | 47 (24·2%) | 17 (25·8%) | 3 (18·8%) | 5 (18·5%) | 9 (30·0%) | 4 (16·0%) | 9 (30·0%) |
| Mild atypia | 23 (11·9%) | 7 (10·6%) | 2 (12·5%) | 3 (11·1%) | 4 (13·3%) | 4 (16·0%) | 3 (10·0%) |
| Cancer | 113 (58·2%) | 40 (60·6%) | 10 (62·5%) | 17 (63·0%) | 14 (46·7%) | 16 (64·0%) | 16 (53·3%) |
| Other tumors | 11 (5·7%) | 2 (3·0%) | 1 (6·2%) | 2 (7·4%) | 3 (10·0%) | 1 (4·0%) | 2 (6·7%) |
WXPH, Wuxi People's Hospital; WXSPH, Wuxi Second People's Hospital; SAHSC, The Second Affiliated Hospital of Soochow University.
No tumors included chronic pancreatitis (n = 37) and autoimmune pancreatitis (n = 10).
Other tumors included pancreatic neuroendocrine tumors (n = 7) and solid pseudopapillary tumors (n = 4).
Figure 1. The flow chart of the study design. (a) All the images of cytopathological slides with original resolutions were resized to a standard resolution (1024 × 1024 pixels). (b) The resized images were annotated with delineation along the margin of the cell clusters and addition of the corresponding labels on the delineated cell clusters. (c) A deep convolutional neural network model was developed using the training dataset, and optimal hyperparameters were selected with the validation dataset. (d) The performance of the established deep-learning system was evaluated using the internal and external testing datasets and then compared with those of endoscopists and cytopathologists. DCNN=Deep convolutional neural network. AI=Artificial intelligence.
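The preprocessing step in panel (a), resizing each slide image to a fixed 1024 × 1024 input, can be sketched with a simple nearest-neighbour resize. This is a hypothetical illustration only; the paper does not state which interpolation method was used, and the image is represented here as a plain list of pixel rows.

```python
def resize_nearest(image, out_h=1024, out_w=1024):
    """Resize a 2-D image (a list of pixel rows) to out_h x out_w
    by nearest-neighbour sampling of the source grid."""
    in_h, in_w = len(image), len(image[0])
    return [
        [image[(y * in_h) // out_h][(x * in_w) // out_w] for x in range(out_w)]
        for y in range(out_h)
    ]

# Example: upsample a 2x2 image to 4x4; each source pixel becomes a 2x2 block.
small = [[0, 1],
         [2, 3]]
big = resize_nearest(small, out_h=4, out_w=4)
```

In practice a library resize (e.g. with bilinear interpolation) would be used on RGB slide images; the sketch only shows the coordinate mapping from output grid to source grid.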
Figure 2. Visualization of DCNN performance in segmenting pancreatic cancer cell clusters and noncancer cell clusters. (a) Representative prediction results of the DCNN system for pancreatic cancer cell cluster segmentation. (b) Representative prediction results of the DCNN system for noncancer cell cluster segmentation. (c) Representative prediction results of the DCNN system for pancreatic cancer and noncancer cell cluster segmentation in a single visual field. DCNN=Deep convolutional neural network. IoU=Intersection over Union.
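The IoU metric reported in Figure 2 scores how well a predicted cell-cluster mask overlaps the annotated ground-truth mask. A minimal sketch, assuming binary masks flattened to 0/1 lists (the paper does not specify the exact implementation):

```python
def iou(pred, truth):
    """Intersection over Union for two binary masks given as flat 0/1 lists:
    |pred AND truth| / |pred OR truth|."""
    inter = sum(p & t for p, t in zip(pred, truth))
    union = sum(p | t for p, t in zip(pred, truth))
    return inter / union if union else 1.0  # convention: two empty masks match perfectly

# Example: masks agree on 2 pixels, union spans 4 pixels -> IoU = 0.5
score = iou([1, 1, 1, 0], [0, 1, 1, 1])  # 0.5
```

An IoU of 1.0 means the predicted cluster boundary coincides exactly with the annotation; segmentation results are often thresholded (e.g. IoU ≥ 0.5) to count a prediction as correct.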
Figure 3. DCNN performance in identifying pancreatic cancer on internal and external testing datasets. (a) Image-level receiver operating characteristic (ROC) curves for pancreatic cancer on the internal testing dataset. (b) Image-level ROC curves for pancreatic cancer on the external testing datasets. (c) Patient-level ROC curves for pancreatic cancer on the internal testing dataset. (d) Patient-level ROC curves for pancreatic cancer on the external testing datasets. Image-level ROC curves for pancreatic cancer in the subgroup analysis of the internal testing dataset according to age (e), sex (f), lesion location (g), and lesion size (h). DCNN=Deep convolutional neural network. AUC=Area under the receiver operating characteristic curve. NJDTH=Nanjing Drum Tower Hospital. WXPH=Wuxi People's Hospital. WXSPH=Wuxi Second People's Hospital. SAHSC=The Second Affiliated Hospital of Soochow University.
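Figure 3 reports both image-level and patient-level ROC curves. Patient-level scores are typically obtained by aggregating the per-image cancer probabilities of each patient; the max-pooling rule below is an assumption for illustration (the paper may use a different rule), paired with a rank-based AUC computed directly from its definition.

```python
from collections import defaultdict

def patient_scores(image_scores, rule=max):
    """Collapse per-image cancer probabilities into one score per patient.
    image_scores: list of (patient_id, probability) pairs.
    The aggregation rule (default: max over a patient's images) is an assumption."""
    by_patient = defaultdict(list)
    for pid, prob in image_scores:
        by_patient[pid].append(prob)
    return {pid: rule(probs) for pid, probs in by_patient.items()}

def auc(labels, scores):
    """AUC as the probability that a random positive outranks a random negative
    (ties count as half a win) -- equivalent to the area under the ROC curve."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Example: two images for patient "a", one for patient "b"
per_patient = patient_scores([("a", 0.2), ("a", 0.8), ("b", 0.3)])
```

In practice a library routine such as scikit-learn's `roc_auc_score` would be used; the pairwise form above is only to make the definition concrete.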
Figure 4. Comparison of the diagnostic performance of the DCNN system and doctors. (a) Comparison of the diagnostic performance of the DCNN system, trained endoscopists, and cytopathologists on human-machine competition dataset 1. (b) Comparison of the diagnostic performance of the DCNN system, trained endoscopists, and cytopathologists on human-machine competition dataset 2. DCNN=Deep convolutional neural network. AUC=Area under the receiver operating characteristic curve.