
Segmentation of white matter hyperintensities on 18F-FDG PET/CT images with a generative adversarial network.

Kyeong Taek Oh1, Dongwoo Kim2, Byoung Seok Ye3, Sangwon Lee2, Mijin Yun4, Sun Kook Yoo5.   

Abstract

PURPOSE: White matter hyperintensities (WMH) are typically segmented using MRI because WMH are hardly visible on 18F-FDG PET/CT. This retrospective study was conducted to segment WMH and estimate their volumes from 18F-FDG PET with a generative adversarial network (WhyperGAN).
METHODS: We selected patients whose interval between MRI and FDG PET/CT scans was within 3 months, from January 2017 to December 2018, and classified them into mild, moderate, and severe groups by following the semiquantitative rating method of Fazekas. For each group, 50 patients were selected, and of them, we randomly selected 35 patients for training and 15 for testing. WMH were automatically segmented from FLAIR MRI with manual adjustment. Patches of WMH were extracted from 18F-FDG PET and segmented MRI. WhyperGAN was compared with H-DenseUnet, a deep learning method widely used for segmentation tasks, for segmentation performance based on the dice similarity coefficient (DSC), recall, and average volume differences (AVD). For volume estimation, the predicted WMH volumes from PET were compared with ground truth volumes.
RESULTS: The DSC values were associated with WMH volumes on MRI. For volumes >60 mL, the DSC values were 0.751 for WhyperGAN and 0.564 for H-DenseUnet. For volumes ≤60 mL, the DSC values decreased rapidly with decreasing volume (0.362 for WhyperGAN vs. 0.237 for H-DenseUnet). For recall, WhyperGAN achieved the highest value in the severe group (0.579 for WhyperGAN vs. 0.509 for H-DenseUnet). For AVD, WhyperGAN achieved the lowest score in the severe group (0.494 for WhyperGAN vs. 0.941 for H-DenseUnet). For WMH volume estimation, WhyperGAN performed better than H-DenseUnet and yielded excellent correlation coefficients (r = 0.998, 0.983, and 0.908 in the severe, moderate, and mild groups, respectively).
CONCLUSIONS: Although limited by visual analysis, the WhyperGAN-based method can be used to automatically segment WMH and estimate their volumes from 18F-FDG PET/CT. This would increase the usefulness of 18F-FDG PET/CT for the evaluation of WMH in patients with cognitive impairment.
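The abstract evaluates segmentation with the dice similarity coefficient (DSC), recall, and average volume difference (AVD). A minimal sketch of these standard metric definitions for binary masks is shown below; the function name and the exact AVD normalization are illustrative assumptions, since the paper's implementation is not given here.

```python
import numpy as np

def segmentation_metrics(pred, gt):
    """Illustrative DSC, recall, and absolute volume difference (AVD)
    for binary segmentation masks (assumed definitions, not the
    authors' published code)."""
    pred = np.asarray(pred, dtype=bool)
    gt = np.asarray(gt, dtype=bool)
    tp = np.logical_and(pred, gt).sum()           # true-positive voxels
    dsc = 2.0 * tp / (pred.sum() + gt.sum())      # overlap measure in [0, 1]
    recall = tp / gt.sum()                        # fraction of true WMH voxels detected
    avd = abs(int(pred.sum()) - int(gt.sum())) / gt.sum()  # relative volume error
    return dsc, recall, avd
```

In practice these voxel counts would be scaled by the voxel volume to report milliliters, as in the abstract's >60 mL / ≤60 mL stratification.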


Keywords:  18F-FDG PET/CT; Feasibility study; Generative adversarial network; Segmentation; White matter hyperintensities

Year:  2021        PMID: 33693968     DOI: 10.1007/s00259-021-05285-4

Source DB:  PubMed          Journal:  Eur J Nucl Med Mol Imaging        ISSN: 1619-7070            Impact factor:   9.236


  1 in total

1.  Fully convolutional networks for multi-modality isointense infant brain image segmentation.

Authors:  Dong Nie; Li Wang; Yaozong Gao; Dinggang Shen
Journal:  Proc IEEE Int Symp Biomed Imaging       Date:  2016
  3 in total

Review 1.  Applications of Generative Adversarial Networks (GANs) in Positron Emission Tomography (PET) imaging: A review.

Authors:  Ioannis D Apostolopoulos; Nikolaos D Papathanasiou; Dimitris J Apostolopoulos; George S Panayiotakis
Journal:  Eur J Nucl Med Mol Imaging       Date:  2022-04-22       Impact factor: 10.057

2.  Gross Tumor Volume Definition and Comparative Assessment for Esophageal Squamous Cell Carcinoma From 3D 18F-FDG PET/CT by Deep Learning-Based Method.

Authors:  Yaoting Yue; Nan Li; Husnain Shahid; Dongsheng Bi; Xin Liu; Shaoli Song; Dean Ta
Journal:  Front Oncol       Date:  2022-03-17       Impact factor: 6.244

3.  18F-FDG-PET correlates of aging and disease course in ALS as revealed by distinct PVC approaches.

Authors:  Pilar M Ferraro; Cristina Campi; Alberto Miceli; Claudia Rolla-Bigliani; Matteo Bauckneht; Lorenzo Gualco; Michele Piana; Cecilia Marini; Lucio Castellan; Silvia Morbelli; Claudia Caponnetto; Gianmario Sambuceti; Luca Roccatagliata
Journal:  Eur J Radiol Open       Date:  2022-01-13
