| Literature DB >> 36157332 |
Zhengfeng Lai1, Luca Cerny Oliveira1, Runlin Guo1, Wenda Xu1, Zin Hu2, Kelsey Mifflin2, Charles DeCarli2, Sen-Ching Cheung3, Chen-Nee Chuah1, Brittany N Dugger2.
Abstract
As neurodegenerative disease pathological hallmarks have been reported in both grey matter (GM) and white matter (WM) with different density distributions, automating the segmentation of GM/WM would be extremely advantageous for aiding neuropathologic deep phenotyping. Standard segmentation methods typically involve manual annotation, where a trained researcher traces the delineation of GM/WM in ultra-high-resolution Whole Slide Images (WSIs). This process can be time-consuming and subjective, preventing scalable analysis of pathology images. This paper proposes an automated segmentation pipeline (BrainSec) combining a Convolutional Neural Network (CNN) module for segmenting GM/WM regions and a post-processing module to remove tissue artifacts/residues. The final output generates XML annotations that can be visualized via Aperio ImageScope. First, we investigate two baseline models for medical image segmentation: FCN and U-Net. Then we propose a patch-based approach, BrainSec, to classify GM/WM/background regions. We demonstrate BrainSec is robust and performs reliably by testing it on over 180 WSIs spanning numerous unique cases as well as distinct neuroanatomic brain regions. We also apply gradient-weighted class activation mapping (Grad-CAM) to interpret the segmentation masks and provide relevant explanations and insights. In addition, we have integrated BrainSec with an existing Amyloid-β pathology classification model into a unified framework (without incurring significant computational complexity) to identify pathologies, visualize their distributions, and quantify each type of pathology in the segmented GM and WM regions, respectively.
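The patch-based strategy described in the abstract, classifying each patch of a WSI as background, GM, or WM and assembling the results into a segmentation mask, can be sketched as a sliding-window loop. The function names, patch size, and the intensity-threshold `toy_classifier` below are illustrative stand-ins for the paper's trained CNN, not the authors' implementation:

```python
import numpy as np

def segment_patches(wsi, patch_size, classify_patch):
    """Slide a non-overlapping window over the image and label each
    patch as 0 (background), 1 (GM), or 2 (WM) via classify_patch."""
    h, w = wsi.shape[:2]
    rows, cols = h // patch_size, w // patch_size
    mask = np.zeros((rows, cols), dtype=np.uint8)
    for i in range(rows):
        for j in range(cols):
            patch = wsi[i * patch_size:(i + 1) * patch_size,
                        j * patch_size:(j + 1) * patch_size]
            mask[i, j] = classify_patch(patch)
    return mask

def toy_classifier(patch):
    """Stand-in for a trained CNN: mean-intensity thresholds only."""
    m = patch.mean()
    if m < 50:
        return 0  # background
    if m < 150:
        return 1  # grey matter
    return 2      # white matter
```

In a real pipeline the per-patch classifier would be the trained network, and the coarse patch-level mask would then pass through the post-processing module to remove small tissue fragments.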
Keywords: Alzheimer’s disease; Neuropathology; convolutional neural network; dementia; machine learning; medical image analysis
Year: 2022 PMID: 36157332 PMCID: PMC9503016 DOI: 10.1109/access.2022.3171927
Source DB: PubMed Journal: IEEE Access ISSN: 2169-3536 Impact factor: 3.476
FIGURE 1. WSIs annotated by two trained personnel. (Green and yellow denote the two independent annotations.)
FIGURE 2. Sample artifacts in WSIs.
Summary of the three datasets used in the study.
| Dataset | Number of cases | | Brain Areas |
|---|---|---|---|
| 30-WSI | 30 | 40.00% | Temporal |
| 130-WSI | 130 | 10.00% | Temporal |
| 52-Heteroarea | 26 | 15.38% | Frontal, Parietal |
FIGURE 3. Segmentation pipeline and output of XML annotations. The red box indicates the tissue fragments that need to be removed by the post-processing module.
FIGURE 4. NCRF architecture.
FIGURE 5. Overview of a pathology quantification model and visualization framework from Tang et al. [12] (indicated by the yellow oval) integrated with BrainSec. Aβ in the figure refers to Amyloid-β.
FIGURE 7. Segmentation masks visualization: WSI-16 is an AD case (top panels) and WSI-30 is a NAD case (bottom panels). GM, WM, and background are indicated by cyan, yellow, and black, respectively. The red boxes display areas where U-Net struggled to correctly segment tissue. In the top WSI, U-Net has trouble differentiating between WM and GM; in the bottom WSI, it has trouble separating tissue from background.
Pixel-wise IoU comparison on the hold-out test set.
| Metric | FCN | U-Net | BrainSec | BrainSec+ |
|---|---|---|---|---|
| AD Back ± STD | 75.46 ± 11.0 | 96.79 ± 0.67 | 96.03 ± 2.31 | |
| AD GM ± STD | 71.22 ± 11.7 | 89.01 ± 4.92 | 91.10 ± 2.65 | |
| AD WM ± STD | 54.69 ± 16.0 | 79.42 ± 7.98 | 82.81 ± 4.44 | 84.97 ± 4.55 |
| AD ± STD | 67.12 ± 12.0 | 88.58 ± 4.24 | 90.23 ± 2.29 | 91.09 ± 2.91 |
| AD F_W ± STD | 70.48 ± 9.68 | 90.91 ± 2.52 | 91.92 ± | 92.45 ± 2.74 |
| NAD Back ± STD | 78.63 ± 3.01 | 97.15 ± 3.30 | 97.08 ± 1.42 | |
| NAD GM ± STD | 72.47 ± 2.91 | 93.75 ± 3.71 | 94.35 ± 1.97 | 94.24 ± 1.85 |
| NAD WM ± STD | 40.70 ± 13.3 | 78.68 ± 8.61 | 79.19 ± | |
| NAD ± STD | 63.93 ± 3.54 | 89.86 ± 3.15 | 90.34 ± 2.21 | |
| NAD F_W ± STD | 73.05 ± 2.65 | 94.31 ± 3.72 | 94.63 ± 2.40 | 95.11 ± 2.20 |
| Test Back ± STD | 76.73 ± 8.57 | 97.06 ± | 96.45 ± 1.95 | |
| Test GM ± STD | 71.72 ± 8.93 | 90.91 ± 4.90 | 92.40 ± 2.83 | |
| Test WM ± STD | 49.09 ± 15.9 | 79.12 ± 7.76 | 81.36 ± 5.42 | |
| Test ± STD | 65.85 ± 9.33 | 89.09 ± 3.71 | 90.27 ± 2.13 | 91.26 ± 1.99 |
| Test F_W ± STD | 71.51 ± 7.58 | 92.27 ± 3.35 | 93.00 ± 2.62 | 93.55 ± 2.68 |
Rows marked AD contain results on Alzheimer's disease slides in the hold-out test set. Rows marked NAD contain results on non-Alzheimer's disease slides in the hold-out test set. Rows marked Test contain results on all hold-out test WSIs. F_W is the frequency-weighted IoU score across the three categories.
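The F_W rows follow the standard frequency-weighted IoU definition: each class's IoU is weighted by that class's pixel frequency in the ground truth, then summed. A minimal sketch of this metric (function name is ours, not from the paper):

```python
import numpy as np

def freq_weighted_iou(pred, gt, num_classes=3):
    """Per-class IoU weighted by each class's pixel frequency
    in the ground-truth mask (frequency-weighted IoU)."""
    total = gt.size
    fw = 0.0
    for k in range(num_classes):
        p, g = pred == k, gt == k
        inter = np.logical_and(p, g).sum()
        union = np.logical_or(p, g).sum()
        if union == 0:
            continue  # class absent from both masks
        fw += (g.sum() / total) * (inter / union)
    return fw
```

Because background dominates a WSI's pixel count, F_W is pulled toward the (usually high) background IoU, which is consistent with the F_W rows exceeding the unweighted means in the table.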
Dice coefficient comparison on the hold-out test set.
| Region | FCN | U-Net | BrainSec | BrainSec+ |
|---|---|---|---|---|
| GM | 83.21 | 95.17 | 96.03 | 96.18 |
| WM | 66.08 | 88.15 | 89.63 | 90.25 |
F1-score comparison on the hold-out test set.
| Region | FCN | U-Net | BrainSec | BrainSec+ |
|---|---|---|---|---|
| GM | 85.43 | 95.24 | 96.21 | 96.99 |
| WM | 69.21 | 88.98 | 90.01 | 91.31 |
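For a single class treated as a binary mask, the Dice coefficient and the pixel-wise F1-score are the same quantity, 2·TP / (2·TP + FP + FN), which is why the two tables above track each other so closely. A minimal per-class sketch (illustrative, not the authors' evaluation code):

```python
import numpy as np

def dice_score(pred, gt, cls):
    """Dice coefficient for one class; on binary masks this equals
    the pixel-wise F1-score: 2*TP / (2*TP + FP + FN)."""
    p, g = pred == cls, gt == cls
    inter = np.logical_and(p, g).sum()  # TP
    denom = p.sum() + g.sum()           # (TP + FP) + (TP + FN)
    return 2.0 * inter / denom if denom else 1.0
```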
FIGURE 6. Training process comparison: training and validation trends of BrainSec/BrainSec+ versus U-Net.
FIGURE 8. With the integration of BrainSec into the plaque identification model [12], the distribution of plaques (denoted in orange in c-d) in GM/WM can be visualized and quantified.
FIGURE 9. Grad-CAM on selected patches from both AD and NAD cases.
FIGURE 10. Results from running BrainSec on Frontal (top) and Parietal (bottom) WSIs.