Yu Liu, Fuhao Mu, Yu Shi, Juan Cheng, Chang Li, Xun Chen.
Abstract
Brain tumor segmentation in multimodal MRI volumes is of great significance to disease diagnosis, treatment planning, survival prediction, and other related tasks. However, most existing brain tumor segmentation methods fail to make sufficient use of multimodal information: the most common practice is to simply stack the original multimodal images or their low-level features as the model input, and many methods treat the data of every modality as equally important to a given segmentation target. In this paper, we introduce multimodal image fusion techniques, covering both pixel-level and feature-level fusion, into brain tumor segmentation, aiming at a more sufficient and finer-grained utilization of multimodal information. At the pixel level, we present a convolutional network named PIF-Net for 3D MR image fusion to enrich the input modalities of the segmentation model. The fused modalities strengthen the association among the different types of pathological information captured by the source modalities, yielding a modality-enhancement effect. At the feature level, we design an attention-based modality selection feature fusion (MSFF) module for multimodal feature refinement, which addresses the differences among modalities with respect to a given segmentation target. Based on these components and the popular V-Net model, we propose a two-stage brain tumor segmentation framework. Experiments are conducted on the BraTS 2019 and BraTS 2020 benchmarks. The results demonstrate that the proposed pixel-level and feature-level fusion components both effectively improve the segmentation accuracy of brain tumors.
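As a rough illustration of the two-stage design described in the abstract, the following NumPy sketch shows the difference between the two fusion levels: stage one combines source modalities at the pixel level into a fused pseudo-modality that enriches the segmentation input, while feature-level fusion happens later inside the network. All names, shapes, the choice of modality pair, and the constant weight are our own illustrative assumptions, not taken from the paper.

```python
import numpy as np

def pixel_level_fuse(m1, m2, w):
    # Stage 1 (pixel-level fusion): a voxel-wise weighted average of two
    # co-registered source modalities produces a fused pseudo-modality.
    return w * m1 + (1.0 - w) * m2

# Four BraTS modalities (T1, T1ce, T2, FLAIR) as toy 3D volumes.
rng = np.random.default_rng(0)
t1, t1ce, t2, flair = (rng.random((8, 8, 8)) for _ in range(4))

# E.g., fuse T1ce with FLAIR using a (here constant) weight map; in the
# paper the weight map is predicted per voxel by PIF-Net.
fused = pixel_level_fuse(t1ce, flair, w=0.5)

# Stage 2 input: original modalities stacked with the fused one (5 channels),
# so the segmentation model sees an enriched set of input modalities.
seg_input = np.stack([t1, t1ce, t2, flair, fused], axis=0)
assert seg_input.shape == (5, 8, 8, 8)
```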
Keywords: brain tumor segmentation; convolutional neural networks; feature-level fusion; medical image fusion; pixel-level fusion
Year: 2022 PMID: 36188482 PMCID: PMC9515796 DOI: 10.3389/fnins.2022.1000587
Source DB: PubMed Journal: Front Neurosci ISSN: 1662-453X Impact factor: 5.152
Figure 1. An example of multimodal MRI volumes for brain tumor segmentation. The green, red, and yellow regions in the ground truth indicate edema (ED), non-enhancing tumor and necrosis (NCR/NET), and enhancing tumor (ET), respectively.
Figure 2. The schematic diagram of the proposed brain tumor segmentation framework.
Figure 3. The architecture of our PIF-Net for 3D multimodal MR image fusion.
Detailed parameter configuration of the PIF-Net.
| Layer | K | S | P | I | O | A |
|---|---|---|---|---|---|---|
| Conv1 | 3 × 3 × 3 | 1 | 1 | 1 | 32 | ReLU |
| Conv2 | 3 × 3 × 3 | 1 | 1 | 1 | 32 | ReLU |
| Conv3-1 | 3 × 3 × 3 | 1 | 1 | 32 | 32 | ReLU |
| Conv3-2 | 3 × 3 × 3 | 1 | 1 | 32 | 32 | / |
| Addition | / | / | / | 32 | 32 | ReLU |
| Conv4-1 | 3 × 3 × 3 | 1 | 1 | 32 | 32 | ReLU |
| Conv4-2 | 3 × 3 × 3 | 1 | 1 | 32 | 32 | / |
| Addition | / | / | / | 32 | 32 | ReLU |
| Conv5-1 | 3 × 3 × 3 | 1 | 1 | 64 | 64 | ReLU |
| Conv5-2 | 3 × 3 × 3 | 1 | 1 | 64 | 64 | / |
| Addition | / | / | / | 64 | 64 | ReLU |
| Conv6 | 3 × 3 × 3 | 1 | 1 | 64 | 32 | / |
| Conv7 | 3 × 3 × 3 | 1 | 1 | 32 | 1 | / |
| Sigmoid | / | / | / | 1 | 1 | / |
| Weighted average | / | / | / | 1 | 1 | / |
K, S, P, I, O, and A denote the kernel size, stride, padding size, number of input channels, number of output channels, and activation operation, respectively.
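The parameter table above can be read as a two-branch 3D CNN with residual blocks that predicts a single-channel weight map for a voxel-wise weighted average. The PyTorch sketch below follows the table's kernel sizes, channel counts, and activations; the wiring between layers (one branch per source modality, concatenation before Conv5, and the `w * x1 + (1 - w) * x2` fusion rule) is our reconstruction from the table and Figure 3, and the class/variable names are our own.

```python
import torch
import torch.nn as nn

class ResBlock3d(nn.Module):
    """Two 3x3x3 convs with an identity shortcut (the Conv*-1 / Conv*-2 /
    Addition rows of the table: first conv ReLU, second conv linear,
    ReLU after the addition)."""
    def __init__(self, ch):
        super().__init__()
        self.conv1 = nn.Conv3d(ch, ch, 3, stride=1, padding=1)
        self.conv2 = nn.Conv3d(ch, ch, 3, stride=1, padding=1)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.relu(x + self.conv2(self.relu(self.conv1(x))))

class PIFNet(nn.Module):
    """Takes two single-channel 3D MR volumes and predicts a per-voxel
    fusion weight map; the fused image is a voxel-wise weighted average."""
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Sequential(nn.Conv3d(1, 32, 3, 1, 1), nn.ReLU(inplace=True))  # branch A
        self.conv2 = nn.Sequential(nn.Conv3d(1, 32, 3, 1, 1), nn.ReLU(inplace=True))  # branch B
        self.res_a = ResBlock3d(32)       # Conv3-1 / Conv3-2
        self.res_b = ResBlock3d(32)       # Conv4-1 / Conv4-2
        self.res_fused = ResBlock3d(64)   # Conv5-1 / Conv5-2, after channel concat
        self.conv6 = nn.Conv3d(64, 32, 3, 1, 1)  # no activation per the table
        self.conv7 = nn.Conv3d(32, 1, 3, 1, 1)   # 1-channel weight map

    def forward(self, x1, x2):
        fa = self.res_a(self.conv1(x1))
        fb = self.res_b(self.conv2(x2))
        f = self.res_fused(torch.cat([fa, fb], dim=1))   # 32 + 32 -> 64 channels
        w = torch.sigmoid(self.conv7(self.conv6(f)))     # Sigmoid row
        return w * x1 + (1.0 - w) * x2                   # "Weighted average" row
```

Because every convolution uses stride 1 and padding 1, the fused output has exactly the spatial shape of the inputs, as required for a pixel-level fused pseudo-modality.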
Figure 4. An example of fusion results obtained by different 3D medical image fusion methods.
Figure 5. The architecture of our MSFF module for multimodal feature refinement.
Figure 6. Impact of the parameters α and λ on the model performance.
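An attention-based modality-selection fusion step of the kind the MSFF module performs could look like the sketch below, assuming a squeeze-and-excitation-style channel attention that re-weights each modality's feature channels before a 1×1×1 projection. This is a plausible form only; the exact MSFF design should be taken from Figure 5, and the class name, `reduction` factor, and layer choices here are our own assumptions.

```python
import torch
import torch.nn as nn

class ModalitySelectionFusion(nn.Module):
    """Re-weights per-modality feature channels with learned attention,
    then fuses them into a single feature map (a hypothetical sketch)."""
    def __init__(self, n_modalities, ch, reduction=4):
        super().__init__()
        total = n_modalities * ch
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool3d(1),                        # squeeze: global context per channel
            nn.Conv3d(total, total // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv3d(total // reduction, total, 1),
            nn.Sigmoid(),                                   # excitation: per-channel weights in (0, 1)
        )
        self.proj = nn.Conv3d(total, ch, 1)                 # fuse re-weighted modality features

    def forward(self, feats):
        # feats: list of per-modality feature maps, each (B, ch, D, H, W)
        x = torch.cat(feats, dim=1)                         # stack along channels
        x = x * self.gate(x)                                # modality/channel re-weighting
        return self.proj(x)
```

The attention weights let the module emphasize the modalities most informative for a given segmentation target (e.g., a contrast-enhanced modality for the enhancing tumor) instead of treating all modalities equally.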
Objective evaluation results for the ablation study of the proposed method on the BraTS 2019 validation sets.
| Region | Metric | | | |
|---|---|---|---|---|
| WT | Dice | 0.8635 | 0.8771 | 0.8832 |
| WT | Hausdorff | 7.1211 | 7.7784 | 7.1654 |
| TC | Dice | 0.7788 | 0.8065 | 0.8045 |
| TC | Hausdorff | 15.7345 | 14.4599 | 10.8988 |
| ET | Dice | 0.7682 | 0.7698 | 0.7692 |
| ET | Hausdorff | 9.1385 | 6.4719 | 5.8548 |
| Average | Dice | 0.8035 | 0.8178 | 0.8190 |
| Average | Hausdorff | 10.6647 | 7.7587 | 9.3657 |
Bold values indicate the best-performing score on each metric (each row in the table) among all four models.
Objective evaluation results for the ablation study of the proposed method on the BraTS 2020 validation sets.
| Region | Metric | | | |
|---|---|---|---|---|
| WT | Dice | 0.8678 | 0.8725 | 0.8878 |
| WT | Hausdorff | 11.5732 | 9.6274 | 7.8896 |
| TC | Dice | 0.8025 | 0.8153 | 0.8139 |
| TC | Hausdorff | 11.6728 | 10.4340 | 10.9337 |
| ET | Dice | 0.7631 | 0.7730 | 0.7678 |
| ET | Hausdorff | 6.9469 | 5.9442 | 7.1674 |
| Average | Dice | 0.8111 | 0.8203 | 0.8232 |
| Average | Hausdorff | 10.0643 | 8.6685 | 8.6636 |
Bold values indicate the best-performing score on each metric (each row in the table) among all four models.
Figure 7. Examples of brain tumor segmentation results obtained by different methods in the ablation study. The green, red, and yellow regions indicate edema (ED), non-enhancing tumor and necrosis (NCR/NET), and enhancing tumor (ET), respectively.
Objective evaluation results of different brain tumor segmentation methods on the BraTS 2019 validation sets.
| Method | WT Dice | WT Hausdorff | TC Dice | TC Hausdorff | ET Dice | ET Hausdorff | Avg. Dice | Avg. Hausdorff |
|---|---|---|---|---|---|---|---|---|
| Xu et al. | 0.8930 | 6.9640 | 0.8070 | 7.6630 | 0.7590 | | 0.8197 | |
| Baid et al. | 0.8700 | 13.3600 | 0.7700 | 12.7100 | 0.7000 | 6.4500 | 0.7800 | 10.8400 |
| González et al. | 0.8882 | 8.1231 | 0.7833 | | 0.7231 | | 0.7982 | 6.8660 |
| Lorenzo et al. | 0.8904 | - | 0.7511 | - | 0.6634 | - | 0.7683 | - |
| Ahmad et al. | 0.8518 | 9.0083 | 0.7576 | 10.6744 | 0.6230 | 8.4683 | 0.7441 | 9.3837 |
| Abraham and Khan | 0.8605 | - | 0.7108 | - | 0.6323 | - | 0.7345 | - |
| Bhalerao and Thakur | 0.8527 | 8.0793 | 0.7091 | 9.5708 | 0.6668 | 7.2700 | 0.7429 | 8.3067 |
| Yan et al. | 0.8600 | 40.3100 | 0.7300 | 10.4000 | 0.6600 | 18.5300 | 0.7500 | 23.0800 |
| Iantsen et al. | 0.8700 | 8.3500 | 0.7900 | 9.5800 | 0.6700 | 7.8200 | 0.7767 | 8.5833 |
| Astaraki et al. | 0.8700 | | 0.8100 | | 0.7100 | 6.0200 | 0.7967 | |
| Cao et al. | | 7.5050 | 0.7875 | 9.2600 | | 6.9250 | 0.8221 | 7.8967 |
| Wang et al. | 0.8889 | 7.5990 | | 7.5840 | | 5.9080 | | 7.0303 |
| Valanarasu et al. | 0.8760 | 8.9420 | 0.7392 | 9.8930 | 0.7321 | 6.3230 | 0.7824 | 8.3860 |
| OURS | | | | 10.8988 | 0.7710 | 5.8548 | | 7.3675 |
Bold and underlined values indicate the best and second-best scores on each metric (each column in the table) among all the methods.
Objective evaluation results of different brain tumor segmentation methods on the BraTS 2020 validation sets.
| Method | WT Dice | WT Hausdorff | TC Dice | TC Hausdorff | ET Dice | ET Hausdorff | Avg. Dice | Avg. Hausdorff |
|---|---|---|---|---|---|---|---|---|
| Jun et al. | 0.8780 | 6.3000 | 0.7790 | 11.0200 | 0.7520 | 30.6500 | 0.8030 | 15.9900 |
| Liu et al. | 0.8823 | 6.4900 | 0.8012 | | 0.7637 | 21.3900 | 0.8157 | 11.5200 |
| Messaoudi et al. | 0.8413 | - | 0.6804 | - | 0.6537 | - | 0.7251 | - |
| Sun et al. | 0.8920 | - | 0.7880 | - | 0.7230 | - | 0.8010 | - |
| Cirillo et al. | 0.8926 | 6.3900 | 0.7919 | 14.0700 | 0.7504 | 36.0000 | 0.8116 | 18.8200 |
| Pang et al. | 0.8811 | 18.0901 | 0.7605 | 29.0570 | 0.7538 | 34.2391 | 0.7985 | 27.1287 |
| Sundaresan et al. | 0.8900 | | 0.7700 | 15.3000 | 0.7700 | 29.4000 | 0.8100 | 16.3667 |
| Ballestar and Vilaplana | 0.8300 | 12.3400 | 0.7700 | 13.1100 | 0.7200 | 37.4200 | 0.7733 | 20.9567 |
| McHugh et al. | 0.8810 | 6.7200 | 0.7890 | 10.2000 | 0.7120 | 40.6000 | 0.7940 | 19.1733 |
| Ma et al. | 0.8794 | - | 0.7731 | - | 0.7040 | - | 0.7855 | - |
| Cao et al. | | 7.8550 | 0.7760 | 14.5940 | | | 0.8196 | |
| Wang et al. | 0.8900 | 6.4690 | | 10.4680 | | 16.7160 | | 11.2177 |
| Zhang et al. | 0.8800 | 6.9500 | 0.7400 | 30.1800 | 0.7000 | 38.6000 | 0.7733 | 25.2433 |
| OURS | | | | | 0.7745 | | | |
Bold and underlined values indicate the best and second-best scores on each metric (each column in the table) among all the methods.