Jing Fang, Xiaole Ma, Jingjing Wang, Kai Qin, Shaohai Hu, Yuefeng Zhao.
Abstract
The unavoidable noise often present in synthetic aperture radar (SAR) images, such as speckle noise, negatively impacts the subsequent processing of SAR images. Further, it is not easy to find an appropriate application for SAR images, given that the human visual system is sensitive to color and SAR images are gray. As a result, a noisy SAR image fusion method based on nonlocal matching and generative adversarial networks is presented in this paper. A nonlocal matching method is applied to processing source images into similar block groups in the pre-processing step. Then, adversarial networks are employed to generate a final noise-free fused SAR image block, where the generator aims to generate a noise-free SAR image block with color information, and the discriminator tries to increase the spatial resolution of the generated image block. This step ensures that the fused image block contains high resolution and color information at the same time. Finally, a fused image can be obtained by aggregating all the image blocks. By extensive comparative experiments on the SEN1-2 datasets and source images, it can be found that the proposed method not only has better fusion results but is also robust to image noise, indicating the superiority of the proposed noisy SAR image fusion method over the state-of-the-art methods.Entities:
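The pre-processing step described in the abstract, grouping each reference block with its most similar blocks, can be sketched as follows. The patch size, stride, number of matches, and the Euclidean distance criterion are illustrative assumptions, not the paper's exact settings:

```python
import numpy as np

def nonlocal_match(image, ref_xy, patch=8, stride=4, k=4):
    """Group the k patches most similar to a reference patch.

    Similarity here is plain Euclidean distance between patches, a
    common choice for nonlocal matching; the paper's exact criterion
    and search window may differ.
    """
    h, w = image.shape
    ry, rx = ref_xy
    ref = image[ry:ry + patch, rx:rx + patch]
    candidates = []
    # Exhaustive search over a regular grid of candidate blocks.
    for y in range(0, h - patch + 1, stride):
        for x in range(0, w - patch + 1, stride):
            blk = image[y:y + patch, x:x + patch]
            dist = np.sum((blk - ref) ** 2)
            candidates.append((dist, y, x))
    candidates.sort(key=lambda t: t[0])
    # Stack the k best matches into one similar-block group.
    group = np.stack([image[y:y + patch, x:x + patch]
                      for _, y, x in candidates[:k]])
    return group  # shape (k, patch, patch)
```

Because the reference patch itself lies on the search grid when its coordinates align with the stride, it appears first in the group with distance zero.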
Keywords: generative adversarial networks; image fusion; nonlocal matching
Year: 2021 PMID: 33808436 PMCID: PMC8067251 DOI: 10.3390/e23040410
Source DB: PubMed Journal: Entropy (Basel) ISSN: 1099-4300 Impact factor: 2.524
Figure 1. The network of the proposed method.
Details of the generator and discriminator.
| Network | Block | Layer | Filter | Normalization | Activation |
|---|---|---|---|---|---|
| Generator | Encoder | En_1 | 5×5 Conv (n64) | BN | Leaky ReLU |
| | | En_2 | 3×3 Conv (n128) | BN | Leaky ReLU |
| | | En_3–En_5 | 3×3 Conv (n256) | BN | Leaky ReLU |
| | Decoder | De_1 | 3×3 Conv (n256) | BN | Leaky ReLU |
| | | De_2 | 3×3 Conv (n128) | BN | Leaky ReLU |
| | | De_3 | 5×5 Conv (n1) | – | Sigmoid |
| Discriminator | | D_1 | 3×3 Conv (n64) | BN | Leaky ReLU |
| | | D_2 | 3×3 Conv (n128) | BN | Leaky ReLU |
| | | D_3 | 3×3 Conv (n256) | BN | Leaky ReLU |
| | | D_4 | 3×3 Conv (n1) | – | Sigmoid |
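As a rough illustration, the generator's layer specifications from the table above can be transcribed into code and used, for example, to count convolutional parameters. The stride, padding, and single-channel input are assumptions, since the excerpt does not specify them:

```python
# Generator layers transcribed from the architecture table.
# Tuples: (name, kernel size, output channels, normalization, activation).
GENERATOR = [
    ("En_1", 5, 64,  "BN", "leaky_relu"),
    ("En_2", 3, 128, "BN", "leaky_relu"),
    ("En_3", 3, 256, "BN", "leaky_relu"),
    ("En_4", 3, 256, "BN", "leaky_relu"),
    ("En_5", 3, 256, "BN", "leaky_relu"),
    ("De_1", 3, 256, "BN", "leaky_relu"),
    ("De_2", 3, 128, "BN", "leaky_relu"),
    ("De_3", 5, 1,   None, "sigmoid"),
]

def param_count(layers, in_ch=1):
    """Count conv weights + biases, chaining output to input channels."""
    total = 0
    for _name, k, out_ch, _norm, _act in layers:
        total += k * k * in_ch * out_ch + out_ch  # weights + bias
        in_ch = out_ch
    return total
```

With a single-channel SAR block as input, the generator above would have roughly 2.44 million convolutional parameters, dominated by the 3×3, 256-channel layers.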
Figure 2. Fusion results of the proposed method on the SEN1-2 datasets. First column: noisy synthetic aperture radar (SAR) images; second column: optical images; third column: fused images. (a–c) Group 1; (d–f) Group 2; (g–i) Group 3; (j–l) Group 4.
Figure 3. Source images from SEN1-2 used for comparison: the first column contains noisy SAR images, the second column contains de-noised SAR images, and the third column contains optical images. (a) Group 1; (b) Group 2; (c) Group 3; (d) Group 4.
Figure 4. The fused images of Group 1 in Figure 3: (a) guided filtering (GFF); (b) sparse representation (SR); (c) wavelet-based image fusion (DWT); (d) convolutional neural network (CNN); (e) multi-scale weighted gradient-based fusion (MWGF); (f) multi-scale transform and sparse representation (MST-SR); (g) nonsubsampled shearlet transform (NSST); (h) generative adversarial network (GAN); (i) the proposed method.
Figure 5. The fused images of Group 2 in Figure 3: (a) GFF; (b) SR; (c) DWT; (d) CNN; (e) MWGF; (f) MST-SR; (g) NSST; (h) GAN; (i) the proposed method.
Figure 6. The fused images of Group 3 in Figure 3: (a) GFF; (b) SR; (c) DWT; (d) CNN; (e) MWGF; (f) MST-SR; (g) NSST; (h) GAN; (i) the proposed method.
Figure 7. The fused images of Group 4 in Figure 3: (a) GFF; (b) SR; (c) DWT; (d) CNN; (e) MWGF; (f) MST-SR; (g) NSST; (h) GAN; (i) the proposed method.
Figure 8. Objective metrics of the fused images in Figure 3: (a) entropy (EN); (b) average gradient (AVG); (c) spatial frequency (SF); (d) mutual information (MI).
Objective metrics (mean ± standard deviation) of generalization on 10 test images from the SEN1-2 datasets.
| Method | EN | AVG | SF | MI |
|---|---|---|---|---|
| GFF | 7.2607 ± 0.0132 | 10.0256 ± 0.0636 | 24.6531 ± 0.2501 | 4.8751 ± 0.0082 |
| SR | 7.2251 ± 0.0161 | 9.6989 ± 0.1087 | 23.6590 ± 0.4291 | 6.8254 ± 0.0147 |
| DWT | 7.2426 ± 0.0114 | 10.1984 ± 0.0332 | 25.2251 ± 0.2619 | 6.3567 ± 0.0258 |
| CNN | 7.2675 ± 0.0503 | 9.9878 ± 0.1401 | 23.2157 ± 0.4069 | 4.6531 ± 0.0074 |
| MWGF | 7.2475 ± 0.0335 | 10.1538 ± 0.0408 | 25.3621 ± 0.2585 | 6.4256 ± 0.0361 |
| MST-SR | 7.2659 ± 0.0354 | 10.1596 ± 0.0395 | 25.0697 ± 0.2604 | 6.3751 ± 0.0292 |
| NSST | 7.3105 ± 0.1206 | 9.5635 ± 0.1537 | 23.2758 ± 0.4313 | 4.9253 ± 0.0102 |
| GAN | 7.2159 ± 0.0802 | 10.3756 ± 0.0819 | 25.5327 ± 0.3608 | 4.7754 ± 0.0146 |
| Proposed | 7.4225 ± 0.0205 | 10.8597 ± 0.0611 | 26.4568 ± 0.2503 | 7.5754 ± 0.0319 |
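The four metrics reported in these tables can be computed with common textbook definitions, sketched below with NumPy. The authors' exact formulations (e.g. histogram bin counts, gradient operators) may differ slightly:

```python
import numpy as np

def entropy(img, bins=256):
    """EN: Shannon entropy of the grey-level histogram, in bits."""
    p, _ = np.histogram(img, bins=bins, range=(0, 256), density=True)
    p = p[p > 0]
    p = p / p.sum()
    return -np.sum(p * np.log2(p))

def average_gradient(img):
    """AVG: mean magnitude of horizontal/vertical intensity gradients."""
    f = img.astype(float)
    gx = np.diff(f, axis=1)[:-1, :]   # horizontal differences
    gy = np.diff(f, axis=0)[:, :-1]   # vertical differences
    return np.mean(np.sqrt((gx ** 2 + gy ** 2) / 2))

def spatial_frequency(img):
    """SF: RMS combination of row and column frequencies."""
    f = img.astype(float)
    rf = np.sqrt(np.mean(np.diff(f, axis=1) ** 2))  # row frequency
    cf = np.sqrt(np.mean(np.diff(f, axis=0) ** 2))  # column frequency
    return np.sqrt(rf ** 2 + cf ** 2)

def mutual_information(a, b, bins=64):
    """MI between two images, estimated from their joint histogram."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0  # avoid log(0); nz implies px > 0 and py > 0
    return np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz]))
```

For all four metrics, higher values indicate a more informative or sharper fused image, which is why the "Proposed" row leading each column supports the paper's claim.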
Figure 9. Oslo city: (a) SAR image; (b) optical image.
Figure 10. The fused images of Figure 9: (a) GFF; (b) SR; (c) DWT; (d) CNN; (e) MWGF; (f) MST-SR; (g) NSST; (h) GAN; (i) the proposed method.
Objective metrics and runtimes of the fused images in Figure 10.
| Method | EN | AVG | SF | MI | Time (s) |
|---|---|---|---|---|---|
| GFF | 7.1684 | 10.9946 | 25.8843 | 1.1225 | 0.864875 |
| SR | 7.3631 | 12.3016 | 30.3622 | 3.4416 | 77.549845 |
| DWT | 7.3449 | 12.1866 | 30.2316 | 3.8044 | 30.458764 |
| CNN | 7.3566 | 13.4386 | 32.2077 | 1.4334 | 141.987512 |
| MWGF | 7.4543 | 12.6963 | 31.2014 | 6.3148 | 3.648574 |
| MST-SR | 7.4561 | 12.7560 | 31.3776 | 6.6831 | 71.457981 |
| NSST | 7.4293 | 13.2218 | 31.7040 | 2.1016 | 4.987545 |
| GAN | 7.3815 | 13.8934 | 32.0352 | 1.5428 | 58.145457 |
| Proposed | 7.4694 | 14.7699 | 32.4543 | 7.6206 | 53.125794 |