P Ajay1, B Nagaraj2, R Arun Kumar3, Ruihang Huang4, P Ananthi5.
Abstract
Hyperspectral microscopy in biology and minerals, unsupervised deep learning neural network denoising SRS photos: hyperspectral resolution enhancement and denoising one hyperspectral picture is enough to teach unsupervised method. An intuitive chemical species map for a lithium ore sample is produced using k-means clustering. Many researchers are now interested in biosignals. Uncertainty limits the algorithms' capacity to evaluate these signals for further information. Even while AI systems can answer puzzles, they remain limited. Deep learning is used when machine learning is inefficient. Supervised learning needs a lot of data. Deep learning is vital in modern AI. Supervised learning requires a large labeled dataset. The selection of parameters prevents over- or underfitting. Unsupervised learning is used to overcome the challenges outlined above (performed by the clustering algorithm). To accomplish this, two processing processes were used: (1) utilizing nonlinear deep learning networks to turn data into a latent feature space (Z). The Kullback-Leibler divergence is used to test the objective function convergence. This article explores a novel research on hyperspectral microscopic picture using deep learning and effective unsupervised learning.Entities:
Year: 2022 PMID: 35800209 PMCID: PMC9192273 DOI: 10.1155/2022/1200860
Source DB: PubMed Journal: Scanning ISSN: 0161-0457 Impact factor: 1.750
Figure 1. Deep clustering network with stacked autoencoder.
Figure 2. Microscopic dataset list.
Figure 3. Contingency matrix table.
Figure 4. Microscopic segmented dataset list.
Figure 5. Comparison of clustering algorithms (noisy data).
Figure 6. Comparison of clustering algorithms (noiseless data).
Comparison of clustering algorithms on noisy data.
| Data | FCM | FCM_S | En-FCM | FLICM | M-FLMCM | Proposed DEC |
|---|---|---|---|---|---|---|
| I1 | 62.71 | 72.30 | 67.22 | 75.25 | 81.25 | 83.10 |
| I2 | 62.90 | 71.55 | 67.95 | 76.63 | 81.10 | 83.38 |
| I3 | 61.85 | 73.82 | 67.12 | 75.53 | 80.01 | 82.86 |
| I4 | 62.95 | 74.16 | 66.97 | 74.51 | 79.46 | 83.16 |
| I5 | 60.98 | 70.80 | 67.86 | 74.58 | 78.75 | 82.25 |
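As a quick sanity check, the noisy-data scores above can be averaged per algorithm. The values are transcribed directly from the table; the metric is reported as a percentage (assumed here to be clustering accuracy, which the record does not state explicitly):

```python
# Scores for datasets I1-I5, copied from the noisy-data table.
scores = {
    "FCM":          [62.71, 62.90, 61.85, 62.95, 60.98],
    "FCM_S":        [72.30, 71.55, 73.82, 74.16, 70.80],
    "En-FCM":       [67.22, 67.95, 67.12, 66.97, 67.86],
    "FLICM":        [75.25, 76.63, 75.53, 74.51, 74.58],
    "M-FLMCM":      [81.25, 81.10, 80.01, 79.46, 78.75],
    "Proposed DEC": [83.10, 83.38, 82.86, 83.16, 82.25],
}

# Mean score per algorithm, and the best-performing one.
means = {name: sum(v) / len(v) for name, v in scores.items()}
best = max(means, key=means.get)
```

Averaging confirms the ordering the table suggests: the proposed DEC method scores highest on every noisy dataset.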
Comparison of clustering algorithms on noiseless data.
| Data | FCM | FCM_S | En-FCM | FLICM | M-FLMCM | Proposed DEC |
|---|---|---|---|---|---|---|
| I1 | 84.53 | 84.88 | 84.99 | 82.74 | 84.89 | 89.026 |
| I2 | 83.81 | 83.52 | 84.28 | 82.13 | 84.56 | 82.726 |
| I3 | 83.21 | 82.07 | 81.40 | 82.34 | 84.57 | 90.811 |
| I4 | 82.24 | 84.79 | 84.13 | 82.91 | 84.31 | 83.563 |
| I5 | 81.90 | 83.96 | 84.00 | 82.19 | 82.33 | 84.999 |
Comparison of clustering algorithms at different noise variances.
| Noise variance | FCM | FCM_S | En-FCM | FLICM | M-FLMCM | Proposed DEC |
|---|---|---|---|---|---|---|
| 0.2 | 81.80 | 83.16 | 82.15 | 83.26 | 83.62 | 84.26 |
| 0.4 | 78.08 | 76.12 | 78.30 | 78.97 | 76.06 | 79.50 |
| 0.6 | 71.91 | 73.59 | 71.70 | 72.25 | 71.97 | 75.00 |
| 0.8 | 68.70 | 67.53 | 67.44 | 66.52 | 67.39 | 70.77 |
Figure 7. Comparison of clustering algorithms at different noise variances.