Tao Zhang, Shiyuan Wang, Haonan Zhang, Kui Xiong, Lin Wang.
Abstract
As a nonlinear similarity measure defined in the reproducing kernel Hilbert space (RKHS), the correntropic loss (C-Loss) has been widely applied in robust learning and signal processing, but its highly non-convex nature can degrade performance. To address this issue, the convex kernel risk-sensitive loss (KRL) was proposed to measure similarity in the RKHS; it is a risk-sensitive loss defined as the expectation of an exponential function of the squared estimation error. In this paper, a novel nonlinear similarity measure, the kernel risk-sensitive mean p-power error (KRP), is proposed by incorporating the mean p-power error into the KRL, yielding a generalization of the KRL measure: the KRP with p = 2 reduces to the KRL, and it can outperform the KRL when an appropriate p is chosen in robust learning. Several properties of the KRP are presented and discussed. To improve the robustness of the kernel recursive least squares (KRLS) algorithm and reduce its network size, two robust recursive kernel adaptive filters, the recursive minimum kernel risk-sensitive mean p-power error algorithm (RMKRP) and its quantized version (QRMKRP), are proposed in the RKHS under the minimum kernel risk-sensitive mean p-power error (MKRP) criterion. Monte Carlo simulations confirm the superiority of the proposed RMKRP and its quantized version.
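The abstract describes the KRP as the expectation of an exponential function of a p-power error measured through a kernel. The sketch below illustrates one plausible empirical form consistent with the abstract's statement that p = 2 recovers the KRL; the exact normalization, the parameter names `lam` (risk-sensitive parameter) and `sigma` (kernel bandwidth), and the Gaussian kernel choice are assumptions, not taken from the paper.

```python
import numpy as np

def gaussian_kernel(e, sigma=1.0):
    # Gaussian kernel kappa_sigma(e) = exp(-e^2 / (2 sigma^2)); note that
    # the RKHS distance satisfies ||phi(x) - phi(y)||^2 = 2(1 - kappa(x - y)).
    return np.exp(-(e ** 2) / (2.0 * sigma ** 2))

def krp_loss(errors, p=2.0, lam=0.5, sigma=1.0):
    # Empirical KRP under the assumed form
    #   (1/lam) * mean( exp( lam * (1 - kappa_sigma(e))**(p/2) ) ),
    # which reduces to the KRL when p = 2.  Because kappa is bounded in
    # (0, 1], the loss saturates for large outlier errors, which is the
    # source of the robustness discussed in the abstract.
    k = gaussian_kernel(np.asarray(errors, dtype=float), sigma)
    return np.mean(np.exp(lam * (1.0 - k) ** (p / 2.0))) / lam
```

For zero error the loss equals exp(0)/lam, and for arbitrarily large errors it stays below exp(lam)/lam, unlike the unbounded squared loss.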
Keywords: correntropic; kernel adaptive filters; kernel risk-sensitive mean p-power error; quantized; recursive
Year: 2019 PMID: 33267302 PMCID: PMC7515077 DOI: 10.3390/e21060588
Source DB: PubMed Journal: Entropy (Basel) ISSN: 1099-4300 Impact factor: 2.524
Figure 1. Block diagram of adaptive filtering.
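The adaptive-filtering setup in Figure 1 can be made concrete with a minimal kernel adaptive filter. The following KLMS-style sketch (KLMS is one of the baselines in the tables below, not the authors' RMKRP) shows the common structure: predict with a kernel expansion over stored centers, then grow the dictionary by one error-weighted center per sample. The step size `eta` and bandwidth `sigma` are illustrative values.

```python
import numpy as np

class KLMS:
    """Minimal kernel least-mean-square filter with a Gaussian kernel
    and an unboundedly growing dictionary (hence network size 2000
    after 2000 training samples in the tables below)."""

    def __init__(self, eta=0.5, sigma=1.0):
        self.eta, self.sigma = eta, sigma
        self.centers, self.coeffs = [], []

    def _kernel(self, u, c):
        return np.exp(-np.sum((u - c) ** 2) / (2.0 * self.sigma ** 2))

    def predict(self, u):
        # f(u) = sum_i a_i * kappa(u, c_i)
        return sum(a * self._kernel(u, c)
                   for a, c in zip(self.coeffs, self.centers))

    def update(self, u, d):
        e = d - self.predict(u)            # prediction error
        self.centers.append(np.asarray(u, dtype=float))
        self.coeffs.append(self.eta * e)   # new center weighted by error
        return e
```

Running `update` over a time series (input vector `u`, desired output `d`) yields the online learning curves plotted as MSE in Figures 3 and 5.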
Figure 2. Steady-state MSE of RMKRP in MG time series prediction with different values of p (a) and with different values of the other two free parameters (b, c).
Figure 3. Comparison of the MSEs of KLMS, KMCC, MKRL, KRLS, KRMC, and RMKRP in MG time series prediction (a); comparison of the MSEs of QKLMS, QKMCC, QMKRL, QKRLS, KRMC-NC, and QRMKRP in MG time series prediction (b).
Simulation results of QKLMS, QKMCC, QMKRL, QKRLS, KRMC-NC, KLMS, KMCC, MKRL, KRLS, KRMC, RMKRP, and QRMKRP in MG time series prediction.
| Algorithms | Size | Time (s) | MSE (dB) |
|---|---|---|---|
| KLMS | 2000 | 30.9501 | N/A |
| QKLMS | 28 | 2.1011 | N/A |
| KRLS | 2000 | 58.5358 | N/A |
| QKRLS | 28 | 2.3374 | N/A |
| KMCC | 2000 | 30.8285 | −18.5063 |
| QKMCC | 28 | 2.0995 | −17.8707 |
| MKRL | 2000 | 30.9117 | −18.7312 |
| QMKRL | 28 | 2.1063 | −18.1037 |
| KRMC | 2000 | 58.1229 | −25.1618 |
| KRMC-NC | 462 | 2.8045 | −21.5183 |
| QRMKRP | 28 | 2.3443 | −24.9326 |
| RMKRP | 2000 | 58.2196 | −28.1802 |
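The large gap in network size between the quantized and unquantized algorithms in the table above (28 vs. 2000 centers) comes from a quantization step: instead of always adding a new center, a new input is merged into its nearest existing center whenever it lies within a quantization radius. A minimal sketch of such a nearest-center criterion follows; the name `quantize` and the radius parameter `eps` are illustrative, not from the paper.

```python
import numpy as np

def quantize(u, centers, eps):
    """Return the index of the nearest existing center if it lies within
    radius eps of input u, or -1 if a new center should be created.
    Merging into an existing center (updating its coefficient) instead
    of growing the dictionary is what bounds the network size."""
    if not centers:
        return -1
    dists = [np.linalg.norm(np.asarray(u, float) - c) for c in centers]
    i = int(np.argmin(dists))
    return i if dists[i] <= eps else -1
```

Larger `eps` gives a smaller dictionary (faster, as the Time column shows) at some cost in steady-state MSE, which matches the quantized rows of the table.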
Figure 4. Steady-state MSE of RMKRP in nonlinear system identification with different values of p (a) and with different values of the other two free parameters (b, c).
Figure 5. Comparison of the MSEs of KLMS, KMCC, MKRL, KRLS, KRMC, and RMKRP in nonlinear system identification (a); comparison of the MSEs of QKLMS, QKMCC, QMKRL, QKRLS, KRMC-NC, and QRMKRP in nonlinear system identification (b).
Simulation results of QKLMS, QKMCC, QMKRL, QKRLS, KRMC-NC, KLMS, KMCC, MKRL, KRLS, KRMC, RMKRP, and QRMKRP in nonlinear system identification.
| Algorithms | Size | Time (s) | MSE (dB) |
|---|---|---|---|
| KLMS | 2000 | 21.2447 | N/A |
| QKLMS | 14 | 1.7284 | N/A |
| KRLS | 2000 | 48.6055 | N/A |
| QKRLS | 14 | 1.9643 | N/A |
| KMCC | 2000 | 21.1328 | −19.2330 |
| QKMCC | 14 | 1.7630 | −17.9723 |
| MKRL | 2000 | 21.0313 | −19.5390 |
| QMKRL | 14 | 1.7243 | −18.5748 |
| KRMC | 2000 | 48.7601 | −28.7583 |
| KRMC-NC | 496 | 2.6874 | −23.6710 |
| QRMKRP | 14 | 1.9681 | −27.3128 |
| RMKRP | 2000 | 48.6101 | −34.0790 |