Pengcheng Yue, Hua Qu, Jihong Zhao, Meng Wang.
Abstract
This paper provides a novel Newtonian-type optimization method for robust adaptive filtering inspired by information theory learning. With the traditional minimum mean square error (MMSE) criterion replaced by criteria like the maximum correntropy criterion (MCC) or generalized maximum correntropy criterion (GMCC), adaptive filters assign less emphasis on the outlier data, thus become more robust against impulsive noises. The optimization methods adopted in current MCC-based LMS-type and RLS-type adaptive filters are gradient descent method and fixed point iteration, respectively. However, in this paper, a Newtonian-type method is introduced as a novel method for enhancing the existing body of knowledge of MCC-based adaptive filtering and providing a fast convergence rate. Theoretical analysis of the steady-state performance of the algorithm is carried out and verified by simulations. The experimental results show that, compared to the conventional MCC adaptive filter, the MCC-based Newtonian-type method converges faster and still maintains a good steady-state performance under impulsive noise. The practicability of the algorithm is also verified in the experiment of acoustic echo cancellation.Entities:
Keywords: Newtonian method; acoustic echo cancellation; maximum correntropy criterion; robust adaptive filter; steady-state performance analysis
Year: 2020 PMID: 33286691 PMCID: PMC7597172 DOI: 10.3390/e22090922
Source DB: PubMed Journal: Entropy (Basel) ISSN: 1099-4300 Impact factor: 2.524
Figure 1 Comparison of different loss functions.
Figure 2 Weighting functions of different criteria.
Figure 3 The performance of different algorithms under Gaussian noise.
Figure 4 The performance of different algorithms under α-stable noise.
Figure 5 The performance of the MCC-Newton algorithm with different kernel widths under Gaussian noise.
Figure 6 The performance of the MCC-Newton algorithm with different kernel widths under α-stable noise.
Figure 7 The weight tracks for the gradient descent-based algorithm (MCC_GD) and the Newtonian-type algorithm (ideal MCC_Newton and MCC_Newton) on the correntropy performance surface.
Figure 8 Theoretical and simulated excess mean square errors (EMSEs) versus the step size.
Figure 9 Theoretical and simulated EMSEs versus the kernel width.
Figure 10 Theoretical and simulated EMSEs versus the noise variance.
Figure 11 Effect of the MCC-Newton algorithm and the echo return loss enhancement (ERLE) performance.
Figure 12 ERLE of different algorithms.