Yung-Kyun Noh, Masashi Sugiyama, Song Liu, Marthinus C. du Plessis, Frank Chongwoo Park, Daniel D. Lee.
Abstract
Nearest-neighbor estimators for the Kullback-Leibler (KL) divergence that are asymptotically unbiased have recently been proposed and demonstrated in a number of applications. However, with a small number of samples, nonparametric methods typically suffer from large estimation bias due to the nonlocality of information derived from nearest-neighbor statistics. In this letter, we show that this estimation bias can be mitigated by modifying the metric function, and we propose a novel method for learning a locally optimal Mahalanobis distance function from parametric generative models of the underlying density distributions. Using both simulations and experiments on a variety of data sets, we demonstrate that this interplay between approximate generative models and nonparametric techniques can significantly improve the accuracy of nearest-neighbor-based estimation of the KL divergence.
Year: 2018 PMID: 29902113 DOI: 10.1162/neco_a_01092
Source DB: PubMed Journal: Neural Comput ISSN: 0899-7667 Impact factor: 2.026
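For orientation, the sketch below shows the standard asymptotically unbiased k-NN KL estimator that the abstract builds on, with an optional global Mahalanobis metric applied by transforming the data. The function name knn_kl_divergence is illustrative, not from the paper, and the paper's actual contribution, learning a locally optimal metric from parametric generative models, is not reproduced here.

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_kl_divergence(X, Y, k=1, A=None):
    """k-NN estimate of KL(p || q) from samples X ~ p and Y ~ q.

    Implements the classical nearest-neighbor estimator
        D = (d/n) * sum_i log(nu_k(i) / rho_k(i)) + log(m / (n - 1)),
    where rho_k(i) is the distance from x_i to its k-th nearest
    neighbor in X \\ {x_i} and nu_k(i) its k-th NN distance in Y.
    A, if given, is a positive-definite matrix defining a *global*
    Mahalanobis metric (a simplification of the paper's local one).
    """
    X = np.asarray(X, dtype=float)
    Y = np.asarray(Y, dtype=float)
    n, d = X.shape
    m = Y.shape[0]
    if A is not None:
        # d_A(u, v) = sqrt((u-v)^T A (u-v)) equals the Euclidean
        # distance after mapping z -> L^T z, where A = L L^T.
        L = np.linalg.cholesky(A)
        X = X @ L
        Y = Y @ L
    # k-th NN distance within X; column 0 is the point itself.
    rho = cKDTree(X).query(X, k=k + 1)[0][:, k]
    # k-th NN distance from each x_i to the sample drawn from q.
    nu = cKDTree(Y).query(X, k=k)[0]
    if k > 1:
        nu = nu[:, k - 1]
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1.0))
```

With few samples this estimator exhibits the bias the letter targets; the proposed method reduces it by choosing A locally at each query point using approximate generative models of p and q, rather than fixing one global A as above.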