
Bias Reduction and Metric Learning for Nearest-Neighbor Estimation of Kullback-Leibler Divergence.

Yung-Kyun Noh, Masashi Sugiyama, Song Liu, Marthinus C du Plessis, Frank Chongwoo Park, Daniel D Lee.

Abstract

Nearest-neighbor estimators for the Kullback-Leibler (KL) divergence that are asymptotically unbiased have recently been proposed and demonstrated in a number of applications. However, with a small number of samples, nonparametric methods typically suffer from large estimation bias due to the nonlocality of information derived from nearest-neighbor statistics. In this letter, we show that this estimation bias can be mitigated by modifying the metric function, and we propose a novel method for learning a locally optimal Mahalanobis distance function from parametric generative models of the underlying density distributions. Using both simulations and experiments on a variety of data sets, we demonstrate that this interplay between approximate generative models and nonparametric techniques can significantly improve the accuracy of nearest-neighbor-based estimation of the KL divergence.
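
For reference, below is a minimal sketch of the standard k-nearest-neighbor KL divergence estimator that the abstract builds on (the asymptotically unbiased estimator commonly attributed to Wang, Kulkarni, and Verdú). The function name knn_kl_divergence and its parameters are illustrative assumptions; the letter's actual contribution, the locally learned Mahalanobis metric that reduces finite-sample bias, is not reproduced here.

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_kl_divergence(x, y, k=1):
    """Estimate KL(p || q) from samples x ~ p and y ~ q with the plain
    k-NN estimator:  (d/n) * sum_i log(nu_k(i) / rho_k(i)) + log(m / (n-1)),
    where rho_k(i) is the distance from x_i to its k-th neighbor in x and
    nu_k(i) is the distance from x_i to its k-th neighbor in y."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    n, d = x.shape
    m = y.shape[0]

    # k-th neighbor distance within x (query k+1 because x_i is its own 0-th neighbor)
    rho = cKDTree(x).query(x, k=k + 1)[0][:, k]
    # k-th neighbor distance from each x_i to the other sample y
    nu = cKDTree(y).query(x, k=k)[0]
    if k > 1:
        nu = nu[:, k - 1]

    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1.0))
```

With small sample sizes this Euclidean-metric estimator carries the bias discussed in the abstract; the letter's method replaces the Euclidean distance with a Mahalanobis distance learned locally from approximate parametric models of the densities.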

Year:  2018        PMID: 29902113     DOI: 10.1162/neco_a_01092

Source DB:  PubMed          Journal:  Neural Comput        ISSN: 0899-7667            Impact factor:   2.026


  1 in total

1.  Transfer Extreme Learning Machine with Output Weight Alignment.

Authors:  Shaofei Zang; Yuhu Cheng; Xuesong Wang; Yongyi Yan
Journal:  Comput Intell Neurosci       Date:  2021-02-11
