
Nonparametric estimation of Kullback-Leibler divergence.

Zhiyi Zhang, Michael Grabchak.

Abstract

In this letter, we introduce an estimator of Kullback-Leibler divergence based on two independent samples. We show that on any finite alphabet, this estimator has an exponentially decaying bias and that it is consistent and asymptotically normal. To explain the importance of this estimator, we provide a thorough analysis of the more standard plug-in estimator. We show that it is consistent and asymptotically normal, but with an infinite bias. Moreover, if we modify the plug-in estimator to remove the rare events that cause the bias to become infinite, the bias still decays at a rate no faster than O(1/n). Further, we extend our results to estimating the symmetrized Kullback-Leibler divergence. We conclude by providing simulation results, which show that the asymptotic properties of these estimators hold even for relatively small sample sizes.
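
For context, the plug-in estimator analyzed in the abstract is the empirical-frequency version of D(p || q) = sum_k p_k log(p_k / q_k) over a finite alphabet. Below is a minimal sketch, assuming two independent samples drawn from the same finite alphabet; the function names and the symmetrized variant are illustrative only and do not reproduce the authors' bias-corrected estimator.

    # Plug-in estimates of KL divergence from two independent samples.
    # Names are illustrative, not from the paper.
    import math
    from collections import Counter

    def plug_in_kl(sample_p, sample_q):
        """Plug-in estimate of D(p || q) from empirical frequencies."""
        n_p, n_q = len(sample_p), len(sample_q)
        freq_p, freq_q = Counter(sample_p), Counter(sample_q)
        kl = 0.0
        for symbol, count_p in freq_p.items():
            p_hat = count_p / n_p
            q_hat = freq_q.get(symbol, 0) / n_q
            if q_hat == 0.0:
                # Unseen (rare) events in the second sample make the
                # plug-in estimate, and hence its bias, infinite.
                return math.inf
            kl += p_hat * math.log(p_hat / q_hat)
        return kl

    def plug_in_symmetrized_kl(sample_p, sample_q):
        """Plug-in estimate of the symmetrized divergence D(p||q) + D(q||p)."""
        return plug_in_kl(sample_p, sample_q) + plug_in_kl(sample_q, sample_p)

    # Example: plug_in_kl(list("aabbc"), list("abbbc"))

The early return when a symbol is absent from the second sample mirrors the abstract's point that rare events drive the plug-in estimator's bias to infinity.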

MeSH:

Year:  2014        PMID: 25058703     DOI: 10.1162/NECO_a_00646

Source DB:  PubMed          Journal:  Neural Comput        ISSN: 0899-7667            Impact factor:   2.026


  3 in total

1.  Minimax Rate-optimal Estimation of KL Divergence between Discrete Distributions.

Authors:  Yanjun Han; Jiantao Jiao; Tsachy Weissman
Journal:  Int Symp Inf Theory Appl       Date:  2016

2.  Selecting an Effective Entropy Estimator for Short Sequences of Bits and Bytes with Maximum Entropy.

Authors:  Lianet Contreras Rodríguez; Evaristo José Madarro-Capó; Carlos Miguel Legón-Pérez; Omar Rojas; Guillermo Sosa-Gómez
Journal:  Entropy (Basel)       Date:  2021-04-30       Impact factor: 2.524

3.  Empirical Estimation of Information Measures: A Literature Guide.

Authors:  Sergio Verdú
Journal:  Entropy (Basel)       Date:  2019-07-24       Impact factor: 2.524

