
A Note on Entropy Estimation.

Thomas Schürmann

Abstract

We compare an entropy estimator H(z) recently discussed by Zhang (2012) with two estimators, H(1) and H(2), introduced by Grassberger (2003) and Schürmann (2004). We prove the identity H(z) ≡ H(1), which was not taken into account by Zhang (2012). We then prove that the systematic error (bias) of H(1) is less than or equal to the bias of the ordinary likelihood (or plug-in) estimator of entropy. Finally, by numerical simulation, we verify that in the most interesting regime of small-sample estimation over large event spaces, the estimator H(2) has a significantly smaller statistical error than H(z).
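As a point of reference for the comparison above, the plug-in (maximum-likelihood) estimator mentioned in the abstract can be sketched as below. The Miller–Madow correction is included only as a simple, well-known illustration of bias reduction; it is not the H(1) or H(2) estimator analyzed in the paper, and the function names are our own.

```python
import math

def plugin_entropy(counts):
    """Maximum-likelihood ("plug-in") entropy estimate in nats.

    counts: list of observed frequencies for each symbol.
    """
    n = sum(counts)
    return -sum((c / n) * math.log(c / n) for c in counts if c > 0)

def miller_madow_entropy(counts):
    """Plug-in estimate plus the first-order Miller-Madow bias correction.

    The plug-in estimator underestimates entropy on average; adding
    (K - 1) / (2N), with K observed symbols and N samples, removes the
    leading-order term of that bias.
    """
    n = sum(counts)
    k = sum(1 for c in counts if c > 0)  # number of observed symbols
    return plugin_entropy(counts) + (k - 1) / (2 * n)

# Example: 8 samples spread uniformly over 4 symbols.
counts = [2, 2, 2, 2]
print(plugin_entropy(counts))        # ln(4) ~ 1.386 nats
print(miller_madow_entropy(counts))  # ln(4) + 3/16 ~ 1.574 nats
```

The bias issue is most severe exactly in the regime the abstract highlights: small N relative to the number of possible symbols, where many symbols are unobserved and the plug-in estimate can fall well below the true entropy.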


Year: 2015 | PMID: 26313604 | DOI: 10.1162/NECO_a_00775

Source DB: PubMed | Journal: Neural Comput | ISSN: 0899-7667 | Impact factor: 2.026

Cited by (1 in total):

1.  Selecting an Effective Entropy Estimator for Short Sequences of Bits and Bytes with Maximum Entropy.

Authors:  Lianet Contreras Rodríguez; Evaristo José Madarro-Capó; Carlos Miguel Legón-Pérez; Omar Rojas; Guillermo Sosa-Gómez
Journal:  Entropy (Basel)       Date:  2021-04-30       Impact factor: 2.524

