| Literature DB >> 26313604 |
Abstract
We compare an entropy estimator H(z) recently discussed by Zhang (2012) with two estimators, H(1) and H(2), introduced by Grassberger (2003) and Schürmann (2004). We prove the identity H(z) ≡ H(1), which has not been taken into account by Zhang (2012). Then we prove that the systematic error (bias) of H(1) is less than or equal to the bias of the ordinary likelihood (or plug-in) estimator of entropy. Finally, by numerical simulation, we verify that for the most interesting regime of small sample estimation and large event spaces, the estimator H(2) has a significantly smaller statistical error than H(z).
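The plug-in estimator and a bias-corrected estimator of the Grassberger (2003) type can be sketched in a few lines. This is a minimal illustration, not the paper's code: the correction term G(n) below uses the digamma-based form commonly quoted for Grassberger's H(1), and the stdlib-only `digamma` helper is a hypothetical approximation added here so the sketch is self-contained.

```python
import math

def digamma(x):
    # psi(x) via the recurrence psi(x) = psi(x+1) - 1/x, then an
    # asymptotic series (accurate to ~1e-8 for x >= 6).
    s = 0.0
    while x < 6.0:
        s -= 1.0 / x
        x += 1.0
    return (s + math.log(x) - 1.0 / (2.0 * x)
            - 1.0 / (12.0 * x**2) + 1.0 / (120.0 * x**4) - 1.0 / (252.0 * x**6))

def plugin_entropy(counts):
    # Ordinary likelihood ("plug-in") estimator: H = -sum p_i log p_i, p_i = n_i / N.
    n_total = sum(counts)
    return -sum(n / n_total * math.log(n / n_total) for n in counts if n > 0)

def grassberger_entropy(counts):
    # Grassberger-type estimator: H = log N - (1/N) sum_i n_i G(n_i),
    # with G(n) = psi(n) + (1/2)(-1)^n [psi((n+1)/2) - psi(n/2)]
    # (an assumption here; check Grassberger 2003 for the exact form).
    n_total = sum(counts)
    total = 0.0
    for n in counts:
        if n > 0:
            g = digamma(n) + 0.5 * (-1) ** n * (digamma((n + 1) / 2) - digamma(n / 2))
            total += n * g
    return math.log(n_total) - total / n_total
```

On undersampled data the plug-in estimate is biased downward; for example, the count vector [1, 1, 2, 3] gives roughly 1.277 nats under the plug-in estimator, while the corrected estimator returns a larger value, illustrating the bias reduction the abstract proves for H(1).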
Year: 2015 PMID: 26313604 DOI: 10.1162/NECO_a_00775
Source DB: PubMed Journal: Neural Comput ISSN: 0899-7667 Impact factor: 2.026