Abstract
We give a brief survey of the literature on the empirical estimation of entropy, differential entropy, relative entropy, mutual information, and related information measures. While these quantities are of central importance in information theory, universal algorithms for their estimation are increasingly important in data science, machine learning, biology, neuroscience, economics, language, and other experimental sciences.
Keywords: empirical estimators; entropy; information measures; mutual information; relative entropy; universal estimation
Year: 2019 PMID: 33267434 PMCID: PMC7515235 DOI: 10.3390/e21080720
Source DB: PubMed Journal: Entropy (Basel) ISSN: 1099-4300 Impact factor: 2.524
Figure 1. Generation of an estimate for entropy, where the middle block applies a fixed function [formula not recovered].
Figure 2. The function [formula not recovered].
Figure 3. Generation of estimates for relative entropy.
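As an illustration of the simplest approach covered by such surveys, the following is a minimal sketch of a plug-in (maximum-likelihood) entropy estimator for discrete data: the unknown distribution is replaced by empirical frequencies before evaluating the Shannon entropy formula. The function name and interface are my own, not taken from the paper.

```python
import math
from collections import Counter

def plugin_entropy(samples):
    """Plug-in entropy estimate in bits: substitute empirical
    frequencies c/n for the unknown probabilities in
    H = -sum p * log2(p)."""
    n = len(samples)
    counts = Counter(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())
```

For example, a balanced two-symbol sample such as `['a', 'b', 'a', 'b']` yields an estimate of 1.0 bit, while a constant sequence yields 0.0. Note that the plug-in estimator is biased downward for small sample sizes, which is one motivation for the bias-corrected and universal estimators the survey reviews.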