| Literature DB >> 33267491 |
Two Measures of Dependence
Amos Lapidoth, Christoph Pfister.
Abstract
Two families of dependence measures between random variables are introduced. They are based on the Rényi divergence of order α and the relative α -entropy, respectively, and both dependence measures reduce to Shannon's mutual information when their order α is one. The first measure shares many properties with the mutual information, including the data-processing inequality, and can be related to the optimal error exponents in composite hypothesis testing. The second measure does not satisfy the data-processing inequality, but appears naturally in the context of distributed task encoding.Entities:
Keywords: Rényi divergence; Rényi entropy; data processing; dependence measure; relative α-entropy
Year: 2019 PMID: 33267491 PMCID: PMC7515307 DOI: 10.3390/e21080778
Source DB: PubMed Journal: Entropy (Basel) ISSN: 1099-4300 Impact factor: 2.524
Figure 1. (Left and Right) The dependence measures plotted versus the order α. In both plots, X is Bernoulli, and Y is equal to X.
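To make the Figure 1 setup concrete, here is a minimal numerical sketch (Python with NumPy) that evaluates D_α(P_XY ‖ P_X × P_Y) for a Bernoulli X with Y = X. This is an illustrative Rényi analogue of the mutual information, not necessarily the exact quantities plotted in the figure (the paper's measures involve a further optimization over distributions), and the parameter p_x = 0.3 is a placeholder, since the caption's Bernoulli parameter did not survive extraction.

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(p || q) between discrete distributions, in nats.

    Assumes q > 0 wherever p > 0 (true for the example below).
    """
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0  # restrict to the support of p (0 * log 0 = 0 convention)
    if np.isclose(alpha, 1.0):
        # alpha -> 1 limit: Kullback-Leibler divergence
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))
    return float(np.log(np.sum(p[mask] ** alpha * q[mask] ** (1.0 - alpha)))
                 / (alpha - 1.0))

# Setup as in Figure 1: X ~ Bernoulli(p_x) and Y = X, so the joint mass
# sits entirely on (0,0) and (1,1). p_x = 0.3 is a placeholder value.
p_x = 0.3
joint = np.array([1.0 - p_x, 0.0, 0.0, p_x])        # P_XY over (0,0),(0,1),(1,0),(1,1)
product = np.array([(1 - p_x) ** 2, (1 - p_x) * p_x,
                    p_x * (1 - p_x), p_x ** 2])     # P_X x P_Y

for alpha in [0.5, 1.0, 2.0]:
    d = renyi_divergence(joint, product, alpha)
    print(f"alpha={alpha}: D_alpha(P_XY || P_X x P_Y) = {d:.4f} nats")
# At alpha = 1 this equals Shannon's I(X;Y), which here is H(X) because Y = X.
```

Sweeping alpha over a grid and plotting the result reproduces the qualitative shape of curves like those in Figure 1: a quantity that increases with α and passes through I(X;Y) = H(X) at α = 1.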