| Literature DB >> 33286090 |
Cédric Bleuler, Amos Lapidoth, Christoph Pfister.
Abstract
Motivated by a horse betting problem, a new conditional Rényi divergence is introduced. It is compared with the conditional Rényi divergences that appear in the definitions of the dependence measures by Csiszár and Sibson, and the properties of all three are studied with emphasis on their behavior under data processing. In the same way that Csiszár's and Sibson's conditional divergences lead to their respective dependence measures, the new conditional divergence leads to the Lapidoth-Pfister mutual information. Moreover, the new conditional divergence is also related to the Arimoto-Rényi conditional entropy and to Arimoto's measure of dependence. In the second part of the paper, the horse betting problem is analyzed where, instead of Kelly's expected log-wealth criterion, a more general family of power-mean utility functions is considered. The key role in the analysis is played by the Rényi divergence, and in the setting where the gambler has access to side information, the new conditional Rényi divergence is key. The setting with side information also provides another operational meaning to the Lapidoth-Pfister mutual information. Finally, a universal strategy for independent and identically distributed races is presented that, without knowing the winning probabilities or the parameter of the utility function, asymptotically maximizes the gambler's utility.
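The two quantities at the heart of the abstract, the Rényi divergence of order α and Kelly's expected log-wealth criterion, can be illustrated with a minimal sketch. This is not code from the paper; the function names and the example probabilities and odds are assumptions chosen for illustration, assuming fair odds and no track take.

```python
import math

def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(p || q) in nats, for alpha > 0, alpha != 1.
    D_alpha(p || q) = (1 / (alpha - 1)) * log(sum_i p_i^alpha * q_i^(1 - alpha))."""
    s = sum(pi ** alpha * qi ** (1.0 - alpha) for pi, qi in zip(p, q))
    return math.log(s) / (alpha - 1.0)

def kelly_log_wealth(p, odds):
    """Kelly's criterion: with no track take, betting the fraction b_i = p_i on
    horse i maximizes the expected log-wealth E[log(b_X * o_X)], which for
    proportional betting equals sum_i p_i * log(p_i * o_i)."""
    return sum(pi * math.log(pi * oi) for pi, oi in zip(p, odds))

# Hypothetical example: three horses with winning probabilities p and
# bookmaker odds o_i = 1 / q_i implied by a slightly different distribution q.
p = [0.5, 0.3, 0.2]
odds = [2.0, 3.0, 6.0]
growth = kelly_log_wealth(p, odds)  # positive when p beats the odds-implied q
```

As α → 1 the Rényi divergence recovers the Kullback-Leibler divergence, which is exactly the growth-rate gap that appears in Kelly's log-wealth analysis; the paper's power-mean utilities generalize this role to general α.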
Keywords: Kelly gambling; Rényi divergence; Rényi mutual information; conditional Rényi divergence; horse betting
Year: 2020 PMID: 33286090 PMCID: PMC7516775 DOI: 10.3390/e22030316
Source DB: PubMed Journal: Entropy (Basel) ISSN: 1099-4300 Impact factor: 2.524