| Literature DB >> 33562882 |
Abstract
Over the last six decades, the representation of error exponent functions for data transmission through noisy channels at rates below capacity has seen three distinct approaches: (1) through Gallager's E0 functions (with and without cost constraints); (2) in large deviations form, in terms of conditional relative entropy and mutual information; (3) through the α-mutual information and the Augustin–Csiszár mutual information of order α, derived from the Rényi divergence [...].
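As context for the abstract's third approach, a minimal sketch of the Rényi divergence of order α for discrete distributions, from which the α-information measures mentioned above are derived (the function name and example distributions are illustrative, not from the paper):

```python
import math

def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(P||Q) in nats between two discrete
    distributions given as probability lists over the same alphabet.
    At alpha == 1 it reduces to the Kullback-Leibler divergence."""
    if alpha == 1.0:
        # KL divergence: sum of p(x) * log(p(x)/q(x)) over the support of P
        return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    # General case: (1/(alpha-1)) * log sum p(x)^alpha * q(x)^(1-alpha)
    s = sum(pi ** alpha * qi ** (1 - alpha) for pi, qi in zip(p, q) if pi > 0)
    return math.log(s) / (alpha - 1)

# Hypothetical example distributions on a binary alphabet
p = [0.5, 0.5]
q = [0.9, 0.1]
print(renyi_divergence(p, q, 0.5))
print(renyi_divergence(p, q, 1.0))  # KL divergence
```

The divergence is nondecreasing in α, so values at small α lower-bound those at large α.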
Keywords: Augustin–Csiszár mutual information; Rényi divergence; data transmission; error exponents; information measures; large deviations; mutual information; relative entropy; α-mutual information
Year: 2021 PMID: 33562882 DOI: 10.3390/e23020199
Source DB: PubMed Journal: Entropy (Basel) ISSN: 1099-4300 Impact factor: 2.524