Abstract
We generalize the Jensen-Shannon divergence and the Jensen-Shannon diversity index by considering a variational definition with respect to a generic mean, thereby extending the notion of Sibson's information radius. The variational definition applies to any arbitrary distance and yields a new way to define a Jensen-Shannon symmetrization of distances. When the variational optimization is further constrained to belong to prescribed families of probability measures, we get relative Jensen-Shannon divergences and their equivalent Jensen-Shannon symmetrizations of distances that generalize the concept of information projections. Finally, we touch upon applications of these variational Jensen-Shannon divergences and diversity indices to clustering and quantization tasks of probability measures, including statistical mixtures.
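To make the variational viewpoint concrete, here is a minimal sketch (not code from the paper; function names and the finite candidate grid are illustrative assumptions). The classical Jensen-Shannon divergence averages the KL divergences of `p` and `q` to their arithmetic mixture; the variational symmetrization instead minimizes the average distance to a center, here approximated over a finite grid of candidate centers rather than by exact optimization.

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence KL(p||q) for discrete distributions."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0  # terms with p_i = 0 contribute 0 by convention
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def jensen_shannon(p, q):
    """Classical JSD: average KL divergence to the arithmetic mean (p+q)/2."""
    m = 0.5 * (np.asarray(p, float) + np.asarray(q, float))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def variational_js(p, q, distance, candidates):
    """Variational JS-symmetrization of a generic distance D:
    min over candidate centers c of (D(p,c) + D(q,c)) / 2.
    Minimized here over a finite set of candidates (a crude
    stand-in for the exact variational optimum)."""
    return min(0.5 * (distance(p, c) + distance(q, c)) for c in candidates)

# For distance = KL, the exact minimizing center is the arithmetic mean,
# so a grid containing (p+q)/2 recovers the classical JSD.
p = np.array([0.5, 0.5])
q = np.array([0.9, 0.1])
grid = [lam * p + (1.0 - lam) * q for lam in np.linspace(0.0, 1.0, 101)]
print(jensen_shannon(p, q), variational_js(p, q, kl, grid))
```

Swapping `distance` (e.g., a Bregman divergence) or constraining `candidates` to a prescribed family of measures mirrors the relative Jensen-Shannon divergences and information projections discussed in the abstract.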
Keywords: Bhattacharyya distance; Bregman divergence; Bregman information; Fenchel–Young divergence; Jensen-Shannon divergence; Rényi entropy; centroid; clustering; diversity index; exponential family; information projection; information radius; q-divergence; q-exponential family
Year: 2021 PMID: 33919986 PMCID: PMC8071043 DOI: 10.3390/e23040464
Source DB: PubMed Journal: Entropy (Basel) ISSN: 1099-4300 Impact factor: 2.524