Literature DB >> 33919986

On a Variational Definition for the Jensen-Shannon Symmetrization of Distances Based on the Information Radius.

Frank Nielsen

Abstract

We generalize the Jensen-Shannon divergence and the Jensen-Shannon diversity index by considering a variational definition with respect to a generic mean, thereby extending the notion of Sibson's information radius. The variational definition applies to an arbitrary distance and yields a new way to define a Jensen-Shannon symmetrization of distances. When the variational optimization is further constrained to prescribed families of probability measures, we obtain relative Jensen-Shannon divergences and their equivalent Jensen-Shannon symmetrizations of distances, which generalize the concept of information projections. Finally, we touch upon applications of these variational Jensen-Shannon divergences and diversity indices to clustering and quantization tasks for probability measures, including statistical mixtures.
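The variational definition described in the abstract can be illustrated numerically in its classical case: the Jensen-Shannon divergence equals the minimum, over candidate centroids c, of the average Kullback-Leibler divergence from the two input distributions to c (Sibson's information radius), and the minimum is attained at the arithmetic mean. A minimal NumPy sketch, assuming discrete distributions and a scan along the mixture line as the candidate set (the distributions p, q below are illustrative choices, not from the paper):

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence KL(p || q) for discrete distributions."""
    return float(np.sum(p * np.log(p / q)))

def jsd(p, q):
    """Jensen-Shannon divergence: average KL to the arithmetic mean (p+q)/2."""
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def radius_objective(p, q, c):
    """Objective of the variational (information-radius) definition:
    the averaged divergence from p and q to a candidate centroid c."""
    return 0.5 * kl(p, c) + 0.5 * kl(q, c)

p = np.array([0.2, 0.5, 0.3])
q = np.array([0.4, 0.1, 0.5])

# Scan candidate centroids along the mixture line c(t) = (1-t) p + t q.
# The averaged KL is convex in c, so its minimum over the line is attained
# at t = 0.5 (the arithmetic mean), and the minimal value is JSD(p, q).
ts = np.linspace(0.01, 0.99, 99)
vals = [radius_objective(p, q, (1 - t) * p + t * q) for t in ts]
best = ts[int(np.argmin(vals))]  # expected to sit at t = 0.5
```

Swapping `kl` for another divergence in `radius_objective` gives the corresponding Jensen-Shannon symmetrization of that distance, which is the generalization the paper studies.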


Keywords:  Bhattacharyya distance; Bregman divergence; Bregman information; Fenchel–Young divergence; Jensen-Shannon divergence; Rényi entropy; centroid; clustering; diversity index; exponential family; information projection; information radius; q-divergence; q-exponential family

Year:  2021        PMID: 33919986      PMCID: PMC8071043          DOI: 10.3390/e23040464

Source DB:  PubMed          Journal:  Entropy (Basel)        ISSN: 1099-4300            Impact factor:   2.524


References:  4 in total

1.  Integration of stochastic models by minimizing alpha-divergence.

Authors:  Shun-ichi Amari
Journal:  Neural Comput       Date:  2007-10       Impact factor: 2.026

2.  Fisher and Jensen-Shannon divergences: Quantitative comparisons among distributions. Application to position and momentum atomic densities.

Authors:  J Antolín; J C Angulo; S López-Rosa
Journal:  J Chem Phys       Date:  2009-02-21       Impact factor: 3.488

3.  Simplifying mixture models through function approximation.

Authors:  James T Kwok; Kai Zhang
Journal:  IEEE Trans Neural Netw       Date:  2010-02-22

4.  On Voronoi Diagrams on the Information-Geometric Cauchy Manifolds.

Authors:  Frank Nielsen
Journal:  Entropy (Basel)       Date:  2020-06-28       Impact factor: 2.524

Cited by:  4 in total

1.  Analysis on Optimal Error Exponents of Binary Classification for Source with Multiple Subclasses.

Authors:  Hiroto Kuramata; Hideki Yagi
Journal:  Entropy (Basel)       Date:  2022-04-30       Impact factor: 2.738

2.  α-Geodesical Skew Divergence.

Authors:  Masanari Kimura; Hideitsu Hino
Journal:  Entropy (Basel)       Date:  2021-04-25       Impact factor: 2.524

3.  The unique second wave phenomenon in contrast enhanced ultrasound imaging with nanobubbles.

Authors:  Chuan Chen; Reshani Perera; Michael C Kolios; Hessel Wijkstra; Agata A Exner; Massimo Mischi; Simona Turco
Journal:  Sci Rep       Date:  2022-08-10       Impact factor: 4.996

4.  Statistical Divergences between Densities of Truncated Exponential Families with Nested Supports: Duo Bregman and Duo Jensen Divergences.

Authors:  Frank Nielsen
Journal:  Entropy (Basel)       Date:  2022-03-17       Impact factor: 2.524

