Abstract
The Jensen-Shannon divergence is a renowned bounded symmetrization of the unbounded Kullback-Leibler divergence which measures the total Kullback-Leibler divergence to the average mixture distribution. However, the Jensen-Shannon divergence between Gaussian distributions is not available in closed form. To bypass this problem, we present a generalization of the Jensen-Shannon (JS) divergence using abstract means which yields closed-form expressions when the mean is chosen according to the parametric family of distributions. More generally, we define the JS-symmetrizations of any distance using parameter mixtures derived from abstract means. In particular, we first show that the geometric mean is well-suited for exponential families, and report two closed-form formulas for (i) the geometric Jensen-Shannon divergence between probability densities of the same exponential family; and (ii) the geometric JS-symmetrization of the reverse Kullback-Leibler divergence between probability densities of the same exponential family. As a second illustrating example, we show that the harmonic mean is well-suited for the scale Cauchy distributions, and report a closed-form formula for the harmonic Jensen-Shannon divergence between scale Cauchy distributions. Applications to clustering with respect to these novel Jensen-Shannon divergences are touched upon.
Keywords: Bhattacharyya distance; Bregman divergence; Cauchy scale family; Gaussian family; Jeffreys divergence; Jensen/Burbea–Rao divergence; Jensen–Shannon divergence; abstract weighted mean; clustering; exponential family; f-divergence; mixture family; quasi-arithmetic mean; resistor average distance; statistical M-mixture
Year: 2019 PMID: 33267199 PMCID: PMC7514974 DOI: 10.3390/e21050485
Source DB: PubMed Journal: Entropy (Basel) ISSN: 1099-4300 Impact factor: 2.524
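Closed form (ii) of the abstract can be checked numerically for univariate Gaussians, which form an exponential family with log-normalizer $F$. The sketch below is illustrative and not taken from the paper: the Gaussian parameters and skew $\alpha$ are arbitrary, and it verifies that the geometric JS-symmetrization of the reverse Kullback-Leibler divergence equals the skew Jensen divergence $J_F^\alpha$ between the natural parameters.

```python
import math

def F(theta):
    # Log-normalizer of the univariate Gaussian exponential family,
    # with natural parameters theta = (mu/sigma^2, -1/(2*sigma^2)).
    t1, t2 = theta
    return -t1 * t1 / (4 * t2) + 0.5 * math.log(-math.pi / t2)

def natural(mu, s):
    # Natural parameters of N(mu, s), variance s.
    return (mu / s, -1.0 / (2.0 * s))

def ordinary(theta):
    # Back to (mean, variance).
    t1, t2 = theta
    s = -1.0 / (2.0 * t2)
    return (t1 * s, s)

def kl_gauss(mu0, s0, mu1, s1):
    # Closed-form KL(N(mu0, s0) : N(mu1, s1)), variances s0 and s1.
    return 0.5 * (math.log(s1 / s0) + (s0 + (mu0 - mu1) ** 2) / s1 - 1.0)

def skew_jensen(theta1, theta2, a):
    # Skew Jensen divergence J_F^a(theta1 : theta2) induced by F.
    ta = tuple((1 - a) * u + a * v for u, v in zip(theta1, theta2))
    return (1 - a) * F(theta1) + a * F(theta2) - F(ta)

# Two arbitrary Gaussians and a skew parameter.
a = 0.3
t1, t2 = natural(0.0, 1.0), natural(2.0, 0.5)

# The normalized geometric mixture p^{1-a} q^a stays in the family:
# its natural parameter is the interpolation of t1 and t2.
ta = tuple((1 - a) * u + a * v for u, v in zip(t1, t2))
mu_a, s_a = ordinary(ta)
mu1, s1 = ordinary(t1)
mu2, s2 = ordinary(t2)

# Geometric JS-symmetrization of the reverse KL:
# (1-a) KL(m : p) + a KL(m : q), with m the geometric mixture.
lhs = (1 - a) * kl_gauss(mu_a, s_a, mu1, s1) + a * kl_gauss(mu_a, s_a, mu2, s2)
rhs = skew_jensen(t1, t2, a)
print(abs(lhs - rhs) < 1e-9)  # the two closed forms agree
```

The agreement follows because $\mathrm{KL}(p_{\theta}:p_{\theta'})=B_F(\theta':\theta)$ for densities of the same exponential family, and the gradient terms of the two Bregman divergences cancel at the interpolated parameter.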
Summary of Distances and Their Notations.

| Name | Notation / Definition |
|---|---|
| Weighted mean | $M_\alpha(x,y)$, with $\alpha\in[0,1]$ |
| Arithmetic mean | $A_\alpha(x,y)=(1-\alpha)x+\alpha y$ |
| Geometric mean | $G_\alpha(x,y)=x^{1-\alpha}y^{\alpha}$ |
| Harmonic mean | $H_\alpha(x,y)=\frac{xy}{(1-\alpha)y+\alpha x}$ |
| Power mean | $P_\alpha^{p}(x,y)=\left((1-\alpha)x^{p}+\alpha y^{p}\right)^{1/p}$ |
| Quasi-arithmetic mean | $M_\alpha^{h}(x,y)=h^{-1}\left((1-\alpha)h(x)+\alpha h(y)\right)$ |
| Dual/reverse distance | $D^{*}(p:q)=D(q:p)$ |
| Kullback-Leibler divergence | $\mathrm{KL}(p:q)=\int p\log\frac{p}{q}\,\mathrm{d}\mu$ |
| reverse Kullback-Leibler divergence | $\mathrm{KL}^{*}(p:q)=\mathrm{KL}(q:p)$ |
| Jeffreys divergence | $J(p,q)=\mathrm{KL}(p:q)+\mathrm{KL}(q:p)$ |
| Resistor average distance | $\frac{1}{R(p,q)}=\frac{1}{\mathrm{KL}(p:q)}+\frac{1}{\mathrm{KL}(q:p)}$ |
| skew $K$-divergence | $K_\alpha(p:q)=\mathrm{KL}\left(p:(1-\alpha)p+\alpha q\right)$ |
| Jensen-Shannon divergence | $\mathrm{JS}(p,q)=\frac{1}{2}\mathrm{KL}\left(p:\frac{p+q}{2}\right)+\frac{1}{2}\mathrm{KL}\left(q:\frac{p+q}{2}\right)$ |
| skew Bhattacharyya divergence | $B_\alpha(p:q)=-\log\int p^{1-\alpha}q^{\alpha}\,\mathrm{d}\mu$ |
| Hellinger distance | $H(p,q)=\sqrt{1-\int\sqrt{pq}\,\mathrm{d}\mu}$ |
| Mahalanobis distance | $\Delta_Q(x,y)=\sqrt{(x-y)^{\top}Q(x-y)}$, for positive-definite $Q$ |
| Bregman divergence | $B_F(\theta_1:\theta_2)=F(\theta_1)-F(\theta_2)-\langle\theta_1-\theta_2,\nabla F(\theta_2)\rangle$ |
| skew Jeffreys-Bregman divergence | $(1-\alpha)B_F(\theta_1:\theta_2)+\alpha B_F(\theta_2:\theta_1)$ |
| skew Jensen divergence | $J_F^{\alpha}(\theta_1:\theta_2)=(1-\alpha)F(\theta_1)+\alpha F(\theta_2)-F\left((1-\alpha)\theta_1+\alpha\theta_2\right)$ |
| Jensen-Bregman divergence | $\mathrm{JB}_F(\theta_1,\theta_2)=\frac{1}{2}B_F\left(\theta_1:\frac{\theta_1+\theta_2}{2}\right)+\frac{1}{2}B_F\left(\theta_2:\frac{\theta_1+\theta_2}{2}\right)$ |
| skew J-symmetrization of $D$ | $J_D^{\alpha}(p:q)=(1-\alpha)D(p:q)+\alpha D(q:p)$ |
| skew JS-symmetrization of $D$ | $\mathrm{JS}_D^{M_\alpha}(p:q)=(1-\alpha)D\left(p:(pq)_\alpha^{M}\right)+\alpha D\left(q:(pq)_\alpha^{M}\right)$, with $(pq)_\alpha^{M}$ the normalized $M_\alpha$-mixture |
Summary of the weighted means $M_\alpha$ chosen according to the parametric family in order to ensure that the family is closed under $M_\alpha$-mixturing.

| Mean | Parametric Family |
|---|---|
| arithmetic $A_\alpha$ | mixture family |
| geometric $G_\alpha$ | exponential family |
| harmonic $H_\alpha$ | Cauchy scale family |
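The geometric row of the table above can be illustrated concretely: the normalized geometric mixture $p^{1-\alpha}q^{\alpha}$ of two Gaussians is again a Gaussian, whose natural parameter is the $\alpha$-interpolation of the two natural parameters. A minimal numerical sketch (the Gaussian parameters and $\alpha$ below are arbitrary choices for illustration, not values from the paper):

```python
import math

def gauss_pdf(x, mu, s):
    # Density of N(mu, s) with variance s.
    return math.exp(-(x - mu) ** 2 / (2 * s)) / math.sqrt(2 * math.pi * s)

alpha = 0.25
mu1, s1 = 0.0, 1.0   # p = N(0, 1)
mu2, s2 = 3.0, 4.0   # q = N(3, 4)

# Natural parameters theta = (mu/s, -1/(2s)); interpolate them.
t1 = (mu1 / s1, -1 / (2 * s1))
t2 = (mu2 / s2, -1 / (2 * s2))
ta = ((1 - alpha) * t1[0] + alpha * t2[0],
      (1 - alpha) * t1[1] + alpha * t2[1])
s_a = -1 / (2 * ta[1])   # predicted variance of the geometric mixture
mu_a = ta[0] * s_a       # predicted mean of the geometric mixture

# Normalize p^{1-alpha} q^alpha by a Riemann sum and compare pointwise
# with the predicted Gaussian N(mu_a, s_a).
dx = 1e-3
xs = [-20 + i * dx for i in range(int(40 / dx))]
Z = sum(gauss_pdf(x, mu1, s1) ** (1 - alpha) * gauss_pdf(x, mu2, s2) ** alpha
        for x in xs) * dx
x0 = 1.0
mixture = gauss_pdf(x0, mu1, s1) ** (1 - alpha) * gauss_pdf(x0, mu2, s2) ** alpha / Z
predicted = gauss_pdf(x0, mu_a, s_a)
print(abs(mixture - predicted) < 1e-8)
```

This closure property is what makes the geometric mean "well-suited" for exponential families: the $G_\alpha$-mixture never leaves the family, so the divergence to it inherits the family's closed-form Kullback-Leibler expression.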