Nizar Bouhlel, David Rousseau.
Abstract
This paper introduces a closed-form expression for the Kullback-Leibler divergence (KLD) between two central multivariate Cauchy distributions (MCDs), which have recently been used in various signal and image processing applications where non-Gaussian models are needed. In this overview, the MCDs are surveyed, and some new results and properties of the KLD are derived and discussed. In addition, the KLD for MCDs is shown to be expressible as a function of the Lauricella D-hypergeometric series FD(p). Finally, the numerical value of the closed-form expression is compared with a Monte Carlo sampling approximation of the KLD. The Monte Carlo approximation is shown to converge to the theoretical value as the number of samples goes to infinity.
Keywords: Kullback–Leibler divergence (KLD); Lauricella D-hypergeometric series; Multivariate Cauchy distribution (MCD); multiple power series
Year: 2022 PMID: 35741558 PMCID: PMC9222751 DOI: 10.3390/e24060838
Source DB: PubMed Journal: Entropy (Basel) ISSN: 1099-4300 Impact factor: 2.738
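The abstract compares the closed-form KLD with a Monte Carlo sampling approximation. A minimal sketch of such an estimator in NumPy is below. It uses the standard representation of a central MCD as a multivariate t distribution with one degree of freedom; the function names and sample sizes are illustrative, not the paper's code, and note that for two central MCDs the normalizing constants in the log-density ratio cancel except for the determinant term.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_mcd(Sigma, n, rng):
    """Draw n samples from a central multivariate Cauchy distribution
    with scatter matrix Sigma, via the scale-mixture representation:
    x = L z / |u|, with z ~ N(0, I) and u ~ N(0, 1)."""
    p = Sigma.shape[0]
    L = np.linalg.cholesky(Sigma)
    z = rng.standard_normal((n, p))
    u = np.abs(rng.standard_normal((n, 1)))
    return (z @ L.T) / u

def log_density_ratio(x, Sigma1, Sigma2):
    """log f1(x) - log f2(x) for two central MCDs; the gamma/pi
    normalizing constants cancel, leaving only determinant and
    quadratic-form terms."""
    p = Sigma1.shape[0]
    q1 = np.einsum('ij,jk,ik->i', x, np.linalg.inv(Sigma1), x)
    q2 = np.einsum('ij,jk,ik->i', x, np.linalg.inv(Sigma2), x)
    _, ld1 = np.linalg.slogdet(Sigma1)
    _, ld2 = np.linalg.slogdet(Sigma2)
    return 0.5 * (ld2 - ld1) + 0.5 * (p + 1) * (np.log1p(q2) - np.log1p(q1))

def kld_mc(Sigma1, Sigma2, n, rng):
    """Monte Carlo estimate of KLD(f1 || f2) = E_f1[log f1/f2]."""
    x = sample_mcd(Sigma1, n, rng)
    return log_density_ratio(x, Sigma1, Sigma2).mean()
```

As the abstract states, this sample-mean estimate converges to the closed-form value as the number of samples grows; the variance of the per-sample log-ratio governs how fast.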
KLD and KL distance computed for two random vectors following central MCDs with their respective pdfs.
Computation of the closed-form KLD and its Monte Carlo approximations.
| Parameter | Theoretical KLD | Approximation | Error | Approximation | Error | Approximation | Error |
|---|---|---|---|---|---|---|---|
| 0.1 | 0.0694 | 0.0694 | 9.1309 | 0.0694 | 9.1309 | 0.0694 | 9.1309 |
| 0.3 | 0.2291 | 0.2291 | 3.7747 | 0.2291 | 1.1102 | 0.2291 | 1.1102 |
| 0.5 | 0.4292 | 0.4292 | 2.6707 | 0.4292 | 1.2458 | 0.4292 | 6.6613 |
| 0.7 | 0.7022 | 0.7022 | 5.9260 | 0.7022 | 8.2678 | 0.7022 | 1.3911 |
| 0.9 | 1.1673 | 1.1634 | 0.0038 | 1.1665 | 7.2760 | 1.1671 | 1.6081 |
| 0.99 | 1.7043 | 1.5801 | 0.1241 | 1.6267 | 0.0776 | 1.6514 | 0.0529 |
Parameters used to compute the KLD for the central MCDs.
| | Parameters |
|---|---|
| First MCD | 1, 1, 1, 0.6, 0.2, 0.3 |
| Second MCD | 1, 1, 1, 0.3, 0.1, 0.4 |
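The two parameter rows above list six values each. A natural reading (an assumption here, not stated explicitly in the record) is the three unit diagonal entries followed by the three off-diagonal entries of symmetric 3×3 scatter matrices. A short NumPy sketch assembling and sanity-checking them:

```python
import numpy as np

def scatter_from_entries(diag, off):
    """Assemble a symmetric 3x3 scatter matrix from diagonal entries
    and off-diagonal entries off = (s12, s13, s23). The mapping of the
    listed values to matrix positions is an assumption."""
    S = np.diag(np.asarray(diag, dtype=float))
    S[0, 1] = S[1, 0] = off[0]
    S[0, 2] = S[2, 0] = off[1]
    S[1, 2] = S[2, 1] = off[2]
    return S

Sigma1 = scatter_from_entries([1, 1, 1], [0.6, 0.2, 0.3])
Sigma2 = scatter_from_entries([1, 1, 1], [0.3, 0.1, 0.4])
```

Both matrices are symmetric positive definite, as a valid MCD scatter matrix must be.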
Figure 1. Top row: bias (left) and MSE (right) of the difference between the approximated and theoretical symmetric KL for the MCD. Bottom row: box plot of the error; the mean error is the bias. Outliers are values larger than q3 + 1.5 × IQR or smaller than q1 − 1.5 × IQR, where q1, q3, and IQR are the 25th percentile, the 75th percentile, and the interquartile range, respectively.
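The outlier rule referenced in the Figure 1 caption is the standard box-plot convention (values outside the whiskers at 1.5 interquartile ranges beyond the quartiles). A small NumPy helper illustrating it, with a function name of our choosing:

```python
import numpy as np

def boxplot_outliers(err):
    """Flag outliers with the usual box-plot rule: values outside
    [q1 - 1.5*IQR, q3 + 1.5*IQR], where q1 and q3 are the 25th and
    75th percentiles and IQR = q3 - q1."""
    q1, q3 = np.percentile(err, [25, 75])
    iqr = q3 - q1
    return (err < q1 - 1.5 * iqr) | (err > q3 + 1.5 * iqr)
```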