Abstract
When a number of stochastic models are given in the form of probability distributions, one needs to integrate them. Mixtures of distributions are frequently used, but exponential mixtures also provide a good means of integration. This letter proposes a one-parameter family of integrations, called alpha-integration, which includes these well-known integrations as special cases. These are generalizations of various averages of numbers, such as the arithmetic, geometric, and harmonic averages. There are psychophysical experiments suggesting that alpha-integrations are used in the brain. The alpha-divergence between two distributions is defined; it is a natural generalization of the Kullback-Leibler divergence and the Hellinger distance, and it is proved that alpha-integration is optimal in the sense of minimizing alpha-divergence. The theory is applied to generalize the mixture of experts and the product of experts to the alpha-mixture of experts. The alpha-predictive distribution is also stated in the Bayesian framework.
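The averages mentioned in the abstract can be illustrated numerically. The sketch below is a minimal, hypothetical implementation of an alpha-mean of positive numbers, assuming Amari's usual parameterization f_alpha(u) = u^((1-alpha)/2) for alpha != 1 and f_1(u) = log(u) (the exact convention is not stated in the abstract); the alpha-mean is the inverse image of the weighted average of the transformed values. With this convention, alpha = -1 recovers the arithmetic mean, alpha = 1 the geometric mean, and alpha = 3 the harmonic mean.

```python
import math


def alpha_mean(values, alpha, weights=None):
    """Weighted alpha-mean of positive numbers.

    Assumes the representation f_alpha(u) = u**((1 - alpha) / 2) for
    alpha != 1 and f_1(u) = log(u); the alpha-mean is f_alpha^{-1}
    applied to the weighted average of the transformed values.
    """
    if weights is None:
        weights = [1.0 / len(values)] * len(values)
    if abs(alpha - 1.0) < 1e-12:
        # alpha = 1: log representation, i.e. the geometric mean
        return math.exp(sum(w * math.log(v) for w, v in zip(weights, values)))
    p = (1.0 - alpha) / 2.0
    return sum(w * v**p for w, v in zip(weights, values)) ** (1.0 / p)


# Special cases for the values 2 and 8 with equal weights:
print(alpha_mean([2.0, 8.0], alpha=-1.0))  # arithmetic mean: 5.0
print(alpha_mean([2.0, 8.0], alpha=1.0))   # geometric mean: 4.0
print(alpha_mean([2.0, 8.0], alpha=3.0))   # harmonic mean: 3.2
```

Applied pointwise to (unnormalized) probability densities rather than scalars, the same transform-average-invert scheme is the form of integration the letter studies.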
Year: 2007 PMID: 17716012 DOI: 10.1162/neco.2007.19.10.2780
Source DB: PubMed Journal: Neural Comput ISSN: 0899-7667 Impact factor: 2.026