Ariel Caticha, Roland Preuss.
Abstract
The problem of assigning probability distributions which reflect the prior information available about experiments is one of the major stumbling blocks in the use of Bayesian methods of data analysis. In this paper the method of maximum (relative) entropy (ME) is used to translate the information contained in the known form of the likelihood into a prior distribution for Bayesian inference. The argument is inspired and guided by intuition gained from the successful use of ME methods in statistical mechanics. For experiments that cannot be repeated the resulting "entropic prior" is formally identical with the Einstein fluctuation formula. For repeatable experiments, however, the expected value of the entropy of the likelihood turns out to be relevant information that must be included in the analysis. The important case of a Gaussian likelihood is treated in detail.
Year: 2004 PMID: 15600480 DOI: 10.1103/PhysRevE.70.046127
Source DB: PubMed Journal: Phys Rev E Stat Nonlin Soft Matter Phys ISSN: 1539-3755
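To make the abstract's idea concrete, the following is a minimal numerical sketch, not the paper's derivation: it assumes an entropic prior of the simplest form pi(sigma) proportional to exp(S(sigma)), where S is the differential entropy of a Gaussian likelihood with unknown standard deviation sigma. The grid bounds, the synthetic data, and the single-parameter setup are all illustrative assumptions; the paper's full treatment of repeatable experiments (using the expected entropy) is not reproduced here.

```python
import numpy as np

# Hypothesis grid for the unknown standard deviation sigma
# (bounds are an illustrative choice, not from the paper).
sigmas = np.linspace(0.1, 5.0, 500)
d_sigma = sigmas[1] - sigmas[0]

# Differential entropy of N(mu, sigma^2): S = 0.5 * ln(2*pi*e*sigma^2)
S = 0.5 * np.log(2.0 * np.pi * np.e * sigmas**2)

# Entropic prior pi(sigma) ~ exp(S(sigma)), normalized on the grid.
prior = np.exp(S)
prior /= prior.sum() * d_sigma

# Illustrative data drawn from a zero-mean Gaussian with scale 1.5.
rng = np.random.default_rng(0)
data = rng.normal(loc=0.0, scale=1.5, size=50)

# Gaussian log-likelihood of the data for each candidate sigma.
log_like = -0.5 * np.sum(data**2) / sigmas**2 - data.size * np.log(sigmas)

# Posterior = prior * likelihood, computed in log space for stability.
log_post = np.log(prior) + log_like
post = np.exp(log_post - log_post.max())
post /= post.sum() * d_sigma

sigma_map = sigmas[np.argmax(post)]  # posterior mode; lands near the true 1.5
```

The entropic prior here grows with sigma (broader likelihoods carry more entropy), which mildly favors less informative hypotheses before the data are seen; the data then pull the posterior mode toward the generating scale.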