| Literature DB >> 24586124 |
Wouter Boomsma, Jesper Ferkinghoff-Borg, Kresten Lindorff-Larsen.
Abstract
A key component of computational biology is to compare the results of computer modelling with experimental measurements. Despite substantial progress in the models and algorithms used in many areas of computational biology, such comparisons sometimes reveal that the computations are not in quantitative agreement with experimental data. The principle of maximum entropy is a general procedure for constructing probability distributions in the light of new data, making it a natural tool in cases when an initial model provides results that are at odds with experiments. The number of maximum entropy applications in our field has grown steadily in recent years, in areas as diverse as sequence analysis, structural modelling, and neurobiology. In this Perspectives article, we give a broad introduction to the method, in an attempt to encourage its further adoption. The general procedure is explained in the context of a simple example, after which we proceed with a real-world application in the field of molecular simulations, where the maximum entropy procedure has recently provided new insight. Given the limited accuracy of force fields, macromolecular simulations sometimes produce results that are not in complete quantitative agreement with experiments. A common solution to this problem is to explicitly ensure agreement between the two by perturbing the potential energy function towards the experimental data. So far, a general consensus for how such perturbations should be implemented has been lacking. Three very recent papers have explored this problem using the maximum entropy approach, providing both new theoretical and practical insights into the problem. We highlight each of these contributions in turn and conclude with a discussion on remaining challenges.
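The maximum entropy procedure mentioned in the abstract has a well-known closed form, sketched below in standard notation (the symbols here are generic and not taken from the paper itself): among all distributions reproducing an experimental average, the one closest to the prior ensemble is an exponential reweighting of that prior.

```latex
% Maximize the relative entropy of p with respect to the prior p_0,
%   S[p] = -\int p(x)\,\ln\!\frac{p(x)}{p_0(x)}\,dx,
% subject to normalization and the experimental constraint
%   \int p(x)\, f(x)\, dx = f_{\mathrm{exp}}.
% The solution introduces a single Lagrange multiplier \lambda:
p^{*}(x) = \frac{1}{Z(\lambda)}\, p_0(x)\, e^{-\lambda f(x)},
\qquad
Z(\lambda) = \int p_0(x)\, e^{-\lambda f(x)}\, dx ,
```

where \(\lambda\) is adjusted until the constraint is satisfied. In the molecular-simulation setting this amounts to adding a linear perturbation proportional to the observable, \(\lambda f(x)\) (in units of \(k_B T\)), to the potential energy function.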
Year: 2014 PMID: 24586124 PMCID: PMC3930489 DOI: 10.1371/journal.pcbi.1003406
Source DB: PubMed Journal: PLoS Comput Biol ISSN: 1553-734X Impact factor: 4.475
Figure 1. Jaynes' die problem: Maximum entropy probability distributions for a die, after observing the average outcome.
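Jaynes' die problem from Figure 1 can be solved in a few lines: find the distribution over the six faces that maximizes entropy while reproducing an observed average outcome. The solution has the form p_i ∝ exp(λ·i), with the Lagrange multiplier λ found by root-finding. The target mean of 4.5 below is Jaynes' classic illustrative value, not a number taken from the figure.

```python
# Minimal sketch of Jaynes' die problem: the maximum entropy distribution
# over faces 1..6 constrained to a given average outcome.
import math

def maxent_die(target_mean, faces=range(1, 7), tol=1e-12):
    """Solve p_i proportional to exp(lam * i) so that the mean equals
    target_mean, using bisection on the Lagrange multiplier lam."""
    def mean_for(lam):
        w = [math.exp(lam * f) for f in faces]
        z = sum(w)
        return sum(f * wi for f, wi in zip(faces, w)) / z

    lo, hi = -50.0, 50.0  # mean_for(lam) is monotonically increasing in lam
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(lam * f) for f in faces]
    z = sum(w)
    return [wi / z for wi in w]

probs = maxent_die(4.5)
print([round(p, 4) for p in probs])  # probabilities biased towards high faces
```

For a target mean of 3.5 the multiplier vanishes and the uniform distribution is recovered, which is the sanity check Jaynes' example is built around.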
Figure 2. The effect of different methods for incorporating experimental data on a simple example consisting of a mixture of two bivariate normal distributions.
In this example, we only have experimental data regarding the y-dimension of the distribution (target value indicated by the dotted line). The top row contains the unperturbed and maximum entropy distributions. The matrix shows various combinations of the force constant and the number of replicas when enforcing the restraint through a harmonic potential. In these calculations, a single replica corresponds to the standard method for structure calculation, and multiple replicas correspond to ensemble refinement. In each plot we also show the mean in the y-direction and the entropy of the distribution.
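The harmonic restraint explored in the matrix of Figure 2 can be sketched as a penalty on the replica-averaged observable. The names below (k, y_exp) are illustrative, not the paper's notation: with one replica the penalty forces every structure towards the experimental value (standard structure calculation), while with several replicas only the ensemble average is restrained (ensemble refinement).

```python
# Hedged sketch of a replica-averaged harmonic restraint on an
# observable y, as discussed for Figure 2. Variable names are
# illustrative assumptions, not the paper's own notation.
def restraint_energy(y_replicas, y_exp, k):
    """Harmonic penalty on the replica-averaged observable:
    E = (k / 2) * (mean(y_replicas) - y_exp) ** 2."""
    y_avg = sum(y_replicas) / len(y_replicas)
    return 0.5 * k * (y_avg - y_exp) ** 2

# One replica: the structure itself must match the target value.
print(restraint_energy([1.0], y_exp=0.0, k=10.0))  # 5.0
# Four replicas whose average hits the target: zero penalty,
# even though each individual replica deviates from y_exp.
print(restraint_energy([1.0, -1.0, 2.0, -2.0], y_exp=0.0, k=10.0))  # 0.0
```

This illustrates why the number of replicas matters in the figure's matrix: the same force constant produces a much broader (higher-entropy) ensemble when only the average, rather than each member, is restrained.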