Michael Pargett, David M. Umulis.
Abstract
Mathematical modeling of transcription factor and signaling networks is widely used to understand if and how a mechanism works, and to infer regulatory interactions that produce a model consistent with observed data. Both of these modeling approaches are informed by experimental data; however, much of the data available, or even acquirable, are not quantitative. Data that are not strictly quantitative cannot be used by classical, quantitative, model-based analyses that measure the difference between a measured observation and the model prediction for that observation. To bridge this model-to-data gap, a variety of techniques have been developed to measure model "fitness" and provide numerical values that can subsequently be used in model optimization or model inference studies. Here, we discuss a selection of traditional and novel techniques for transforming data of varied quality to enable quantitative comparison with mathematical models. This review is intended both to inform the use of these model analysis methods, with a focus on parameter estimation, and to help guide the choice of method for a given study based on the type of data available. Applying techniques such as normalization or optimal scaling may significantly improve the utility of current biological data in model-based studies and allow greater integration between disparate types of data.
Keywords: Data integration; Inference; Mathematical modeling; Normalization; Optimization
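The "optimal scaling" idea mentioned in the abstract can be illustrated concretely: when data are only relative (e.g. fluorescence intensities in arbitrary units), the model prediction can be rescaled by the least-squares-optimal factor before computing an error. The following is a minimal sketch of that idea; the function names and toy data are illustrative assumptions, not the authors' implementation.

```python
def optimal_scale(model, data):
    """Least-squares-optimal scale factor s minimizing sum((s*m - d)^2).

    Closed form: s = (model . data) / (model . model).
    """
    num = sum(m * d for m, d in zip(model, data))
    den = sum(m * m for m in model)
    return num / den

def scaled_sse(model, data):
    """Sum of squared errors after optimally rescaling the model prediction."""
    s = optimal_scale(model, data)
    return sum((s * m - d) ** 2 for m, d in zip(model, data))

# Toy example: data in arbitrary units, here exactly 2x the model prediction.
model = [1.0, 2.0, 3.0]   # model prediction in absolute units
data = [2.0, 4.0, 6.0]    # measurement in arbitrary (relative) units

s = optimal_scale(model, data)   # -> 2.0
err = scaled_sse(model, data)    # -> 0.0 (perfect fit after rescaling)
```

The resulting `scaled_sse` can serve as the fitness value in a parameter-estimation loop, so that only the shape of the data, not its absolute magnitude, constrains the model.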
Year: 2013 PMID: 23557990 DOI: 10.1016/j.ymeth.2013.03.024
Source DB: PubMed Journal: Methods ISSN: 1046-2023 Impact factor: 3.608