| Literature DB >> 33265528 |
Abstract
Kullback-Leibler divergence (KLD) is a type of extended mutual entropy, which is used as a measure of information gain when transferring from a prior distribution to a posterior distribution. In this study, KLD is applied to the thermodynamic analysis of cell signal transduction cascades and serves as an alternative to mutual entropy. When KLD is minimized, the divergence is given by the ratio of the prior selection probability of the signaling molecule to the posterior selection probability. Moreover, the information gain over the entire channel is shown to be adequately described by the average KLD production rate. Thus, this approach provides a framework for the quantitative analysis of signal transduction. Moreover, the proposed approach can identify an effective cascade within a signaling network.
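The information gain described in the abstract is the standard discrete KLD between posterior and prior selection probabilities. As a minimal illustrative sketch (the distributions below are hypothetical, not taken from the paper):

```python
import math

def kl_divergence(p, q):
    """Discrete Kullback-Leibler divergence D(p || q) in nats.

    p, q: sequences of probabilities over the same outcomes (each summing to 1).
    Terms with p_i = 0 contribute zero by convention.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical selection probabilities for three signaling molecules
prior = [0.5, 0.3, 0.2]      # before signal transfer
posterior = [0.7, 0.2, 0.1]  # after signal transfer

gain = kl_divergence(posterior, prior)  # information gain in nats
```

Dividing such a gain by the duration of a cascade step would give a per-step production rate analogous to the average KLD production rate discussed in the abstract.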
Keywords: average entropy production rate; fluctuation theorem; signal transduction
Year: 2018 PMID: 33265528 PMCID: PMC7512958 DOI: 10.3390/e20060438
Source DB: PubMed Journal: Entropy (Basel) ISSN: 1099-4300 Impact factor: 2.524
Figure 1: A common time course of the j-th step for both prior and posterior cascades, indicating concentration X_j*. The suffix 0 is omitted. The vertical axis denotes the concentration of the signaling active molecule. τ_j and τ_{−j} represent the duration of the j-th step and the reverse −j-th step, respectively. The horizontal line X_j = X_j* denotes the concentration of X_j at the steady state [35]. The "//" symbol on the horizontal axis indicates −τ_{−j} or |τ_{−j}|.
Figure 2: Theoretical framework of the current study.