
A measure of the information content of EIT data.

Andy Adler, Richard Youmaran, William R B Lionheart.

Abstract

We ask: how many bits of information (in the Shannon sense) do we get from a set of EIT measurements? Here, the term information in measurements (IM) is defined as: the decrease in uncertainty about the contents of a medium, due to a set of measurements. This decrease in uncertainty is quantified by the change from the inter-class model, q, defined by the prior information, to the intra-class model, p, given by the measured data (corrupted by noise). IM is measured by the expected relative entropy (Kullback-Leibler divergence) between distributions q and p, and corresponds to the channel capacity in an analogous communications system. Based on a Gaussian model of the measurement noise, Σ_n, and a prior model of the image element covariances, Σ_x, we calculate IM = ½ Σ_i log₂([SNR]_i + 1), where [SNR]_i is the signal-to-noise ratio for each independent measurement calculated from the prior and noise models. For an example, we consider saline tank measurements from a 16-electrode EIT system, with a 2 cm radius non-conductive target, and calculate IM = 179 bits. Temporal sequences of frames are considered, and formulae for IM as a function of temporal image element correlations are derived. We suggest that this measure may allow novel insights into questions such as distinguishability limits, optimal measurement schemes and data fusion.
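The central formula in the abstract, IM = ½ Σ_i log₂([SNR]_i + 1), can be sketched numerically. The following is a minimal illustration, not the authors' implementation; the SNR values are hypothetical placeholders, since the paper derives the true [SNR]_i from the prior and noise covariance models Σ_x and Σ_n.

```python
import math

def information_in_measurements(snr_values):
    """Total information (bits) from independent measurements,
    using IM = 1/2 * sum_i log2(SNR_i + 1) as given in the abstract."""
    return 0.5 * sum(math.log2(s + 1.0) for s in snr_values)

# Hypothetical per-measurement SNR values (for illustration only).
snr = [15.0, 3.0, 0.5]
im = information_in_measurements(snr)
# 0.5 * (log2(16) + log2(4) + log2(1.5)) ≈ 3.29 bits
```

Each term is the Shannon capacity of one Gaussian channel, so the sum behaves as expected: a measurement with SNR = 0 contributes nothing, and doubling (SNR + 1) for any measurement adds half a bit.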

Year:  2008        PMID: 18544803     DOI: 10.1088/0967-3334/29/6/S09

Source DB:  PubMed          Journal:  Physiol Meas        ISSN: 0967-3334            Impact factor:   2.833


Citing reviews:  1 in total

Review 1.  Robust imaging using electrical impedance tomography: review of current tools.

Authors:  Benoit Brazey; Yassine Haddab; Nabil Zemiti
Journal:  Proc Math Phys Eng Sci       Date:  2022-02-02       Impact factor: 2.704

