Electrical Impedance Tomography (EIT) calculates internal conductivity from surface measurements; image reconstruction is most commonly formulated as an inverse problem using regularization techniques. Regularization adds "prior information" to address the ill-conditioning of the solution. This paper presents a novel approach to understand and quantify this information. We ask: how many bits of information (in the Shannon sense) do we get from an EIT data frame? We define the term information in measurements (IM) as the decrease in uncertainty about the contents of a medium due to a set of measurements. Before the measurement, we know the prior information (inter-class model, q). The measured data tell us about the medium (which, corrupted by noise, gives the intra-class model, p). The measurement information is given by the relative entropy (or Kullback-Leibler divergence). Based on this expression, and given a noise covariance Σn and a prior model of the element covariances Σx, IM = ½ log₂ |J Σx Jᵀ Σn⁻¹ + I|. Under the simplification that measurement and noise covariances are uncorrelated, IM may be approximated as a function of the signal-to-noise ratio and the Jacobian and prior matrices. For an example 16-electrode EIT system, IM was calculated to be 245.1 bits. Finally, several applications of an information measure for EIT are given.
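The IM formula above can be sketched numerically. The snippet below is a minimal illustration, not the paper's model: the Jacobian J and the covariances Σx, Σn are small random/identity stand-ins (the paper's 16-electrode system and its 245.1-bit result are not reproduced here).

```python
import numpy as np

# Illustrative sketch of IM = 1/2 * log2 |J Sigma_x J^T Sigma_n^-1 + I|.
# J, Sigma_x, Sigma_n are hypothetical stand-ins, not the paper's model.
rng = np.random.default_rng(0)
n_meas, n_elem = 8, 20                       # hypothetical dimensions
J = rng.standard_normal((n_meas, n_elem))    # sensitivity (Jacobian) matrix

Sigma_x = np.eye(n_elem)                     # prior element covariance
Sigma_n = 0.01 * np.eye(n_meas)              # measurement noise covariance

M = J @ Sigma_x @ J.T @ np.linalg.inv(Sigma_n) + np.eye(n_meas)
# slogdet is numerically safer than det() for log-determinants
sign, logdet = np.linalg.slogdet(M)
IM = 0.5 * logdet / np.log(2)                # natural log -> bits
print(f"IM = {IM:.1f} bits")
```

Because M is the identity plus a positive semi-definite term, its determinant is at least 1, so IM is always non-negative: measurements cannot remove information.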

Additional Metadata
Keywords Electrical impedance tomography, Kullback Leibler divergence, Measurement information
Conference 13th International Conference on Electrical Bioimpedance and the 8th Conference on Electrical Impedance Tomography 2007, ICEBI 2007
Citation
Adler, A., & Lionheart, W. R. B. (2007). Information content of EIT measurements. Presented at the 13th International Conference on Electrical Bioimpedance and the 8th Conference on Electrical Impedance Tomography 2007, ICEBI 2007.