Entropy Production and Information Flow for Markov Diffusions with Filtering


Abstract

Filtering theory gives explicit models for the flow of information and thereby quantifies the rates at which information is supplied to and dissipated from the filter's memory. Here we extend the analysis of Mitter and Newton from linear Gaussian models to general nonlinear filters involving Markov diffusions. The rates of entropy production are now generally the average squared-field (co-metric) of various logarithmic probability densities, which may be interpreted as Fisher information associated with Gaussian perturbations (via de Bruijn's identity). We show that the central connection is made through the Mayer-Wolf and Zakai theorem for the rate of change of the mutual information between the filtered state and the observation history. In particular, we extend this theorem to cover a Markov diffusion controlled by the observation process, which may be interpreted as the filter acting as a Maxwell's daemon that applies feedback to the system.
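For orientation, the two quantities named in the abstract have standard forms; the following sketch uses generic notation (\(\mathcal{A}\) for the diffusion generator, \(a = \sigma\sigma^{\top}\) for the co-metric, \(p\) for a density) and is not an excerpt from the paper.

% Squared-field (carré du champ) operator of a diffusion generator A;
% applied to a log-density it yields the Fisher-information-like
% integrands behind the entropy-production rates mentioned above:
\[
  \Gamma(f,g) \;=\; \tfrac{1}{2}\bigl(\mathcal{A}(fg) - f\,\mathcal{A}g - g\,\mathcal{A}f\bigr)
  \;=\; \tfrac{1}{2}\,\nabla f \cdot a\,\nabla g .
\]
% de Bruijn's identity: perturbing X by an independent standard Gaussian Z,
% the differential entropy h grows at half the Fisher information J:
\[
  \frac{\mathrm{d}}{\mathrm{d}\sigma}\, h\bigl(X + \sqrt{\sigma}\,Z\bigr)
  \;=\; \tfrac{1}{2}\, J\bigl(X + \sqrt{\sigma}\,Z\bigr),
  \qquad
  J(V) \;=\; \mathbb{E}\bigl[\,\lvert \nabla \log p_V(V)\rvert^{2}\,\bigr].
\]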
