Ternary Representation of Stochastic Change and the Origin of Entropy and Its Fluctuations


Abstract

A change in a stochastic system has three representations: probabilistic, statistical, and informational. (i) The probabilistic representation is based on a change of random variable $u(\omega)\to\tilde{u}(\omega)$; this induces (ii) a change in the probability distributions $F_u(x)\to F_{\tilde{u}}(x)$, $x\in\mathbb{R}^n$; and (iii) a change in the probability measure $\mathbb{P}\to\tilde{\mathbb{P}}$ under the same observable $u(\omega)$. In the informational representation a change is quantified by the Radon-Nikodym derivative $\ln\left(\frac{d\tilde{\mathbb{P}}}{d\mathbb{P}}(\omega)\right)=-\ln\left(\frac{dF_u}{dF_{\tilde{u}}}(x)\right)$ when $x=u(\omega)$. Substituting a random variable into its own density function creates a fluctuating entropy whose expectation is the entropy given by Shannon. The informational representation of a deterministic transformation on $\mathbb{R}^n$ reveals entropic and energetic terms, the configurational entropy of Boltzmann and Gibbs, and the potential of mean force of Kirkwood. Mutual information arises for correlated $u(\omega)$ and $\tilde{u}(\omega)$, and a nonequilibrium thermodynamic entropy balance equation is identified.
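As a minimal worked example of the fluctuating entropy (an illustration added here, not taken from the paper): assume a scalar Gaussian observable $u(\omega)\sim\mathcal{N}(\mu,\sigma^2)$ with density $f_u(x)=(2\pi\sigma^2)^{-1/2}e^{-(x-\mu)^2/(2\sigma^2)}$. Substituting $u(\omega)$ into its own log-density gives the fluctuating entropy
\[
-\ln f_u\big(u(\omega)\big)=\tfrac{1}{2}\ln\!\left(2\pi\sigma^2\right)+\frac{\big(u(\omega)-\mu\big)^2}{2\sigma^2},
\]
a random quantity whose expectation, $\tfrac{1}{2}\ln\!\left(2\pi e\,\sigma^2\right)$, is the Shannon differential entropy of $u$.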
