The three faces of entropy for complex systems -- information, thermodynamics and the maxent principle


Abstract

There are three ways to conceptualize entropy: entropy as an extensive thermodynamic quantity of physical systems (Clausius, Boltzmann, Gibbs), entropy as a measure of the information production of ergodic sources (Shannon), and entropy as a means for statistical inference on multinomial Bernoulli processes (Jaynes' maximum entropy principle). Even though these notions represent fundamentally different concepts, the functional form of the entropy for thermodynamic systems in equilibrium, for ergodic sources in information theory, and for independent sampling processes in statistical systems is degenerate: $H(p)=-\sum_i p_i\log p_i$. For many complex systems, which are typically history-dependent, non-ergodic, and non-multinomial, this is no longer the case. Here we show that for such processes the three entropy concepts lead to different functional forms of entropy. We explicitly compute these entropy functionals for three concrete examples. For Polya urn processes, which are simple self-reinforcing processes, the source information rate is $S_{\rm IT}=\frac{1}{1-c}\frac{1}{N}\log N$, the thermodynamic (extensive) entropy is the $(c,d)$-entropy, $S_{\rm EXT}=S_{(c,0)}$, and the entropy in the maximum entropy principle (MEP) is $S_{\rm MEP}(p)=-\sum_i \log p_i$. For sample space reducing (SSR) processes, which are simple path-dependent processes associated with power-law statistics, the information rate is $S_{\rm IT}=1+\frac{1}{2}\log W$, the extensive entropy is $S_{\rm EXT}=H(p)$, and the maxent result is $S_{\rm MEP}(p)=H(p/p_1)+H(1-p/p_1)$. Finally, for multinomial mixture processes, the information rate is given by the conditional entropy $\langle H\rangle_f$ with respect to the mixing kernel $f$, the extensive entropy is given by $H$, and the MEP functional corresponds one-to-one to the logarithm of the mixing kernel.
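As a rough illustration of the two path-dependent processes named in the abstract, the following Python sketch simulates a simple Polya urn and an SSR process. The function names, the unit reinforcement, and the uniform starting state are illustrative assumptions, not the exact definitions used in the paper; the sketch only shows the self-reinforcing and sample-space-reducing dynamics, with the SSR visit statistics approaching a Zipf distribution, $p(i)\propto 1/i$.

```python
import numpy as np

rng = np.random.default_rng(0)

def polya_urn(n_draws, n_colors=2, reinforcement=1):
    """Simple Polya urn: draw a ball with probability proportional to the
    current counts, then add `reinforcement` extra balls of the drawn color
    (self-reinforcing, history-dependent)."""
    counts = np.ones(n_colors)                 # start with one ball per color
    history = []
    for _ in range(n_draws):
        color = rng.choice(n_colors, p=counts / counts.sum())
        counts[color] += reinforcement          # reinforce the drawn color
        history.append(color)
    return counts, history

def ssr_sequence(W):
    """Sample space reducing process: start at a uniformly chosen state in
    {1,...,W} and repeatedly jump to a uniformly chosen strictly lower state
    until state 1 is reached (the sample space shrinks at every step)."""
    path = []
    state = rng.integers(1, W + 1)
    while state > 1:
        path.append(state)
        state = rng.integers(1, state)          # only lower states remain accessible
    path.append(1)
    return path

# Aggregated visit frequencies of many SSR runs approximate Zipf's law, p(i) ~ 1/i.
W = 1000
visits = np.zeros(W)
for _ in range(20000):
    for s in ssr_sequence(W):
        visits[s - 1] += 1
print(visits[:5] / visits.sum())                # roughly proportional to 1, 1/2, 1/3, ...
```

This is a minimal sketch for intuition only; the entropy expressions quoted above ($S_{\rm IT}$, $S_{\rm EXT}$, $S_{\rm MEP}$) are derived analytically in the paper and are not computed here.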
