There are three ways to conceptualize entropy: entropy as an extensive thermodynamic quantity of physical systems (Clausius, Boltzmann, Gibbs), entropy as a measure of information production of ergodic sources (Shannon), and entropy as a means for statistical inference on multinomial Bernoulli processes (Jaynes' maximum entropy principle). Even though these notions are fundamentally different concepts, the functional form of the entropy for thermodynamic systems in equilibrium, for ergodic sources in information theory, and for independent sampling processes in statistical systems is degenerate, $H(p)=-\sum_i p_i\log p_i$. For many complex systems, which are typically history-dependent, non-ergodic and non-multinomial, this is no longer the case. Here we show that for such processes the three entropy concepts lead to different functional forms of entropy. We explicitly compute these entropy functionals for three concrete examples. For Polya urn processes, which are simple self-reinforcing processes, the source information rate is $S_{\rm IT}=\frac{1}{1-c}\frac{1}{N}\log N$, the thermodynamic (extensive) entropy is the $(c,d)$-entropy, $S_{\rm EXT}=S_{(c,0)}$, and the entropy in the maximum entropy principle (MEP) is $S_{\rm MEP}(p)=-\sum_i \log p_i$. For sample space reducing (SSR) processes, which are simple path-dependent processes associated with power-law statistics, the information rate is $S_{\rm IT}=1+\frac12 \log W$, the extensive entropy is $S_{\rm EXT}=H(p)$, and the maxent result is $S_{\rm MEP}(p)=H(p/p_1)+H(1-p/p_1)$. Finally, for multinomial mixture processes, the information rate is given by the conditional entropy $\langle H\rangle_f$ with respect to the mixing kernel $f$, the extensive entropy is given by $H$, and the MEP functional corresponds one-to-one to the logarithm of the mixing kernel.
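The two history-dependent processes named in this abstract are straightforward to simulate. Below is a minimal sketch, assuming the standard definitions: a Polya urn in which each drawn ball is returned together with extra copies of the same color (self-reinforcement), and an SSR process that jumps from state $x$ uniformly to a state in $\{1,\dots,x-1\}$ and restarts at $W$ upon reaching 1, whose visit statistics follow Zipf's law $p(i)\propto 1/i$. Function names and parameters are illustrative, not taken from the paper.

```python
import random
from collections import Counter

def polya_urn(steps, colors=2, reinforce=1, seed=0):
    """Simulate a Polya urn: draw a ball uniformly, return it to the urn
    together with `reinforce` additional copies of the same color."""
    rng = random.Random(seed)
    urn = list(range(colors))  # start with one ball of each color
    draws = []
    for _ in range(steps):
        ball = rng.choice(urn)
        urn.extend([ball] * reinforce)  # self-reinforcement step
        draws.append(ball)
    return draws

def ssr_process(W, restarts, seed=0):
    """Sample-space-reducing process: from state x > 1, jump uniformly to a
    state in {1, ..., x-1}; on hitting state 1, restart at W. Returns the
    visit counts, which approach a Zipf distribution p(i) ~ 1/i."""
    rng = random.Random(seed)
    visits = Counter()
    for _ in range(restarts):
        x = W
        while x > 1:
            visits[x] += 1
            x = rng.randint(1, x - 1)  # sample space shrinks at every step
        visits[1] += 1
    return visits
```

For instance, `ssr_process(100, 10000)` should visit state 2 roughly twice as often as state 4, reflecting the $1/i$ statistics that underlie the power laws discussed above.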
We study the nonextensive thermodynamics of open systems. On the basis of the maximum entropy principle, the dual power-law q-distribution functions are re-derived by using the dual particle number definitions and assuming that the chemical potential
We present an experiment in which a one-bit memory is constructed, using a system of a single colloidal particle trapped in a modulated double-well potential. We measure the amount of heat dissipated to erase a bit and we establish that in the limit
The selection of an equilibrium state by maximising the entropy of a system, subject to certain constraints, is often powerfully motivated as an exercise in logical inference, a procedure where conclusions are reached on the basis of incomplete information
We investigate the nonequilibrium stationary states of systems consisting of chemical reactions among molecules of several chemical species. To this end we introduce and develop a stochastic formulation of nonequilibrium thermodynamics of chemical reactions
We apply the Principle of Maximum Entropy to the study of a general class of deterministic fractal sets. The scaling laws peculiar to these objects are accounted for by means of a constraint concerning the average content of information in those patterns