There are three ways to conceptualize entropy: entropy as an extensive thermodynamic quantity of physical systems (Clausius, Boltzmann, Gibbs), entropy as a measure of information production of ergodic sources (Shannon), and entropy as a means of statistical inference on multinomial Bernoulli processes (Jaynes' maximum entropy principle). Even though these notions are fundamentally different concepts, the functional form of the entropy for thermodynamic systems in equilibrium, for ergodic sources in information theory, and for independent sampling processes in statistical systems is degenerate, $H(p)=-\sum_i p_i\log p_i$. For many complex systems, which are typically history-dependent, non-ergodic and non-multinomial, this is no longer the case. Here we show that for such processes the three entropy concepts lead to different functional forms of entropy. We explicitly compute these entropy functionals for three concrete examples. For Polya urn processes, which are simple self-reinforcing processes, the source information rate is $S_{\rm IT}=\frac{1}{1-c}\frac{1}{N}\log N$, the thermodynamic (extensive) entropy is the $(c,d)$-entropy, $S_{\rm EXT}=S_{(c,0)}$, and the entropy in the maximum entropy principle (MEP) is $S_{\rm MEP}(p)=-\sum_i \log p_i$. For sample space reducing (SSR) processes, which are simple path-dependent processes associated with power-law statistics, the information rate is $S_{\rm IT}=1+\frac12 \log W$, the extensive entropy is $S_{\rm EXT}=H(p)$, and the maxent result is $S_{\rm MEP}(p)=H(p/p_1)+H(1-p/p_1)$. Finally, for multinomial mixture processes, the information rate is given by the conditional entropy $\langle H\rangle_f$ with respect to the mixing kernel $f$, the extensive entropy is given by $H$, and the MEP functional corresponds one-to-one to the logarithm of the mixing kernel.
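The two history-dependent processes discussed above are easy to simulate. The sketch below (with illustrative parameters, not taken from the paper) implements a two-colour Polya urn with self-reinforcement and an SSR process, whose visit statistics are known to follow Zipf's law, $p_i \propto 1/i$:

```python
import random
from collections import Counter

def polya_urn(steps, reinforcement=1, seed=0):
    """Self-reinforcing Polya urn: draw a ball at random, return it
    plus `reinforcement` extra balls of the same colour."""
    rng = random.Random(seed)
    urn = {"red": 1, "blue": 1}
    for _ in range(steps):
        total = sum(urn.values())
        colour = "red" if rng.random() < urn["red"] / total else "blue"
        urn[colour] += reinforcement
    return urn

def ssr_visits(W, runs, seed=0):
    """Sample space reducing process: start at state W and jump
    uniformly to a strictly lower state until state 1 is reached.
    Aggregated visit counts follow Zipf's law, p_i ~ 1/i."""
    rng = random.Random(seed)
    counts = Counter()
    for _ in range(runs):
        x = W
        while x > 1:
            counts[x] += 1
            x = rng.randint(1, x - 1)  # uniform jump below x
        counts[1] += 1  # the ground state is always reached
    return counts
```

With these settings, state 1 is visited in every run and state 2 in roughly half of them, so `counts[1] / counts[2]` is close to 2, the Zipf signature.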
We study nonextensive thermodynamics for open systems. On the basis of the maximum entropy principle, the dual power-law q-distribution functions are re-derived by using the dual particle-number definitions and assuming that the chemical potential is constant in the two sets of parallel formalisms, in which the fundamental thermodynamic equations with dual interpretations of thermodynamic quantities are derived for open systems. By introducing parallel structures of Legendre transformations, other thermodynamic equations with dual interpretations of the quantities are also deduced for open systems, and several dual thermodynamic relations are inferred. There are correlations between the dual relations, from which an equivalence rule follows: the Tsallis factor is invariant when taking partial derivatives at constant volume or constant entropy. Using this rule, further correlations can be found, and the statistical expressions for the Lagrange internal energy and pressure are easily obtained.
We present an experiment in which a one-bit memory is constructed from a single colloidal particle trapped in a modulated double-well potential. We measure the amount of heat dissipated to erase a bit and establish that, in the limit of long erasure cycles, the mean dissipated heat saturates at the Landauer bound, i.e. the minimal quantity of heat necessarily produced to delete a classical bit of information. This result demonstrates the intimate link between information theory and thermodynamics. To stress this connection we also show that a detailed Jarzynski equality is verified, retrieving the Landauer bound independently of the work done on the system. The experimental details are presented and the experimental errors carefully discussed.
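For orientation, the Landauer bound quoted above is straightforward to evaluate numerically; the room-temperature value computed below uses T = 300 K as an assumed reference temperature, not the experiment's actual operating point:

```python
import math

# Boltzmann constant in J/K (exact value in the SI)
K_B = 1.380649e-23

def landauer_bound(temperature_kelvin):
    """Minimum heat, in joules, that must be dissipated to erase
    one classical bit at the given temperature: k_B * T * ln(2)."""
    return K_B * temperature_kelvin * math.log(2)
```

At 300 K the bound is about 2.87e-21 J, i.e. roughly 0.018 eV per erased bit, which indicates the sensitivity required of such a calorimetric measurement.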
The selection of an equilibrium state by maximising the entropy of a system, subject to certain constraints, is often powerfully motivated as an exercise in logical inference, a procedure where conclusions are reached on the basis of incomplete information. But such a framework can be more compelling if it is underpinned by dynamical arguments, and we show how this can be provided by stochastic thermodynamics, where an explicit link is made between the production of entropy and the stochastic dynamics of a system coupled to an environment. The separation of entropy production into three components allows us to select a stationary state by maximising the change, averaged over all realisations of the motion, in the principal relaxational or nonadiabatic component, equivalent to requiring that this contribution to the entropy production should become time independent for all realisations. We show that this recovers the usual equilibrium probability density function (pdf) for a conservative system in an isothermal environment, as well as the stationary nonequilibrium pdf for a particle confined to a potential under nonisothermal conditions, and a particle subject to a constant nonconservative force under isothermal conditions. The two remaining components of entropy production account for a recently discussed thermodynamic anomaly between over- and underdamped treatments of the dynamics in the nonisothermal stationary state.
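The constrained maximisation described above can be made concrete for a discrete system: fixing the mean energy and maximising the Shannon entropy yields the Gibbs pdf $p_i \propto e^{-\beta E_i}$, with $\beta$ fixed by the constraint. A minimal sketch follows, in which the four-level spectrum and the target mean energy are illustrative assumptions:

```python
import math

def gibbs(energies, beta):
    """Gibbs pdf p_i ~ exp(-beta * E_i), normalised."""
    weights = [math.exp(-beta * e) for e in energies]
    z = sum(weights)
    return [w / z for w in weights]

def mean_energy(p, energies):
    return sum(pi * e for pi, e in zip(p, energies))

def entropy(p):
    """Shannon entropy, natural log."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def maxent_beta(energies, target, lo=-50.0, hi=50.0):
    """Find beta by bisection: the mean energy of the Gibbs pdf
    decreases monotonically in beta, so the root is bracketed."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mean_energy(gibbs(energies, mid), energies) > target:
            lo = mid  # beta too small, distribution too hot
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Any other pdf satisfying the same mean-energy constraint has strictly lower entropy, which is exactly the selection criterion the maximum entropy principle encodes.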
We investigate the nonequilibrium stationary states of systems consisting of chemical reactions among molecules of several chemical species. To this end we introduce and develop a stochastic formulation of the nonequilibrium thermodynamics of chemical reaction systems, based on a master equation defined on the space of microscopic chemical states and on appropriate definitions of entropy and entropy production. The system is in contact with a heat reservoir and is placed out of equilibrium by contact with particle reservoirs. In our approach, fluxes of various types, such as the heat and particle fluxes, play a fundamental role in characterizing the nonequilibrium chemical state. We show that the rate of entropy production in the stationary nonequilibrium state is a bilinear form in the affinities and the fluxes of reaction, which are expressed in terms of rate constants and transition rates, respectively. We also show how the description in terms of microscopic states can be reduced to a description in terms of the numbers of particles of each species, from which the chemical master equation follows. As an example, we calculate the rate of entropy production for the first and second Schlögl reaction models.
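The bilinear flux–affinity structure of the stationary entropy production can be illustrated on the simplest driven system, a three-state Markov ring with biased hopping rates; the rates below are arbitrary illustrative choices, not the Schlögl models treated in the paper:

```python
import math

def stationary(rates, dt=0.01, steps=20000):
    """Relax the master equation dp_i/dt = sum_j (p_j w_ji - p_i w_ij)
    to its stationary state by explicit Euler iteration."""
    n = len(rates)
    p = [1.0 / n] * n
    for _ in range(steps):
        dp = [sum(p[j] * rates[j][i] - p[i] * rates[i][j]
                  for j in range(n)) for i in range(n)]
        p = [p[i] + dt * dp[i] for i in range(n)]
    return p

def entropy_production(rates, p):
    """Stationary entropy production rate (units of k_B per time):
    a bilinear form, sum over edges of (flux) * (conjugate affinity)."""
    n = len(rates)
    sigma = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            jp = p[i] * rates[i][j]  # forward probability flux
            jm = p[j] * rates[j][i]  # backward probability flux
            if jp > 0 and jm > 0:
                sigma += (jp - jm) * math.log(jp / jm)
    return sigma
```

For a symmetric ring with forward rate 2 and backward rate 1 on every edge, the stationary pdf is uniform, each edge carries a net current, and the entropy production rate is exactly ln 2 per unit time, even though the state probabilities themselves are time independent.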
We apply the Principle of Maximum Entropy to the study of a general class of deterministic fractal sets. The scaling laws peculiar to these objects are accounted for by means of a constraint on the average information content of the patterns. This constraint allows for a new statistical characterization of fractal objects and of the fractal dimension.