The Shannon entropy, one of the cornerstones of information theory, is widely used in physics, particularly in statistical mechanics. Yet its characterization and connection to physics remain vague, leaving ample room for misconceptions and misunderstandings. We show that the Shannon entropy can be fully understood as measuring the variability of the elements within a given distribution: it characterizes how much variation can be found within a collection of objects. We show that it is the only such indicator that is continuous and linear, that it quantifies the number of yes/no questions (i.e. bits) needed to identify an element within the distribution, and that applying this concept to statistical mechanics in different ways leads to the Boltzmann, Gibbs and von Neumann entropies.
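The yes/no-question interpretation above can be illustrated with a minimal sketch (not part of the paper): for a uniform distribution over $2^n$ outcomes, the Shannon entropy in bits equals $n$, the number of binary questions needed to single out one element.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: the average number of yes/no
    questions needed to identify an element drawn from probs."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin needs exactly one yes/no question:
print(shannon_entropy([0.5, 0.5]))   # 1.0
# Four equally likely outcomes need two questions:
print(shannon_entropy([0.25] * 4))   # 2.0
```

A biased distribution has lower entropy than the uniform one on the same outcomes, reflecting that less variation is present in the collection.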
We present a study of the equilibration of nonequilibrium systems by means of molecular dynamics simulations. Nonequilibrium conditions are created by assigning random velocity components to the constituent atoms. The Shannon entropy, calculated from the probability distribution of kinetic energy among the atoms at successive instants during equilibration, oscillates as the system relaxes towards equilibrium. Fourier transforms of these oscillating Shannon entropies reveal the existence of the Debye frequency of the system in question. From these studies we conclude that the signature of equilibration in a dynamical system is the time invariance of its Shannon entropy.
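The spectral analysis step can be sketched as follows. This is a hypothetical toy signal, not the paper's data: an entropy time series oscillating at a single assumed frequency (standing in for the Debye frequency) while relaxing to a constant equilibrium value, whose Fourier transform peaks at that frequency.

```python
import numpy as np

dt = 1e-3             # sampling interval (arbitrary units)
t = np.arange(0.0, 4.0, dt)
f_osc = 25.0          # assumed oscillation frequency of the entropy

# Toy entropy signal: damped oscillation about its equilibrium value
S = 1.0 - 0.1 * np.exp(-t) * np.cos(2 * np.pi * f_osc * t)

# Fourier transform of the mean-removed signal peaks at the
# oscillation frequency, mimicking the extraction of the Debye frequency
spectrum = np.abs(np.fft.rfft(S - S.mean()))
freqs = np.fft.rfftfreq(len(S), d=dt)
f_peak = freqs[np.argmax(spectrum)]
print(f_peak)  # close to 25.0
```

Once the oscillation has decayed, the entropy becomes time invariant, which the abstract identifies as the signature of equilibrium.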
Many theoretical expressions for dissipation along non-equilibrium processes have been proposed. However, they have not been fully verified by experiments. Especially for systems strongly interacting with their environments, the connection between theoretical quantities and standard thermodynamic observables is not clear. We have developed a computer simulation based on a spin-boson model, which is in principle exact and suitable for testing the proposed theories. We find that the dissipation obtained by measuring conventional thermodynamic quantities deviates from the second law of thermodynamics, presumably owing to the strong coupling. We show that an additive correction to the entropy makes it more consistent with the second law. This observation appears to be consistent with the theory based on the potential of mean force.
Many thermodynamic relations involve inequalities, with equality if a process does not involve dissipation. In this article we provide equalities in which the dissipative contribution is shown to involve the relative entropy (a.k.a. Kullback-Leibler divergence). The processes considered are general time evolutions both in classical and quantum mechanics, and the initial state is sometimes thermal, sometimes partially so. As an application, the relative entropy is related to transport coefficients.
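The relative entropy appearing in these equalities can be computed directly; a minimal sketch (illustrative only, with made-up distributions) shows its two properties that make it a natural measure of dissipation: it vanishes when the two distributions coincide and is strictly positive otherwise.

```python
import math

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p || q) in nats.
    Non-negative, and zero only when p and q coincide."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]   # e.g. the actual final distribution
q = [0.9, 0.1]   # e.g. the corresponding thermal reference

print(relative_entropy(p, p))  # 0.0 — no dissipation
print(relative_entropy(p, q))  # positive — dissipative contribution
```

In the equalities discussed in the abstract, the inequality form of the usual thermodynamic relations is recovered precisely because this quantity is non-negative.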
Bridging the second law of thermodynamics and microscopic reversible dynamics has been a longstanding problem in statistical physics. Here we address this problem on the basis of quantum many-body physics, and discuss how the entropy production saturates in isolated quantum systems under unitary dynamics. First, we rigorously prove the saturation of the entropy production in the long-time regime, where the total system can be in a pure state. Second, we discuss the non-negativity of the entropy production at saturation, implying the second law of thermodynamics. This is based on the eigenstate thermalization hypothesis (ETH), which states that even a single energy eigenstate is thermal. We also numerically demonstrate that the entropy production saturates at a non-negative value even when the initial state of the heat bath is a single energy eigenstate. Our results reveal fundamental properties of the entropy production in isolated quantum systems at late times.
The time evolution of an extended quantum system can be described theoretically in terms of the Schwinger-Keldysh functional-integral formalism, whose action conveniently encodes the information about the dynamics. We show here that the action of quantum systems evolving in thermal equilibrium is invariant under a symmetry transformation which distinguishes them from generic open systems. A unitary or dissipative dynamics possessing this symmetry naturally leads to the emergence of a Gibbs thermal stationary state. Moreover, the fluctuation-dissipation relations characterizing the linear response of an equilibrium system to external perturbations can be derived as the Ward-Takahashi identities associated with this symmetry. Accordingly, the symmetry provides an efficient check for the onset of thermodynamic equilibrium and makes separate tests of the validity of fluctuation-dissipation relations unnecessary. In the classical limit, this symmetry reduces to the one known to characterize equilibrium in the stochastic dynamics of classical systems coupled to thermal baths, described by Langevin equations.