A central concept in the connection between physics and information theory is entropy, which represents the amount of information extracted from a system by an observer performing measurements in an experiment. Indeed, Jaynes' principle of maximum entropy allows one to establish the connection between entropy in statistical mechanics and information entropy. In this sense, the dissipated energy in a classical Hamiltonian process, known as the thermodynamic entropy production, is connected to the relative entropy between the forward and backward probability densities. Recently, it was revealed that energetic inefficiency and model inefficiency, defined as the difference between the mutual information that the system state shares with the future and past environmental variables, are equivalent concepts in Markovian processes. As a consequence, the question of a possible connection between model unpredictability and energetic inefficiency emerges in the framework of classical physics. Here, we address this question by connecting the measure of random behavior of a classical Hamiltonian system, the Kolmogorov-Sinai entropy, with its energetic inefficiency, the dissipated work. This approach allows us to provide meaningful interpretations of information concepts in terms of thermodynamic quantities.
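The link quoted above between dissipated work and the relative entropy of forward and backward densities can be illustrated with a minimal numerical sketch. All numbers below are illustrative (a hypothetical two-outcome work distribution, with beta = 1 and Delta F = 0, built to satisfy the Crooks relation P_F(W) = P_B(-W) exp(W)); they are not taken from the paper.

```python
import numpy as np

# Two-outcome work values W chosen so that the Crooks relation
# P_F(W) = P_B(-W) * exp(W) holds with beta = 1 and Delta F = 0.
W = np.array([np.log(2), -np.log(2)])
P_F = np.array([2 / 3, 1 / 3])      # forward work distribution P_F(W)
P_B_rev = np.array([1 / 3, 2 / 3])  # backward distribution evaluated at -W

# With Delta F = 0, the mean dissipated work (in units of kT) equals
# the relative entropy D(P_F || P_B_rev).
mean_dissipated_work = np.sum(P_F * W)
relative_entropy = np.sum(P_F * np.log(P_F / P_B_rev))
assert np.isclose(mean_dissipated_work, relative_entropy)
```

Both quantities evaluate to (1/3) ln 2, confirming the identity for this toy distribution.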
We use the kinetic theory of gases to compute the Kolmogorov-Sinai (KS) entropy per particle for a dilute gas in equilibrium. For an equilibrium system, the KS entropy, h_KS, is the sum of all the positive Lyapunov exponents characterizing the chaotic behavior of the gas. We compute h_KS/N, where N is the number of particles in the gas. This quantity has a density expansion of the form h_KS/N = a u [-ln(tilde{n}) + b + O(tilde{n})], where u is the single-particle collision frequency and tilde{n} is the reduced number density of the gas. The theoretical values for the coefficients a and b are compared with the results of computer simulations, with excellent agreement for a and less than satisfactory agreement for b. Possible reasons for this discrepancy in b are discussed.
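The defining relation used above, that h_KS is the sum of the positive Lyapunov exponents, can be sketched numerically. The exponent spectrum below is hypothetical (a closed Hamiltonian system has exponents in plus/minus pairs); it is for illustration only.

```python
import numpy as np

# Hypothetical Lyapunov spectrum of a closed Hamiltonian system:
# exponents come in +/- pairs and sum to zero.
lyapunov_exponents = np.array([1.2, 0.4, 0.0, 0.0, -0.4, -1.2])

# KS entropy = sum of the positive exponents (Pesin-type relation
# invoked in the abstract).
h_KS = lyapunov_exponents[lyapunov_exponents > 0].sum()
assert np.isclose(h_KS, 1.6)
```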
We prove that the classical capacity of an arbitrary quantum channel assisted by a free classical feedback channel is bounded from above by the maximum average output entropy of the quantum channel. As a consequence of this bound, we conclude that a classical feedback channel does not improve the classical capacity of a quantum erasure channel, and by taking into account energy constraints, we conclude the same for a pure-loss bosonic channel. The method for establishing the aforementioned entropy bound involves identifying an information measure with two key properties: 1) it does not increase under one-way local operations and classical communication from the receiver to the sender, and 2) a quantum channel from sender to receiver cannot increase the information measure by more than the maximum output entropy of the channel. This information measure can be understood as the sum of two terms, one corresponding to classical correlation and the other to entanglement.
Many modern techniques employed in physics, such as the computation of path integrals, rely on random walks on graphs that can be represented as Markov chains. Traditionally, running times of such sampling algorithms are estimated from the number of steps the chain needs to reach its stationary distribution. This quantity is known as the mixing time and is often difficult to compute. In this paper, we suggest an alternative estimate based on the Kolmogorov-Sinai entropy (KSE), by establishing a link between the maximization of the KSE and the minimization of the mixing time. Since the KSE is generally easier to compute than the mixing time, this link provides a new, faster method to approximate the minimum mixing time that could be of interest in computer science and statistical physics. Beyond this, our finding will also be of interest to the out-of-equilibrium community, by providing a new rationale for selecting stationary states in out-of-equilibrium physics: it seems reasonable that in a physical system with two simultaneous, equiprobable possible dynamics, the final stationary state will be closer to the stationary state corresponding to the faster dynamics (smaller mixing time). Through the empirical link found in this letter, this state will correspond to a state of maximal Kolmogorov-Sinai entropy. If this is true, it would provide a more satisfying rule for selecting stationary states in complex systems, such as the climate, than the maximization of entropy production.
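The correlation described above can be sketched on toy two-state chains. The sketch below (illustrative matrices, not from the paper) computes the KSE of a stationary Markov chain, h_KS = -sum_i pi_i sum_j P_ij ln P_ij, and compares it with the second-largest eigenvalue modulus, which governs the mixing time (smaller modulus means faster mixing).

```python
import numpy as np

def ks_entropy(P):
    """KS entropy of a stationary Markov chain with transition matrix P."""
    # Stationary distribution pi: left eigenvector of P for eigenvalue 1.
    evals, evecs = np.linalg.eig(P.T)
    pi = np.real(evecs[:, np.argmin(np.abs(evals - 1))])
    pi = pi / pi.sum()
    logP = np.where(P > 0, np.log(np.where(P > 0, P, 1.0)), 0.0)
    return -np.sum(pi[:, None] * P * logP)

def slem(P):
    """Second-largest eigenvalue modulus; smaller SLEM -> faster mixing."""
    return np.sort(np.abs(np.linalg.eigvals(P)))[-2]

P_fast = np.array([[0.5, 0.5], [0.5, 0.5]])  # mixes in one step
P_slow = np.array([[0.9, 0.1], [0.1, 0.9]])  # nearly frozen dynamics

# The faster-mixing chain (smaller SLEM) has the larger KS entropy.
assert ks_entropy(P_fast) > ks_entropy(P_slow)
assert slem(P_fast) < slem(P_slow)
```

For these two chains the KSE values are ln 2 and about 0.33, while the eigenvalue moduli are 0 and 0.8, consistent with the maximal-KSE/minimal-mixing-time correspondence.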
Given two pairs of quantum states, a fundamental question in the resource theory of asymmetric distinguishability is to determine whether there exists a quantum channel converting one pair to the other. In this work, we reframe this question in such a way that a catalyst can be used to help perform the transformation, with the only constraint on the catalyst being that its reduced state is returned unchanged, so that it can be used again to assist a future transformation. What we find here, for the special case in which the states in a given pair are commuting, and thus quasi-classical, is that this catalytic transformation can be performed if and only if the relative entropy of one pair of states is larger than that of the other pair. This result endows the relative entropy with a fundamental operational meaning that goes beyond its traditional interpretation in the setting of independent and identical resources. Our finding thus has an immediate application and interpretation in the resource theory of asymmetric distinguishability, and we expect it to find application in other domains.
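For the quasi-classical (commuting) case discussed above, the relative entropy reduces to the classical Kullback-Leibler divergence, and the one-directional version of the transformation question is the classical data-processing inequality: a channel applied to both states cannot increase their relative entropy. A minimal sketch with illustrative distributions and an illustrative stochastic map:

```python
import numpy as np

def rel_entropy(p, q):
    """Classical relative entropy D(p||q) = sum_i p_i ln(p_i / q_i)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

p, q = [0.7, 0.3], [0.4, 0.6]          # a pair of commuting (diagonal) states
T = np.array([[0.8, 0.3],               # column-stochastic map, i.e. a
              [0.2, 0.7]])              # classical channel (illustrative)

# Data processing: applying the same channel to both states
# cannot increase their relative entropy.
assert rel_entropy(T @ p, T @ q) <= rel_entropy(p, q)
```

The catalytic result in the abstract is strictly stronger: it says that for commuting pairs, an inequality between the two relative entropies is not only necessary but also sufficient for a catalytic transformation to exist.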
The quantum entropy-typical subspace theory is specified. It is shown that any mixed state with von Neumann entropy less than h can be preserved approximately by the entropy-typical subspace with entropy h. This result implies a universal compression scheme for any source whose von Neumann entropy does not exceed h.
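The entropy threshold h above is the von Neumann entropy S(rho) = -Tr(rho log2 rho), which for a state diagonal in some basis is the Shannon entropy of its eigenvalues. A minimal sketch with an illustrative density matrix:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), in bits, via the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # drop zero eigenvalues (0 log 0 = 0)
    return float(-np.sum(evals * np.log2(evals)))

rho = np.diag([0.5, 0.25, 0.25])  # illustrative mixed state
S = von_neumann_entropy(rho)      # 1.5 bits

# A source with S(rho) <= h fits (approximately) in a typical subspace
# whose dimension grows like 2**(n * h) over n copies.
assert S <= 1.5
```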