One of the major resource requirements of computers - ranging from biological cells to human brains to high-performance (engineered) computers - is the energy used to run them. These energetic costs of performing a computation have long been a focus of research in physics, going back to the early work of Landauer. One of the most prominent aspects of computers is that they are inherently nonequilibrium systems. However, the early research was done when nonequilibrium statistical physics was in its infancy, which meant the work was formulated in terms of equilibrium statistical physics. Since then there have been major breakthroughs in nonequilibrium statistical physics, which are allowing us to investigate the myriad aspects of the relationship between statistical physics and computation, extending well beyond the issue of how much work is required to erase a bit. In this paper I review some of this recent work on the 'stochastic thermodynamics of computation'. After reviewing the salient parts of information theory, computer science theory, and stochastic thermodynamics, I summarize what has been learned about the entropic costs of performing a broad range of computations, extending from bit erasure to loop-free circuits to logically reversible circuits to information ratchets to Turing machines. These results reveal new, challenging engineering problems for how to design computers to have minimal thermodynamic costs. They also allow us to start to combine computer science theory and stochastic thermodynamics at a foundational level, thereby expanding both.
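For concreteness, the canonical result alluded to above is Landauer's bound (an established textbook statement, not specific to this review): erasing one bit of information in contact with a thermal reservoir at temperature $T$ requires dissipating at least

$$
Q \;\ge\; k_B T \ln 2 ,
$$

with equality attainable only in the quasistatic limit. The modern nonequilibrium results surveyed here sharpen this bound for finite-time protocols and for arbitrary initial distributions over the bit's states.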
I give a quick overview of some of the theoretical background necessary for using modern non-equilibrium statistical physics to investigate the thermodynamics of computation. I first present some of the necessary concepts from information theory, and then introduce some of the most important types of computational machine considered in computer science theory. After this I present a central result from modern non-equilibrium statistical physics: an exact expression for the entropy flow out of a system undergoing a given dynamics with a given initial distribution over states. This central expression is crucial for analyzing how the total entropy flow out of a computer depends on its global structure, since that global structure determines the initial distributions into all of the computer's subsystems, and therefore (via the central expression) the entropy flows generated by all of those subsystems. I illustrate these results by analyzing some of the subtleties concerning the benefits that are sometimes claimed for implementing an irreversible computation with a reversible circuit constructed out of Fredkin gates.
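A sketch of the kind of central expression described here (the notation below is assumed for illustration, not quoted from the text): write $\Lambda$ for the conditional distribution implemented by a (sub)system, $p_0$ for its initial distribution, and $p_1 = \Lambda p_0$. The entropy flow out of the system then decomposes as

$$
\mathrm{EF}(p_0) \;=\; \sigma(p_0) \;-\; \big[ S(p_1) - S(p_0) \big],
$$

where $\sigma(p_0) \ge 0$ is the total entropy production. The dependence of $\sigma$ on the initial distribution can in turn be expressed through the 'mismatch cost',

$$
\sigma(p_0) \;=\; \sigma(q) \;+\; D(p_0 \,\|\, q) \;-\; D(\Lambda p_0 \,\|\, \Lambda q),
$$

with $q$ the initial distribution minimizing entropy production and $D(\cdot\|\cdot)$ the Kullback-Leibler divergence. This makes explicit why the computer's global structure matters: it fixes the $p_0$ fed into each subsystem, and hence each subsystem's entropy flow.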
We apply the stochastic thermodynamics formalism to describe the dynamics of systems of complex Langevin and Fokker-Planck equations. In particular, we provide a simple and general recipe to calculate thermodynamic currents, dissipated and propagating heat for networks of nonlinear oscillators. By using the Hodge decomposition of thermodynamic forces and fluxes, we derive a formula for entropy production that generalises the notion of non-potential forces and makes transparent the breaking of detailed balance and of time-reversal symmetry for states arbitrarily far from equilibrium. Our formalism is then applied to describe the off-equilibrium thermodynamics of a few examples, notably a continuum ferromagnet, a network of classical spin-oscillators and the Frenkel-Kontorova model of nanofriction.
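For orientation, the simplest special case of such an entropy-production formula (a standard single-variable result, with $k_B = 1$; it is not the paper's general network expression) applies to one overdamped degree of freedom with mobility $\mu$, diffusion constant $D$, force $F$, and Fokker-Planck probability current $J$:

$$
\dot{\sigma}(t) \;=\; \int dx\, \frac{J(x,t)^2}{D\, p(x,t)} \;\ge\; 0,
\qquad
J \;=\; \mu F p \;-\; D\, \partial_x p .
$$

Detailed balance is equivalent to $J \equiv 0$, so a nonvanishing circulating part of the current directly signals broken time-reversal symmetry; the Hodge decomposition used above generalizes this split of the current into gradient and rotational components.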
Thermodynamic observables of mesoscopic systems can be expressed as integrated empirical currents. Their fluctuations are bounded from below by thermodynamic uncertainty relations. We introduce the hyperaccurate current as the integrated empirical current with the least fluctuations in a given non-equilibrium system. For steady-state systems described by overdamped Langevin equations, we derive an equation for the hyperaccurate current by means of a variational principle. We show that the hyperaccurate current coincides with the entropy production if and only if the latter saturates the thermodynamic uncertainty relation, and it can be substantially more precise otherwise. The hyperaccurate current can be used to improve estimates of entropy production from experimental data.
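In symbols (a sketch with assumed notation): for an integrated empirical current $J_d = \int_0^\tau d(x_t) \circ dx_t$ with weighting function $d$, evaluated in a steady state with total entropy production $\Sigma$, the thermodynamic uncertainty relation reads

$$
\frac{\mathrm{Var}(J_d)}{\langle J_d \rangle^2} \;\ge\; \frac{2 k_B}{\Sigma},
$$

and the hyperaccurate current corresponds to the weighting function minimizing the left-hand side,

$$
d^{\ast} \;=\; \operatorname*{arg\,min}_{d}\; \frac{\mathrm{Var}(J_d)}{\langle J_d \rangle^2}.
$$

Per the result above, $J_{d^{\ast}}$ is proportional to the entropy production precisely when the latter saturates the bound, and strictly more precise otherwise.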
We show that the fraction of time a thermodynamic current spends above its average value follows the arcsine law, a prominent result obtained by Lévy for Brownian motion. Stochastic currents with long streaks above or below their average are much more likely than those that spend similar fractions of time above and below their average. Our result is confirmed with experimental data from a Brownian Carnot engine. We also conjecture that two other random times associated with currents obey the arcsine law: the time at which a current reaches its maximum value and the last time a current crosses its average value. These results apply to, inter alia, molecular motors, quantum dots, and colloidal systems.
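A minimal numerical check of the underlying Lévy arcsine statement (a sketch for pure Brownian motion, with arbitrary step size and horizon; it is not the paper's current data): the fraction of time a zero-mean Brownian path spends above zero should follow the density $1/(\pi\sqrt{x(1-x)})$, which is U-shaped and piles probability near 0 and 1.

```python
import numpy as np

rng = np.random.default_rng(42)
n_paths, n_steps, dt = 10_000, 1_000, 1e-3

# Brownian increments and cumulative paths, one realization per row.
dW = rng.normal(scale=np.sqrt(dt), size=(n_paths, n_steps))
W = np.cumsum(dW, axis=1)

# Fraction of time each path spends above its average value (zero).
frac_above = (W > 0.0).mean(axis=1)

# Compare the empirical histogram with the arcsine density 1/(pi*sqrt(x(1-x))).
hist, edges = np.histogram(frac_above, bins=20, range=(0.0, 1.0), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
arcsine = 1.0 / (np.pi * np.sqrt(centers * (1.0 - centers)))
for c, h, a in zip(centers, hist, arcsine):
    print(f"x = {c:4.2f}   empirical = {h:5.2f}   arcsine = {a:5.2f}")
```

The histogram reproduces the U shape: paths that split their time roughly evenly above and below zero are the rarest, matching the 'long streaks' statement above.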
In stochastic thermodynamics, standard concepts from macroscopic thermodynamics, such as heat, work, and entropy production, are generalized to small fluctuating systems by defining them on a trajectory-wise level. In Langevin systems with a continuous state space, such definitions involve stochastic integrals along system trajectories, whose specific values depend on the discretization rule used to evaluate them (i.e. the interpretation of the noise terms in the integral). Via a systematic mathematical investigation of this apparent dilemma, we corroborate the widely used standard interpretation of heat- and work-like functionals as Stratonovich integrals. We furthermore recapitulate the anomalies that are known to occur for entropy production in the presence of temperature gradients.
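Concretely (a sketch in Sekimoto's convention; the sign choices are assumptions of this illustration): for an overdamped Langevin equation $\gamma \dot{x} = -\partial_x V(x,\lambda) + \xi(t)$, the heat dissipated into the bath along a trajectory is the Stratonovich integral

$$
Q[x(\cdot)] \;=\; \int_0^{\tau} \big( \gamma \dot{x} - \xi(t) \big) \circ dx
\;=\; -\int_0^{\tau} \partial_x V \circ dx ,
$$

which, together with the work $W = \int_0^{\tau} \partial_\lambda V \, \dot{\lambda}\, dt$, satisfies the first law $W = \Delta V + Q$. The Stratonovich product '$\circ$' denotes the midpoint discretization rule,

$$
f(x_t) \circ dx_t \;\approx\; f\!\left(\tfrac{x_t + x_{t+dt}}{2}\right) (x_{t+dt} - x_t),
$$

which is the interpretation corroborated above for heat- and work-like functionals.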