
Overview of Information Theory, Computer Science Theory, and Stochastic Thermodynamics for Thermodynamics of Computation

Added by David Wolpert
Publication date: 2018
Field: Physics
Language: English





I give a quick overview of some of the theoretical background necessary for using modern non-equilibrium statistical physics to investigate the thermodynamics of computation. I first present some of the necessary concepts from information theory, and then introduce some of the most important types of computational machine considered in computer science theory. After this I present a central result from modern non-equilibrium statistical physics: an exact expression for the entropy flow out of a system undergoing a given dynamics with a given initial distribution over states. This central expression is crucial for analyzing how the total entropy flow out of a computer depends on its global structure, since that global structure determines the initial distributions over all of the computer's subsystems, and therefore (via the central expression) the entropy flows generated by all of those subsystems. I illustrate these results by analyzing some of the subtleties concerning the benefits that are sometimes claimed for implementing an irreversible computation with a reversible circuit constructed out of Fredkin gates.
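To make concrete why the initial distribution matters, here is a minimal Python sketch (not taken from the paper; the uniform input distributions are an assumption made purely for illustration) that pushes a distribution through a logically reversible Fredkin gate and through a logically irreversible AND gate, and compares the Shannon entropies of input and output. By the Landauer bound, any drop in entropy, multiplied by k_B T, lower-bounds the heat the gate must expel; the Fredkin gate, being a permutation of its state space, never lowers the entropy.

```python
import numpy as np
from itertools import product

def shannon_entropy(p):
    """Shannon entropy in nats of a probability vector (zero entries ignored)."""
    p = np.asarray(list(p), dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def fredkin(c, a, b):
    """Fredkin (controlled-swap) gate: swap a and b iff the control bit c is 1."""
    return (c, b, a) if c == 1 else (c, a, b)

# Uniform initial distributions, chosen here purely for illustration.
p3_in = {s: 1 / 8 for s in product([0, 1], repeat=3)}   # three-bit inputs
p2_in = {s: 1 / 4 for s in product([0, 1], repeat=2)}   # two-bit inputs

# Push the three-bit distribution through the logically reversible Fredkin gate.
p_fredkin = {}
for s, prob in p3_in.items():
    out = fredkin(*s)
    p_fredkin[out] = p_fredkin.get(out, 0.0) + prob

# Push the two-bit distribution through a logically irreversible AND gate.
p_and = {}
for (a, b), prob in p2_in.items():
    p_and[a & b] = p_and.get(a & b, 0.0) + prob

print("H(3-bit input)    =", shannon_entropy(p3_in.values()), "nats")
print("H(Fredkin output) =", shannon_entropy(p_fredkin.values()), "nats")  # unchanged
print("H(2-bit input)    =", shannon_entropy(p2_in.values()), "nats")
print("H(AND output)     =", shannon_entropy(p_and.values()), "nats")      # reduced
```

As the abstract cautions, this gate-by-gate accounting is not the whole story: in a full circuit the entropy flow out of each subsystem depends on the initial distribution handed to it by the rest of the circuit.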




Read More

David H. Wolpert, 2019
One of the major resource requirements of computers - ranging from biological cells to human brains to high-performance (engineered) computers - is the energy used to run them. Those costs of performing a computation have long been a focus of research in physics, going back to the early work of Landauer. One of the most prominent aspects of computers is that they are inherently nonequilibrium systems. However, the early research was done when nonequilibrium statistical physics was in its infancy, which meant the work was formulated in terms of equilibrium statistical physics. Since then there have been major breakthroughs in nonequilibrium statistical physics, which are allowing us to investigate the myriad aspects of the relationship between statistical physics and computation, extending well beyond the issue of how much work is required to erase a bit. In this paper I review some of this recent work on the 'stochastic thermodynamics of computation'. After reviewing the salient parts of information theory, computer science theory, and stochastic thermodynamics, I summarize what has been learned about the entropic costs of performing a broad range of computations, extending from bit erasure to loop-free circuits to logically reversible circuits to information ratchets to Turing machines. These results reveal new, challenging engineering problems for how to design computers to have minimal thermodynamic costs. They also allow us to start to combine computer science theory and stochastic thermodynamics at a foundational level, thereby expanding both.
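As a small illustration of the "entropic cost of bit erasure" mentioned above, the following Python sketch (not from the review; the temperature value is an assumption chosen for illustration) evaluates the generalized Landauer bound k_B T H(p): the minimal heat that must flow to the bath when a bit with initial distribution (1-p, p) is reset to a fixed state.

```python
import numpy as np

K_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # bath temperature in kelvin (illustrative value)

def entropy_nats(p):
    """Shannon entropy in nats of a bit with P(1) = p."""
    return -sum(q * np.log(q) for q in (p, 1 - p) if q > 0)

def landauer_heat(p, T=T):
    """Minimal heat (joules) dissipated when a bit with P(1) = p is reset to a fixed state."""
    return K_B * T * entropy_nats(p)

for p in (0.5, 0.9, 0.99, 1.0):
    print(f"P(1) = {p:4.2f}  ->  minimal erasure heat = {landauer_heat(p):.3e} J")
# p = 0.5 recovers the textbook k_B T ln 2; a more predictable bit is cheaper to erase.
```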
One of the primary motivations of research in the field of computation is to optimize the cost of computation. The major resource a computer needs is the energy to run a process, i.e., the thermodynamic cost, and analyzing this cost is one of the prime focuses of research. The field goes back to the seminal work of Landauer, who observed that a computer must spend k_B T ln 2 of energy to erase one bit of information (here T is the temperature of the system and k_B is Boltzmann's constant). Advances in statistical mechanics have provided the tools needed to understand and analyze the thermodynamic cost of the complicated processes that occur in nature, and even the computations of modern computers; advances in physics have clarified the connection between statistical mechanics (the thermodynamic cost) and computation. Another important concern in computer science is correcting the errors that occur while transmitting information through a communication channel. In this article, we review the progress of the thermodynamics of computation, starting from Landauer's principle and extending to the latest models, which simulate modern, complex computation mechanisms. After exploring the salient parts of computation in computer science theory and information theory, we review the thermodynamic cost of computation and error correction. We also discuss alternative computation models that have been proposed as thermodynamically cost-efficient.
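For the information-theoretic side of the error-correction question raised above, here is a short Python sketch (standard Shannon theory rather than anything specific to this review) that computes the capacity of a binary symmetric channel, C = 1 - h(eps) bits per use, which bounds the rate at which any error-correcting code can communicate reliably over that channel.

```python
import numpy as np

def h2(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def bsc_capacity(eps):
    """Capacity (bits per channel use) of a binary symmetric channel with flip probability eps."""
    return 1.0 - h2(eps)

for eps in (0.0, 0.01, 0.1, 0.5):
    print(f"flip prob {eps:4.2f}  ->  capacity {bsc_capacity(eps):.4f} bits/use")
# At eps = 0.5 the channel carries no information, so no error-correcting code can help.
```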
We apply the stochastic thermodynamics formalism to describe the dynamics of systems of complex Langevin and Fokker-Planck equations. We provide in particular a simple and general recipe to calculate thermodynamical currents, dissipated and propagating heat for networks of nonlinear oscillators. By using the Hodge decomposition of thermodynamical forces and fluxes, we derive a formula for entropy production that generalises the notion of non-potential forces and makes transparent the breaking of detailed balance and of time reversal symmetry for states arbitrarily far from equilibrium. Our formalism is then applied to describe the off-equilibrium thermodynamics of a few examples, notably a continuum ferromagnet, a network of classical spin-oscillators and the Frenkel-Kontorova model of nanofriction.
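To show the kind of bookkeeping such a formalism performs, here is a minimal Python sketch (my own illustration, not the recipe of the paper) of an overdamped Langevin particle on a ring driven by a constant non-potential force; the heat delivered to the bath is accumulated with the Stratonovich midpoint rule and converted into an entropy production rate. Units with mobility, k_B and T equal to 1 are an assumption for simplicity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Overdamped particle on a ring: constant drive f on top of a periodic potential U(x) = sin(x).
f, dt, n_steps = 1.5, 1e-3, 200_000
D = 1.0  # diffusion constant = mobility * k_B * T (all set to 1 here)

def force(x):
    return f - np.cos(x)          # total force: drive minus dU/dx

x, heat = 0.0, 0.0
for _ in range(n_steps):
    noise = np.sqrt(2 * D * dt) * rng.standard_normal()
    dx = force(x) * dt + noise
    # Heat delivered to the bath, evaluated with the Stratonovich (midpoint) rule.
    heat += force(x + 0.5 * dx) * dx
    x += dx

print("entropy production rate ~", heat / (n_steps * dt), "(k_B per unit time)")
# A positive value signals broken detailed balance; for f = 0 the long-time average vanishes.
```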
Assuming time-scale separation, a simple and unified theory of thermodynamics and stochastic thermodynamics is constructed for small classical systems strongly interacting with their environments in a controllable fashion. The total Hamiltonian is decomposed into a bath part and a system part, the latter being the Hamiltonian of mean force. Both the conditional equilibrium of the bath and the reduced equilibrium of the system are described by canonical ensemble theories with respect to their own Hamiltonians. The bath free energy is independent of the system variables and the control parameter. Furthermore, the weak coupling theory of stochastic thermodynamics becomes applicable almost verbatim, even if the interaction and correlation between the system and its environment are strong and varied externally. Finally, this time-scale-separation-based approach also leads to some new insights about the origin of the second law of thermodynamics.
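The Hamiltonian of mean force mentioned above can be computed explicitly for toy models. The Python sketch below (an illustration under assumed quadratic Hamiltonians and an arbitrary coupling strength, not the construction of the paper) integrates out a single bath coordinate on a grid and compares H*(x) with the bare system Hamiltonian.

```python
import numpy as np

beta = 1.0  # inverse temperature (k_B = 1); all parameter values below are illustrative

def H_S(x):    return 0.5 * x**2     # bare system Hamiltonian
def H_B(y):    return 0.5 * y**2     # bath Hamiltonian
def H_I(x, y): return 0.5 * x * y    # non-negligible system-bath coupling

# Integrate the bath coordinate out on a grid.
y = np.linspace(-10.0, 10.0, 4001)
dy = y[1] - y[0]
z_bath = np.sum(np.exp(-beta * H_B(y))) * dy

def hamiltonian_of_mean_force(x):
    """H*(x) = -(1/beta) ln[ Tr_bath e^{-beta (H_S + H_I + H_B)} / Tr_bath e^{-beta H_B} ]."""
    num = np.sum(np.exp(-beta * (H_S(x) + H_I(x, y) + H_B(y)))) * dy
    return -np.log(num / z_bath) / beta

for x in np.linspace(-2.0, 2.0, 5):
    print(f"x = {x:+.1f}   H*(x) = {hamiltonian_of_mean_force(x):+.4f}   bare H_S(x) = {H_S(x):+.4f}")
# The coupling renormalizes the effective potential: the reduced equilibrium state of
# the system is proportional to exp(-beta * H*), not exp(-beta * H_S).
```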
Thermodynamic observables of mesoscopic systems can be expressed as integrated empirical currents. Their fluctuations are bound by thermodynamic uncertainty relations. We introduce the hyperaccurate current as the integrated empirical current with the least fluctuations in a given non-equilibrium system. For steady-state systems described by overdamped Langevin equations, we derive an equation for the hyperaccurate current by means of a variational principle. We show that the hyperaccurate current coincides with the entropy production if and only if the latter saturates the thermodynamic uncertainty relation, and it can be substantially more precise otherwise. The hyperaccurate current can be used to improve estimates of entropy production from experimental data.
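The thermodynamic uncertainty relation invoked above, Var(J)/<J>^2 >= 2 k_B/<Sigma>, can be checked in closed form for the simplest driven system. The Python sketch below (a textbook biased random walk used for illustration, not the overdamped Langevin setting of the paper) compares the two sides of the bound.

```python
import numpy as np

# Continuous-time biased random walk: jumps right at rate k_plus, left at rate k_minus.
# Exact steady-state expressions (k_B = 1) used to check Var(J)/<J>^2 >= 2/<Sigma>.
def tur_check(k_plus, k_minus, t=1.0):
    mean_J = (k_plus - k_minus) * t                              # mean integrated current
    var_J = (k_plus + k_minus) * t                               # its variance (Poissonian jumps)
    sigma = (k_plus - k_minus) * np.log(k_plus / k_minus) * t    # total entropy production
    return var_J / mean_J**2, 2.0 / sigma

for k_plus, k_minus in [(2.0, 1.0), (5.0, 1.0), (1.1, 1.0)]:
    lhs, rhs = tur_check(k_plus, k_minus)
    print(f"k+={k_plus:3.1f} k-={k_minus:3.1f}:  Var/<J>^2 = {lhs:.3f}  >=  2/<Sigma> = {rhs:.3f}")
# The bound is nearly saturated close to equilibrium (k+ ~ k-) and loose far from it,
# which is where a hyperaccurate current can be substantially more precise than the entropy production.
```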
