We show that the fraction of time a thermodynamic current spends above its average value follows the arcsine law, a prominent result obtained by Lévy for Brownian motion. Stochastic currents with long streaks above or below their average are much more likely than those that spend similar fractions of time above and below it. Our result is confirmed with experimental data from a Brownian Carnot engine. We also conjecture that two other random times associated with currents obey the arcsine law: the time at which a current reaches its maximum value and the last time a current crosses its average value. These results apply to, inter alia, molecular motors, quantum dots and colloidal systems.
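For intuition, the claim can be checked with a minimal simulation. The sketch below assumes the simplest model of a stochastic current, a drifted Brownian motion $J_t = vt + \sqrt{2D}\,W_t$; the drift $v$, diffusion coefficient $D$, and time grid are illustrative choices, not parameters from the paper.

```python
import numpy as np

# Hypothetical minimal model: a current J_t = v*t + sqrt(2*D)*W_t, i.e. a
# drifted Brownian motion whose average is v*t.  The fraction of time J_t
# spends above its average is then the occupation-time statistic of W_t,
# for which Levy's arcsine law predicts the density 1/(pi*sqrt(x*(1-x))).

rng = np.random.default_rng(0)
v, D = 1.0, 0.5                          # illustrative drift and diffusion
dt, n_steps, n_traj = 1e-3, 2_000, 2_000

noise = rng.normal(0.0, np.sqrt(2 * D * dt), size=(n_traj, n_steps))
t = dt * np.arange(1, n_steps + 1)
J = v * t + np.cumsum(noise, axis=1)     # ensemble of current trajectories
frac_above = np.mean(J > v * t, axis=1)  # fraction of time above the average

# A U-shaped histogram: long streaks (fractions near 0 or 1) dominate over
# balanced trajectories (fractions near 1/2).
hist, _ = np.histogram(frac_above, bins=10, range=(0.0, 1.0), density=True)
print(np.round(hist, 2))
```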
Fractional Brownian motion is a non-Markovian Gaussian process indexed by the Hurst exponent $H \in [0,1]$, generalising standard Brownian motion to account for anomalous diffusion. Functionals of this process are important for practical applications, serving as a standard reference point for non-equilibrium dynamics. We describe a perturbation expansion that allows us to evaluate many non-trivial observables analytically, generalising the three celebrated arcsine laws of standard Brownian motion. The functionals are: (i) the fraction of time the process remains positive, (ii) the time at which the process last visits the origin, and (iii) the time at which it achieves its maximum (or minimum). We derive expressions for the probability distributions of these three functionals as an expansion in $\epsilon = H - \tfrac{1}{2}$, up to second order. We find that the three distributions are different, except at $H = \tfrac{1}{2}$, where they coincide. Our results are confirmed to high precision by numerical simulations.
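The three functionals are easy to estimate numerically. The sketch below samples fractional Brownian motion by a Cholesky factorisation of its covariance (a standard, if not the fastest, method; grid size, Hurst exponent and sample counts are illustrative) and records all three arcsine-type observables on each path.

```python
import numpy as np

# Sample fBm via Cholesky factorisation of the covariance
#   C_ij = (t_i^{2H} + t_j^{2H} - |t_i - t_j|^{2H}) / 2,
# then measure the three functionals on each path.  At H = 1/2 all three
# histograms should collapse onto the arcsine density 1/(pi*sqrt(x*(1-x))).

rng = np.random.default_rng(1)
H, n, n_traj = 0.6, 400, 2_000                  # Hurst exponent, grid, samples
t = np.linspace(1.0 / n, 1.0, n)
C = 0.5 * (t[:, None]**(2 * H) + t[None, :]**(2 * H)
           - np.abs(t[:, None] - t[None, :])**(2 * H))
L = np.linalg.cholesky(C + 1e-12 * np.eye(n))   # jitter for numerical safety
paths = rng.normal(size=(n_traj, n)) @ L.T

frac_positive = np.mean(paths > 0, axis=1)            # (i) time spent positive
crossings = np.diff(np.sign(paths), axis=1) != 0
has_crossing = crossings.any(axis=1)
idx_last = (n - 2) - np.argmax(crossings[:, ::-1], axis=1)
last_visit = t[idx_last + 1][has_crossing]            # (ii) last visit to 0
t_max = t[np.argmax(paths, axis=1)]                   # (iii) time of maximum

print(frac_positive.mean(), last_visit.mean(), t_max.mean())
```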
We present a stochastic-thermodynamics analysis of an electron-spin-resonance pumped quantum dot device in the Coulomb-blockade regime, where a pure spin current is generated without an accompanying net charge current. Based on a generalized quantum master equation beyond the secular approximation, quantum coherences are accounted for in terms of an effective average spin in the Floquet basis. Remarkably, this effective spin precesses about an effective magnetic field, which originates from the non-secular treatment and energy renormalization. We show that the interaction between the effective spin and the effective magnetic field can play the dominant role in both energy transport and irreversible entropy production. In the stationary limit, the energy and entropy balance relations are also established using the theory of counting statistics.
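As a cartoon of the precession picture, one can integrate Bloch-type equations for an effective spin in an effective field. The equation form, field, decay rate and initial condition below are assumptions for illustration only, not the generalized master equation of the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative Bloch-type cartoon of the effective-spin picture:
#   dS/dt = B_eff x S - gamma * S,
# i.e. precession about the effective field plus a phenomenological decay.
# B_eff, gamma and the initial spin are assumed values, not quantities
# derived from the quantum master equation of the paper.

B_eff = np.array([0.0, 0.0, 1.0])      # assumed effective field along z
gamma = 0.05                           # assumed relaxation rate

def bloch(t, S):
    return np.cross(B_eff, S) - gamma * S

sol = solve_ivp(bloch, (0.0, 50.0), [1.0, 0.0, 0.0], max_step=0.05)
Sx, Sy, Sz = sol.y
print(Sx[-1], Sy[-1], Sz[-1])          # damped spiral of the spin components
```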
Thermodynamic observables of mesoscopic systems can be expressed as integrated empirical currents. Their fluctuations are bounded by thermodynamic uncertainty relations. We introduce the hyperaccurate current as the integrated empirical current with the least fluctuations in a given non-equilibrium system. For steady-state systems described by overdamped Langevin equations, we derive an equation for the hyperaccurate current by means of a variational principle. We show that the hyperaccurate current coincides with the entropy production if and only if the latter saturates the thermodynamic uncertainty relation, and that it can be substantially more precise otherwise. The hyperaccurate current can be used to improve estimates of entropy production from experimental data.
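The practical use stated in the last sentence can be illustrated as follows: by the thermodynamic uncertainty relation, any integrated current $J_d$ yields the lower bound $\Sigma \geq 2\langle J_d\rangle^2/\mathrm{Var}(J_d)$ on the total entropy production (with $k_B = 1$), and the bound is tightest for the hyperaccurate current. The sketch below scans a small, hand-picked family of weights on a driven overdamped ring; the model and the weight family are illustrative, not the variational solution derived in the paper.

```python
import numpy as np

# TUR-based entropy-production estimation on a driven overdamped ring,
#   dx = (f - sin x) dt + sqrt(2*T) dW,
# using a few trial currents J_d = int d(x) o dx (Stratonovich midpoint).
# Each current gives the bound  Sigma >= 2 <J_d>^2 / Var(J_d);  the best
# trial weight gives the tightest (largest) lower bound.

rng = np.random.default_rng(2)
f, T, dt, n_steps, n_traj = 1.0, 1.0, 1e-3, 2_000, 2_000

weights = [lambda y: np.ones_like(y),            # plain displacement current
           lambda y: 1.0 + 0.5 * np.cos(y),
           lambda y: 1.0 + 0.5 * np.sin(y)]
x = np.zeros(n_traj)
J = np.zeros((len(weights), n_traj))

for _ in range(n_steps):
    dx = (f - np.sin(x)) * dt + np.sqrt(2 * T * dt) * rng.normal(size=n_traj)
    x_mid = x + 0.5 * dx                         # Stratonovich midpoint rule
    for k, d in enumerate(weights):
        J[k] += d(x_mid) * dx
    x += dx

bounds = 2 * J.mean(axis=1)**2 / J.var(axis=1)
print("TUR lower bounds on total entropy production:", np.round(bounds, 3))
```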
One of the major resource requirements of computers - ranging from biological cells to human brains to high-performance (engineered) computers - is the energy used to run them. Those costs of performing a computation have long been a focus of research in physics, going back to the early work of Landauer. One of the most prominent aspects of computers is that they are inherently nonequilibrium systems. However, the early research was done when nonequilibrium statistical physics was in its infancy, which meant the work was formulated in terms of equilibrium statistical physics. Since then there have been major breakthroughs in nonequilibrium statistical physics, which allow us to investigate the myriad aspects of the relationship between statistical physics and computation, extending well beyond the issue of how much work is required to erase a bit. In this paper I review some of this recent work on the 'stochastic thermodynamics of computation'. After reviewing the salient parts of information theory, computer science theory, and stochastic thermodynamics, I summarize what has been learned about the entropic costs of performing a broad range of computations, extending from bit erasure to loop-free circuits to logically reversible circuits to information ratchets to Turing machines. These results reveal new, challenging engineering problems for how to design computers to have minimal thermodynamic costs. They also allow us to start to combine computer science theory and stochastic thermodynamics at a foundational level, thereby expanding both.
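For scale, the one quantitative anchor in this story, Landauer's bound, is easy to evaluate: erasing one bit at temperature $T$ costs at least $k_B T \ln 2$ of work.

```python
import math

# Landauer's bound: minimum work to erase one bit at temperature T.
k_B = 1.380649e-23                    # Boltzmann constant, J/K (exact SI value)
T = 300.0                             # room temperature, K
print(k_B * T * math.log(2))          # ~ 2.9e-21 J per erased bit
```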
In stochastic thermodynamics, standard concepts from macroscopic thermodynamics, such as heat, work, and entropy production, are generalized to small fluctuating systems by defining them on a trajectory-wise level. In Langevin systems with a continuous state space, such definitions involve stochastic integrals along system trajectories, whose specific values depend on the discretization rule used to evaluate them (i.e., the interpretation of the noise terms in the integral). Via a systematic mathematical investigation of this apparent dilemma, we corroborate the widely used standard interpretation of heat- and work-like functionals as Stratonovich integrals. We furthermore recapitulate the anomalies that are known to occur for entropy production in the presence of temperature gradients.
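The discretization dependence is easy to see numerically. The sketch below evaluates a heat-like functional $\int F(x)\,dx$ along a single overdamped Langevin trajectory with both the Itô (pre-point) and the Stratonovich (midpoint) rules; the harmonic model and parameters are illustrative choices.

```python
import numpy as np

# One overdamped Langevin trajectory in V(x) = x^2/2 at temperature T, and
# the functional  Q = int F(x) dx  with F(x) = -V'(x) = -x, evaluated twice.
# With the midpoint (Stratonovich) rule the ordinary chain rule holds, so
# Q_strat = V(x_0) - V(x_end); the pre-point (Ito) value differs by the
# quadratic-variation term  -sum(dx^2)/2 ~ -T*t, which stays finite as
# dt -> 0, so the two rules disagree even in the continuum limit.

rng = np.random.default_rng(3)
T, dt, n_steps = 1.0, 1e-4, 100_000
x, Q_ito, Q_strat = 0.0, 0.0, 0.0

for _ in range(n_steps):
    dx = -x * dt + np.sqrt(2 * T * dt) * rng.normal()
    Q_ito += -x * dx                   # pre-point (Ito) rule
    Q_strat += -(x + 0.5 * dx) * dx    # midpoint (Stratonovich) rule
    x += dx

print(Q_ito, Q_strat, Q_strat - Q_ito)   # gap ~ -T * n_steps * dt = -10
```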