We investigate the performance of majority-logic decoding in both reversible and finite-time information erasure processes performed on macroscopic bits that contain $N$ microscopic binary units. While we show that for reversible erasure protocols single-unit transformations are more efficient than majority-logic decoding, the latter is found to offer several benefits for finite-time erasure processes: both the minimal erasure duration for a given erasure error and the minimal erasure error for a given erasure duration are reduced, compared with a single unit. Remarkably, majority-logic decoding is also more efficient in both the small-erasure-error and fast-erasure regimes. These benefits are also preserved under the optimal erasure protocol that minimizes the dissipated heat. Our work therefore shows that majority-logic decoding can lift the precision-speed-efficiency trade-off in information erasure processes.
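As a rough illustration of why majority-logic decoding reduces the erasure error, the following sketch computes the probability that a strict majority of the $N$ units end up in the wrong state, assuming independent, identical per-unit errors (the parameter values are illustrative, not taken from the paper):

    from math import comb

    def majority_error(p, n):
        """Probability that a strict majority of n independent units
        (n odd), each erased incorrectly with probability p, is wrong,
        i.e. that majority-logic readout of the macroscopic bit fails."""
        return sum(comb(n, k) * p**k * (1 - p)**(n - k)
                   for k in range(n // 2 + 1, n + 1))

    # A 10% per-unit error shrinks rapidly under majority readout:
    for n in (1, 5, 11, 21):
        print(n, majority_error(0.10, n))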
We present an experiment in which a one-bit memory is constructed using a single colloidal particle trapped in a modulated double-well potential. We measure the amount of heat dissipated to erase a bit and establish that, in the limit of long erasure cycles, the mean dissipated heat saturates at the Landauer bound, i.e. the minimal quantity of heat necessarily produced to delete a classical bit of information. This result demonstrates the intimate link between information theory and thermodynamics. To stress this connection we also show that a detailed Jarzynski equality is verified, retrieving the Landauer bound independently of the work done on the system. The experimental details are presented and the experimental errors carefully discussed.
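For reference, the bound that the mean dissipated heat saturates is Landauer's bound (standard statement, generic notation):
\[
\langle Q \rangle \;\ge\; k_{\mathrm{B}} T \ln 2 ,
\]
with equality approached only in the quasistatic limit of long erasure cycles, as the experiment verifies.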
Rudolph (1967) introduced one-step majority-logic decoding for linear codes derived from combinatorial designs. The decoder is easily realizable in hardware and requires that the dual code contain the blocks of so-called geometric designs as codewords. Peterson and Weldon (1972) extended Rudolph's algorithm to a two-step majority-logic decoder correcting the same number of errors as Reed's celebrated multi-step majority-logic decoder. Here, we study the codes from subspace designs. It turns out that these codes have the same majority-logic decoding capability as the codes from geometric designs, but their majority-logic decoding complexity is sometimes drastically improved.
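A generic one-step majority-logic decoder can be sketched as follows. This is a toy Massey/Rudolph-style illustration that assumes each bit comes with a set of parity checks orthogonal on it; it is not the specific geometric- or subspace-design construction studied in the paper:

    import numpy as np

    def majority_logic_decode(r, checks_per_bit):
        """One-step majority-logic decoding sketch.
        r: received binary vector (numpy uint8 array).
        checks_per_bit[i]: parity checks orthogonal on bit i, each a
            tuple of positions (including i) whose codeword symbols
            sum to 0 mod 2. Bit i is flipped when a strict majority
            of its checks is violated by the received word."""
        est = r.copy()
        for i, checks in enumerate(checks_per_bit):
            violated = sum(int(r[list(c)].sum() % 2) for c in checks)
            if 2 * violated > len(checks):
                est[i] ^= 1
        return est

    # Toy usage: length-3 repetition code, two checks orthogonal on each bit.
    r = np.array([1, 0, 0], dtype=np.uint8)
    checks = [[(0, 1), (0, 2)], [(1, 0), (1, 2)], [(2, 0), (2, 1)]]
    print(majority_logic_decode(r, checks))   # -> [0 0 0]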
I give a quick overview of some of the theoretical background necessary for using modern non-equilibrium statistical physics to investigate the thermodynamics of computation. I first present some of the necessary concepts from information theory, and then introduce some of the most important types of computational machine considered in computer science theory. After this I present a central result from modern non-equilibrium statistical physics: an exact expression for the entropy flow out of a system undergoing a given dynamics with a given initial distribution over states. This central expression is crucial for analyzing how the total entropy flow out of a computer depends on its global structure, since that global structure determines the initial distributions of all of the computer's subsystems, and therefore (via the central expression) the entropy flows generated by each of those subsystems. I illustrate these results by analyzing some of the subtleties concerning the benefits that are sometimes claimed for implementing an irreversible computation with a reversible circuit constructed out of Fredkin gates.
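The central expression itself is not reproduced in the abstract; as background, the standard second-law decomposition from stochastic thermodynamics that such expressions refine reads (my notation, not necessarily the author's):
\[
\mathcal{Q} \;=\; \sigma \;-\; \bigl[ S(p_1) - S(p_0) \bigr], \qquad \sigma \ge 0 ,
\]
where $\mathcal{Q}$ is the entropy flow out of the system, $S$ is the Shannon entropy, $p_0$ and $p_1$ are the initial and final distributions over states, and $\sigma$ is the non-negative entropy production. The dependence of $\mathcal{Q}$ on the initial distribution $p_0$ is what ties a computer's global structure to the entropy flows of its subsystems.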
The Glosten-Milgrom model describes a single-asset market where informed traders interact with a market maker in the presence of noise traders. We derive an analogy between this financial model and a Szilard information engine by (i) showing that the optimal work-extraction protocol in the latter coincides with the pricing strategy of the market maker in the former, and (ii) defining a market analogue of the physical temperature from the analysis of the distribution of market orders. Then we show that the expected gain of informed traders is bounded above by the product of this market temperature with the amount of information that informed traders have, in exact analogy with the corresponding formula for the maximal expected amount of work that can be extracted from a cycle of the information engine. This suggests that recent ideas from information thermodynamics may shed light on financial markets, and lead to generalised inequalities, in the spirit of the extended second law of thermodynamics.
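The physics-side inequality invoked here is the standard generalized second law for a cyclic measurement-feedback (Szilard-type) engine, which in a common form reads (generic notation):
\[
\langle W_{\mathrm{ext}} \rangle \;\le\; k_{\mathrm{B}} T \, I ,
\]
where $I$ is the mutual information acquired by the measurement. Per the abstract, the market analogue replaces $k_{\mathrm{B}} T$ by the market temperature and $\langle W_{\mathrm{ext}} \rangle$ by the informed traders' expected gain.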
We investigate majority rule dynamics in a population with two classes of people, each with two opinion states $\pm 1$, and with tunable interactions between people in different classes. In an update, a randomly selected group adopts the majority opinion if all group members belong to the same class; if not, majority rule is applied with probability $\epsilon$. Consensus is achieved in a time that scales logarithmically with population size if $\epsilon \geq \epsilon_c = \frac{1}{9}$. For $\epsilon < \epsilon_c$, the population can get trapped in a polarized state, with one class preferring the $+1$ state and the other preferring $-1$. The time to escape this polarized state and reach consensus scales exponentially with population size.
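A minimal Monte Carlo sketch of the model described above follows; the group size $g = 3$ and equal class sizes are assumptions made for illustration, not fixed by the abstract:

    import random

    def consensus_time(n=100, eps=0.2, g=3, max_steps=10**6, seed=0):
        """Number of group updates until full consensus in the
        two-class majority-rule model, or None if max_steps is
        exceeded (as expected for small eps, where the polarized
        state traps the dynamics)."""
        rng = random.Random(seed)
        cls = [0] * (n // 2) + [1] * (n - n // 2)      # two equal classes
        ops = [rng.choice((-1, 1)) for _ in range(n)]  # random opinions
        for t in range(1, max_steps + 1):
            group = rng.sample(range(n), g)
            mixed = len({cls[i] for i in group}) > 1
            if not mixed or rng.random() < eps:        # apply majority rule
                maj = 1 if sum(ops[i] for i in group) > 0 else -1
                for i in group:
                    ops[i] = maj
            if abs(sum(ops)) == n:                     # full consensus
                return t
        return None

    print(consensus_time(eps=0.2))   # eps > 1/9: fast consensus expected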