Biological sensory systems react to changes in their surroundings. They are characterized by fast response and slow adaptation to varying environmental cues. Insofar as sensory adaptive systems map environmental changes to changes in their internal degrees of freedom, they can be regarded as computational devices that manipulate information. Landauer established that information is ultimately physical, and its manipulation is subject to the entropic and energetic bounds of thermodynamics. Thus the fundamental costs of biological sensory adaptation can be elucidated by tracking how the information the system holds about its environment is altered. These bounds are particularly relevant for small organisms, which, unlike everyday computers, operate at very low energies. In this paper, we establish a general framework for the thermodynamics of information processing in sensing. With it, we quantify how, during sensory adaptation, information about the past is erased while information about the present is gathered. This process produces entropy larger than the amount of old information erased and has an energetic cost bounded by the amount of new information written to memory. We apply these principles to the E. coli chemotaxis pathway during binary ligand concentration changes. In this regime, we quantify the amount of information stored by each methyl group and show that receptors consume energy in the range of the information-theoretic minimum. Our work provides a basis for further inquiries into more complex phenomena, such as gradient sensing and frequency response.
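Schematically, and with notation assumed here rather than taken from the paper ($\Sigma$ for the total entropy production of an adaptation step, $I_{\mathrm{old}}$ and $I_{\mathrm{new}}$ for the erased and newly written information, in nats), the two bounds stated above can be rendered as:

```latex
% Schematic form of the two bounds in the abstract (assumed notation):
% \Sigma : total entropy production during the adaptation step
% I_old  : information about the past environment that is erased
% I_new  : information about the present written to memory (nats)
\begin{align}
  \Sigma &\ge k_B\, I_{\mathrm{old}},  \\  % erasing old information produces entropy
  W      &\ge k_B T\, I_{\mathrm{new}}     % writing new information costs energy
\end{align}
```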
Efficient protein synthesis depends on the availability of charged tRNA molecules. With 61 different codons, shifting the balance among the tRNA abundances can lead to large changes in the protein synthesis rate. Previous theoretical work has asked how these abundances might be optimized, and there is some evidence that regulatory mechanisms bring cells close to this optimum, on average. We formulate the tradeoff between the precision of control and the efficiency of synthesis, asking for the maximum entropy distribution of tRNA abundances consistent with a desired mean rate of protein synthesis. Our analysis, using data from E. coli, indicates that reasonable synthesis rates are consistent only with rather low entropies, so the cell's regulatory mechanisms must encode a large amount of information about the correct tRNA abundances.
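As an illustration of the maximum-entropy construction described above, the following sketch solves for the distribution $p_i \propto e^{\lambda r_i}$ over 61 codon classes whose mean matches a target synthesis rate. The per-codon rates $r_i$ and the target are random placeholders, not the measured E. coli values used in the paper; the point is only that pushing the mean rate up forces the entropy down.

```python
import numpy as np
from scipy.optimize import brentq

# Maximum-entropy distribution over 61 codon classes subject to a mean
# synthesis-rate constraint: p_i ∝ exp(lam * r_i), with the multiplier lam
# chosen so that <r> = sum_i p_i r_i hits the target. The rates r_i below
# are random placeholders, not measured E. coli values.

rng = np.random.default_rng(0)
r = rng.uniform(0.1, 1.0, size=61)   # per-codon rate contribution (hypothetical)
target = 0.8 * r.max()               # desired mean synthesis rate

def mean_rate(lam):
    w = np.exp(lam * (r - r.max()))  # shifted exponent for numerical stability
    p = w / w.sum()
    return p @ r

lam = brentq(lambda l: mean_rate(l) - target, 0.0, 500.0)
p = np.exp(lam * (r - r.max()))
p /= p.sum()
entropy = -(p * np.log(p)).sum()
print(f"lambda = {lam:.2f}; entropy = {entropy:.2f} nats "
      f"(uniform would be {np.log(61):.2f})")
```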
Reaction currents in chemical networks usually increase as their driving affinities increase, but far from equilibrium the opposite can happen. We find that such negative differential response (NDR) occurs in reaction schemes of major biological relevance, namely, substrate inhibition and autocatalysis. We do so by deriving the full counting statistics of two minimal representative models using large deviation methods. We argue that NDR implies the existence of optimal affinities that maximize the robustness against environmental and intrinsic noise at intermediate values of dissipation. An analogous behavior is found in dissipative self-assembly, for which we identify the optimal working conditions set by NDR.
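A toy picture of NDR under substrate inhibition can be drawn from the deterministic Haldane rate law, whose current falls at large substrate levels even as the driving keeps growing. This is a caricature for orientation only; the paper itself works with full counting statistics and large-deviation methods.

```python
import numpy as np

# Toy illustration of negative differential response (NDR) in substrate
# inhibition: the Haldane rate law J(s) = Vmax*s / (Km + s + s^2/Ki)
# decreases at large s even though the driving (substrate level) grows.
# A deterministic caricature, not the paper's counting statistics.

Vmax, Km, Ki = 1.0, 1.0, 2.0
s = np.logspace(-2, 3, 400)            # substrate level, driving-like axis
J = Vmax * s / (Km + s + s**2 / Ki)

dJ = np.gradient(J, np.log(s))         # response to the log-driving
print(f"current peaks at s ≈ {s[np.argmax(J)]:.2f}; "
      f"dJ/dln(s) < 0 over {np.mean(dJ < 0):.0%} of the scanned range")
```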
We investigate the influence of intrinsic noise on the stable states of a one-dimensional dynamical system whose deterministic version shows a saddle-node bifurcation between monostable and bistable behavior. The system is a modified version of the Schlögl model, a chemical reaction system with only one type of molecule. The strength of the intrinsic noise is varied, without changing the deterministic description, by introducing bursts in the autocatalytic production step. We study the transitions between monostable and bistable behavior in this system by evaluating the number of maxima of the stationary probability distribution. We find that changing the burst size can destroy and even induce saddle-node bifurcations. This means that bursty production of molecules can qualitatively change the dynamics of a chemical reaction system even when its deterministic description remains unchanged.
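A minimal Gillespie sketch of this construction: the autocatalytic step fires with propensity divided by the burst size $m$ but creates $m$ molecules at once, so the deterministic rate equation is unchanged while the intrinsic noise grows with $m$. The rate constants are placeholders chosen so the deterministic fixed points sit near $x = 20, 100, 200$; they are not the values used in the paper.

```python
import numpy as np

# Gillespie simulation of a Schlögl-type scheme with bursty autocatalysis.
# Propensity of the autocatalytic step is divided by m while it produces m
# molecules at once, leaving the deterministic description unchanged.

def stationary_hist(m, k1=0.032, k2=1e-4, k3=40.0, k4=2.6,
                    x0=100, xmax=400, steps=200_000, seed=1):
    rng = np.random.default_rng(seed)
    hist, x = np.zeros(xmax), x0
    for _ in range(steps):
        a = np.array([k1 * x * (x - 1) / m,        # A + 2X -> (2+m)X, bursty
                      k2 * x * (x - 1) * (x - 2),  # 3X -> A + 2X
                      k3,                          # B -> X
                      k4 * x])                     # X -> B
        hist[min(x, xmax - 1)] += rng.exponential(1.0 / a.sum())  # time weight
        x += (m, -1, 1, -1)[rng.choice(4, p=a / a.sum())]
    return hist / hist.sum()

for m in (1, 10):
    h = np.convolve(stationary_hist(m), np.ones(15) / 15, mode="same")
    peaks = [i for i in range(1, len(h) - 1)
             if h[i - 1] < h[i] > h[i + 1] and h[i] > 0.25 * h.max()]
    print(f"burst size {m}: local maxima of the stationary density near {peaks}")
```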
Synchronization among arrays of beating cilia is one of the emergent phenomena of biological processes at mesoscopic scales. Strong inter-ciliary couplings modify the natural beating frequencies $\omega$ of individual cilia to produce a collective motion centered on a group frequency $\omega_m$. Here we study the thermodynamic cost of synchronizing cilia arrays by mapping their dynamics onto a generic phase-oscillator model. The model suggests that upon synchronization the mean heat dissipation rate decomposes into two contributions: dissipation from each cilium's own natural driving force and dissipation arising from the interaction with other cilia, the latter of which can be interpreted, in the framework of our model, as the dissipation produced by a potential with a time-dependent protocol. The spontaneous phase synchronization of beating dynamics induced by strong inter-ciliary coupling is always accompanied by a significant reduction of dissipation for the cilia population, suggesting that the organism as a whole expends less energy by attaining temporal order. At the level of individual cilia, however, those with $|\omega| < \omega_m$ expend more energy upon synchronization.
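The decomposition can be sketched with a generic noisy Kuramoto-type phase-oscillator model: the heat flux per oscillator is estimated as force times phase displacement and split into a natural-driving part and an interaction part. All parameters below are illustrative assumptions; the mapping to real cilia dynamics is the paper's, not this sketch's.

```python
import numpy as np

# Noisy Kuramoto-type phase oscillators; heat flux per oscillator is split
# into a "natural driving" part (omega * dphi) and an "interaction" part
# (F_int * dphi), compared between uncoupled and strongly coupled arrays.

rng = np.random.default_rng(0)
N, T_noise, dt, steps = 64, 0.1, 1e-3, 20_000
omega = rng.normal(1.0, 0.3, N)              # natural frequencies (illustrative)

for K in (0.0, 2.0):                         # uncoupled vs strongly coupled
    phi = rng.uniform(0, 2 * np.pi, N)
    q_drive = q_int = 0.0
    for _ in range(steps):
        F_int = (K / N) * np.sin(phi[None, :] - phi[:, None]).sum(axis=1)
        dphi = (omega + F_int) * dt + np.sqrt(2 * T_noise * dt) * rng.normal(size=N)
        q_drive += (omega * dphi).sum()      # dissipation from natural driving
        q_int += (F_int * dphi).sum()        # dissipation from interactions
        phi += dphi
    t_tot = steps * dt
    print(f"K = {K}: drive {q_drive / (N * t_tot):+.3f}, "
          f"interaction {q_int / (N * t_tot):+.3f} per cilium per unit time")
```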
A central result that arose in applying information theory to the stochastic thermodynamics of nonlinear dynamical systems is the Information-Processing Second Law (IPSL): the physical entropy of the universe can decrease if compensated by the Shannon-Kolmogorov-Sinai entropy change of appropriate information-carrying degrees of freedom. In particular, the asymptotic-rate IPSL precisely delineates the thermodynamic functioning of autonomous Maxwellian demons and information engines. How do these systems begin to function as engines, Landauer erasers, and error correctors? Here, we identify a minimal, inescapable transient dissipation engendered by physical information processing, not captured by asymptotic rates but critical to adaptive thermodynamic processes such as those found in biological systems. As a component of the transient dissipation, we also identify an implementation-dependent cost that varies from one physical substrate to another for the same information-processing task. Applying these results to producing structured patterns from a structureless information reservoir, we show that retrodictive generators achieve the minimal costs. These results establish the thermodynamic toll imposed by a physical system's structure as it comes to optimally transduce information.
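For orientation, the asymptotic-rate IPSL referenced above is commonly written as a bound on the extractable work per symbol, with $h_\mu$ and $h'_\mu$ the Shannon entropy rates of the input and output symbol sequences. This is a schematic rendering; sign conventions vary across papers.

```latex
% Asymptotic-rate IPSL, schematic form: extractable work per symbol is
% bounded by the growth in Shannon entropy rate from input to output.
\begin{equation}
  \langle W \rangle \;\le\; k_B T \ln 2 \,\bigl( h'_\mu - h_\mu \bigr)
\end{equation}
% The transient contribution identified in the paper adds a startup cost
% on top of this rate-level bound that asymptotic rates do not capture.
```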