The ribosome is one of the largest and most complex macromolecular machines in living cells. It polymerizes a protein step by step, as directed by the corresponding nucleotide sequence on the template messenger RNA (mRNA); this process is referred to as `translation' of the genetic message encoded in the sequence of the mRNA transcript. In each successful chemo-mechanical cycle during the (protein) elongation stage, the ribosome extends the protein by a single subunit, an amino acid, and steps forward on the template mRNA by three nucleotides, called a codon. A ribosome can therefore also be regarded as a molecular motor: the mRNA serves as its track, its step size is one codon, and the two molecules of GTP and one molecule of ATP hydrolyzed in each cycle serve as its fuel. What adds further complexity is the existence of competing pathways leading to distinct cycles, branched pathways within each cycle, and futile consumption of fuel that leads neither to elongation of the nascent protein nor to forward stepping of the ribosome on its track. We investigate a model formulated in terms of the network of discrete chemo-mechanical states of a ribosome during the elongation stage of translation. The model is analyzed using a combination of stochastic thermodynamics and a kinetic analysis based on a graph-theoretic approach. We derive the exact solution of the corresponding master equations. We represent the steady state in terms of the cycles of the underlying network and discuss the energy transduction processes. We identify the various possible modes of operation of a ribosome in terms of its average velocity and mean rate of GTP hydrolysis. We also compute the entropy production as a function of the rates of the interstate transitions, as well as the thermodynamic cost of the accuracy of the translation process.
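The steady-state analysis described above can be illustrated on a toy network. The following sketch (a hypothetical uniform 3-state cycle, not the paper's actual chemo-mechanical network) solves the master equation for the stationary distribution and extracts the net cycle flux, which plays the role of the mean stepping rate:

```python
import numpy as np

# A minimal sketch (not the paper's network): a 3-state chemo-mechanical
# cycle with forward rate k_f and backward rate k_b between neighbors.
k_f, k_b = 5.0, 1.0
n = 3

# Build the generator matrix L of the master equation dp/dt = L p.
L = np.zeros((n, n))
for i in range(n):
    j = (i + 1) % n
    L[j, i] += k_f   # forward transition i -> j
    L[i, j] += k_b   # backward transition j -> i
L -= np.diag(L.sum(axis=0))  # diagonal: minus total exit rate of each state

# Steady state: solve L p = 0 together with the normalization sum(p) = 1.
A = np.vstack([L, np.ones(n)])
b = np.zeros(n + 1); b[-1] = 1.0
p, *_ = np.linalg.lstsq(A, b, rcond=None)

# Net cycle flux J = k_f p_i - k_b p_j; the mean velocity is J times the
# step size (one codon per completed cycle in the ribosome picture).
J = k_f * p[0] - k_b * p[1]
print(p, J)
```

For uniform rates the stationary distribution is uniform, p_i = 1/3, and the flux is J = (k_f - k_b)/3.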
Assuming time-scale separation (TSS), a simple and unified theory of thermodynamics and stochastic thermodynamics is constructed for small classical systems strongly interacting with their environments in a controllable fashion. The total Hamiltonian is decomposed into a bath part and a system part, the latter being the Hamiltonian of mean force. Both the conditional equilibrium of the bath and the reduced equilibrium of the system are described by canonical ensemble theories with respect to their own Hamiltonians. The bath free energy is independent of the system variables and the control parameter. Furthermore, the weak-coupling theory of stochastic thermodynamics becomes applicable almost verbatim, even if the interaction and correlation between the system and its environment are strong and varied externally. Finally, this TSS-based approach also leads to new insights into the origin of the second law of thermodynamics.
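The decomposition described above is conventionally expressed through the Hamiltonian of mean force; a standard form (the notation below is assumed for illustration, not taken from the abstract) is

```latex
% Hamiltonian of mean force H^*: the system part of the decomposition,
% obtained by integrating out the bath variables y at inverse temperature beta.
H^{*}(x;\lambda) \;=\; H_{S}(x;\lambda)
\;-\;\frac{1}{\beta}\,\ln\!\left\langle e^{-\beta H_{\mathrm{int}}(x,y;\lambda)}\right\rangle_{B},
\qquad
\left\langle \,\cdot\, \right\rangle_{B}
\;=\; \frac{\int \mathrm{d}y \,(\,\cdot\,)\, e^{-\beta H_{B}(y)}}{\int \mathrm{d}y \, e^{-\beta H_{B}(y)}},
```

so that the reduced equilibrium of the system is canonical, \(\rho(x) \propto e^{-\beta H^{*}(x;\lambda)}\), with respect to \(H^{*}\) rather than the bare system Hamiltonian \(H_{S}\).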
We experimentally study a piezoelectric energy harvester driven by broadband random vibrations. We show that a linear model, consisting of an underdamped Langevin equation for the dynamics of the tip mass, electromechanically coupled with a capacitor and a load resistor, can accurately describe the experimental data. In particular, the theoretical model allows us to define fluctuating currents and to study the stochastic thermodynamics of the system, with a focus on the distribution of the extracted work over different time intervals. Our analytical and numerical analysis of the linear model is successfully compared to the experiments.
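A linear model of this type can be sketched numerically. The Euler integration below uses illustrative parameter values (not the experimental ones): an underdamped Langevin equation for the tip-mass coordinate, coupled through a coefficient theta to a capacitor C discharging into a load resistor R, with the extracted work identified as the energy dissipated in the load:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumed, not the experimental values): mass,
# stiffness, damping, electromechanical coupling theta, capacitance,
# load resistance, and noise strength D of the broadband driving.
m, k, gamma, theta, C, R, D = 1.0, 1.0, 0.5, 0.2, 1.0, 2.0, 1.0
dt, steps = 1e-3, 200_000

x = v = V = 0.0
work = 0.0  # energy dissipated in the load resistor (extracted work)
for _ in range(steps):
    xi = np.sqrt(2 * D / dt) * rng.standard_normal()   # white-noise force
    a = (-k * x - gamma * v - theta * V + xi) / m      # underdamped Langevin eq.
    x += v * dt
    v += a * dt
    V += (theta * v - V / R) / C * dt                  # circuit equation
    work += V**2 / R * dt                              # instantaneous power V^2/R

print(work / (steps * dt))  # mean extracted power
```

Recording `V**2 / R * dt` over windows of different lengths, rather than accumulating it globally, would give the work distribution over finite time intervals studied in the paper.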
One of the major resource requirements of computers - ranging from biological cells to human brains to high-performance (engineered) computers - is the energy used to run them. Those costs of performing a computation have long been a focus of research in physics, going back to the early work of Landauer. One of the most prominent aspects of computers is that they are inherently nonequilibrium systems. However, the early research was done when nonequilibrium statistical physics was in its infancy, which meant the work was formulated in terms of equilibrium statistical physics. Since then there have been major breakthroughs in nonequilibrium statistical physics, which are allowing us to investigate the myriad aspects of the relationship between statistical physics and computation, extending well beyond the issue of how much work is required to erase a bit. In this paper I review some of this recent work on the `stochastic thermodynamics of computation'. After reviewing the salient parts of information theory, computer science theory, and stochastic thermodynamics, I summarize what has been learned about the entropic costs of performing a broad range of computations, extending from bit erasure to loop-free circuits to logically reversible circuits to information ratchets to Turing machines. These results reveal new, challenging engineering problems for how to design computers to have minimal thermodynamic costs. They also allow us to start to combine computer science theory and stochastic thermodynamics at a foundational level, thereby expanding both.
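The reference point for all of these entropic costs is Landauer's bound: erasing one bit at temperature T dissipates at least k_B T ln 2 of heat. A quick sketch of the number involved:

```python
import math

# Landauer's bound: erasing one bit at temperature T dissipates at least
# k_B * T * ln(2) of heat into the environment.
k_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)
T = 300.0           # room temperature, K

q_min = k_B * T * math.log(2)
print(f"{q_min:.3e} J per bit")  # about 2.9e-21 J
```

This sets the scale against which the costs of circuits, ratchets, and Turing machines discussed in the review are measured.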
Many cell functions are accomplished thanks to intracellular transport mechanisms of macromolecules along filaments. Molecular motors such as dynein or kinesin are proteins playing a primary role in these processes. The behavior of such proteins is quite well understood when only one of them is moving a cargo particle. Indeed, numerous in vitro experiments have been performed to derive accurate models for a single molecular motor. However, in vivo macromolecules are often carried by multiple motors. The main focus of this paper is to provide an analysis of the behavior of multiple molecular motors interacting together, in order to improve the understanding of their actual physiological behavior. Previous studies provide analyses based on results obtained from Monte Carlo simulations. In contrast to these studies, we derive an equipollent probabilistic model to describe the dynamics of multiple proteins coupled together and provide an exact theoretical analysis. We obtain the probability density function of the motor protein configurations, thus enabling a deeper understanding of their behavior.
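The flavor of such an exact analysis can be conveyed by a much simpler model than the paper's (the model and rates below are assumed for illustration): N identical motors attach to and detach from the filament independently, so the number of engaged motors is a birth-death chain whose stationary distribution follows exactly from detailed balance:

```python
import math

# A minimal sketch (not the paper's model): N motors, each attaching at
# rate pi_a and detaching at rate eps, independently of the others. The
# number n of engaged motors is a birth-death Markov chain.
N, pi_a, eps = 4, 2.0, 1.0

# Detailed balance: P(n+1)/P(n) = birth(n)/death(n+1)
#                               = (N - n) * pi_a / ((n + 1) * eps)
P = [1.0]
for n in range(N):
    P.append(P[-1] * (N - n) * pi_a / ((n + 1) * eps))
Z = sum(P)
P = [p / Z for p in P]  # exact stationary distribution of n = 0..N

# For independent motors this is the binomial law with p = pi_a/(pi_a+eps).
p = pi_a / (pi_a + eps)
check = [math.comb(N, n) * p**n * (1 - p)**(N - n) for n in range(N + 1)]
print(P)
```

Coupling the motors through a shared cargo destroys this independence, which is exactly where an exact probabilistic treatment, rather than Monte Carlo sampling, becomes valuable.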
Stochastic resetting, a diffusive process whose amplitude is reset to the origin at random times, is a vividly studied strategy to optimize encounter dynamics, e.g., in chemical reactions. We here generalize the resetting step by introducing a random resetting amplitude, such that the diffusing particle may be only partially reset towards the trajectory origin, or may even overshoot the origin in a resetting step. We introduce different scenarios for the random-amplitude stochastic resetting process and discuss the resulting dynamics. Direct applications include geophysical layering (stratigraphy), population dynamics, financial markets, and generic search processes.
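One such scenario can be sketched in a few lines (the rescaling rule and parameter values below are assumed for illustration, not taken from the paper): free diffusion interrupted at rate r by resetting events x -> a*x with a random amplitude a, where a = 0 recovers complete resetting and a < 0 overshoots the origin:

```python
import numpy as np

rng = np.random.default_rng(1)

# Random-amplitude resetting, sketched: diffusion with diffusivity D,
# interrupted at rate r by events x -> a*x with a ~ Uniform[-0.5, 1).
# All parameter values here are illustrative.
D, r, dt, steps, walkers = 1.0, 0.5, 1e-2, 5_000, 2_000

x = np.zeros(walkers)
for _ in range(steps):
    x += np.sqrt(2 * D * dt) * rng.standard_normal(walkers)  # diffusion step
    reset = rng.random(walkers) < r * dt                      # resetting events
    a = rng.uniform(-0.5, 1.0, walkers)                       # random amplitudes
    x[reset] *= a[reset]

# Even partial resetting confines the process: the variance saturates at a
# finite value instead of growing linearly in time as for free diffusion.
print(x.var())
```

Varying the amplitude distribution (e.g., allowing amplitudes larger than one, or heavy tails) is what distinguishes the different scenarios and their resulting stationary or non-stationary dynamics.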