
Transients generate memory and break hyperbolicity in stochastic enzymatic networks

Posted by: Arti Dua
Publication date: 2020
Research field: Biology, Physics
Paper language: English





The hyperbolic dependence of catalytic rate on substrate concentration is a classical result in enzyme kinetics, quantified by the celebrated Michaelis-Menten equation. The ubiquity of this relation in diverse chemical and biological contexts has recently been rationalized by a graph-theoretic analysis of deterministic reaction networks. Experiments, however, have revealed that molecular noise - intrinsic stochasticity at the molecular scale - leads to significant deviations from classical results and to unexpected effects like molecular memory, i.e., the breakdown of statistical independence between turnover events. Here we show, through a new method of analysis, that memory and non-hyperbolicity have a common source in an initial, and observably long, transient peculiar to stochastic reaction networks of multiple enzymes. Networks of single enzymes do not admit such transients. The transient yields, asymptotically, to a steady state in which memory vanishes and hyperbolicity is recovered. We propose new statistical measures, defined in terms of turnover times, to distinguish between the transient and steady states and apply these to experimental data from a landmark experiment that first observed molecular memory in a single enzyme with multiple binding sites. Our study shows that catalysis at the molecular level with more than one enzyme always contains a non-classical regime and provides insight into how the classical limit is attained.
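For reference, the hyperbolic relation discussed above is the Michaelis-Menten rate v = k_cat [E]_0 [S] / (K_M + [S]), with K_M = (k_-1 + k_cat) / k_1. The sketch below is illustrative only and is not the authors' analysis: it uses a Gillespie-type simulation of a single Michaelis-Menten enzyme, with assumed pseudo-first-order rate constants (k1S, km1, k2), to generate turnover times. For a single enzyme the turnover events form a renewal process, so successive turnover times are uncorrelated and the mean rate is hyperbolic in substrate concentration, consistent with the statement that networks of single enzymes do not admit the memory-generating transient.

import numpy as np

rng = np.random.default_rng(0)

# Assumed, illustrative pseudo-first-order rate constants; substrate in excess.
k1S = 2.0   # binding   E + S -> ES
km1 = 1.0   # unbinding ES -> E + S
k2  = 0.5   # catalysis ES -> E + P

def turnover_times(n_turnovers):
    # Gillespie-type simulation of one Michaelis-Menten enzyme; returns the
    # waiting times between successive product-formation (turnover) events.
    waits, t, t_last, state = [], 0.0, 0.0, "E"
    while len(waits) < n_turnovers:
        if state == "E":                        # free enzyme: wait for binding
            t += rng.exponential(1.0 / k1S)
            state = "ES"
        else:                                   # bound enzyme: unbind or catalyse
            t += rng.exponential(1.0 / (km1 + k2))
            if rng.random() < k2 / (km1 + k2):  # catalysis: one turnover completed
                waits.append(t - t_last)
                t_last = t
            state = "E"
    return np.array(waits)

tau = turnover_times(50_000)
print("mean turnover time (simulated):      ", tau.mean())
print("mean turnover time (Michaelis-Menten):", (k1S + km1 + k2) / (k1S * k2))
print("lag-1 correlation of turnover times: ", np.corrcoef(tau[:-1], tau[1:])[0, 1])
# For a single enzyme the lag-1 correlation is ~0 (no molecular memory) and the
# mean rate follows the hyperbolic form; the paper's point is that networks with
# more than one enzyme show a transient regime in which both properties fail.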




Read also

We report the first study of a network of connected enzyme-catalyzed reactions, with added chemical and enzymatic processes that incorporate recently developed biochemical filtering steps into the functioning of this biocatalytic cascade. New theoretical expressions are derived to allow simple, few-parameter modeling of network components concatenated in such cascades, both with and without filtering. The derived expressions are tested against experimental data obtained for the realized network's responses, measured optically, to variations in its input chemical concentrations, with and without filtering processes. We also describe how the present modeling approach captures and explains several observations and features identified in earlier studies of enzymatic processes when they were considered as potential network components for multi-step information/signal-processing systems.
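As a rough illustration of how such a cascade can be composed from few-parameter components (assumed functional forms only, not the derived expressions of the paper above): a hyperbolic Michaelis-Menten-type step feeds an optional sigmoidal Hill-type filtering step and then a second enzymatic step, so that small, noise-like intermediate signals are suppressed when the filter is present.

def mm_step(s, vmax=1.0, km=0.5):
    # Hyperbolic (Michaelis-Menten-like) response of one enzymatic step
    return vmax * s / (km + s)

def hill_filter(x, k=0.3, n=4):
    # Sigmoidal filtering step: suppresses small inputs, passes large ones
    return x**n / (k**n + x**n)

def cascade_response(s, with_filter=True):
    y = mm_step(s)              # first enzymatic step
    if with_filter:
        y = hill_filter(y)      # optional biochemical filtering step
    return mm_step(y)           # second enzymatic step

for s in (0.05, 0.2, 1.0, 5.0):
    print(s, round(cascade_response(s, with_filter=False), 3),
          round(cascade_response(s, with_filter=True), 3))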
During the last decade, network approaches have become a powerful tool for describing protein structure and dynamics. Here, we first describe the protein structure networks of molecular chaperones, then characterize chaperone-containing sub-networks of interactomes, called chaperone networks or chaperomes. We review the role of molecular chaperones in the short-term adaptation of cellular networks in response to stress, and in long-term adaptation, discussing their putative functions in the regulation of evolvability. We provide a general overview of possible network mechanisms of adaptation, learning and memory formation. We propose that changes in network rigidity play a key role in learning and memory-formation processes. A flexible network topology provides a learning-competent state. Such networks may have much less pronounced modular boundaries than locally rigid, highly modular networks, in which the learnt information has already been consolidated in a memory-formation process. Since modular boundaries are efficient filters of information, information filtering may be much weaker in the learning-competent state than after memory formation. This mechanism restricts high information transfer to the learning-competent state. After memory formation, modular-boundary-induced segregation and information filtering protect the stored information. The flexible networks of young organisms are generally in a learning-competent state. In contrast, the locally rigid networks of old organisms have lost their learning-competent state, but store and protect their learnt information efficiently. We anticipate that this mechanism may operate at the level of both protein-protein interaction and neuronal networks.
We introduce a minimal model for the evolution of functional protein-interaction networks using a sequence-based mutational algorithm, and apply the model to study neutral drift in networks that yield oscillatory dynamics. Starting with a functional core module, random evolutionary drift increases network complexity even in the absence of specific selective pressures. Surprisingly, we uncover a hidden order in sequence space that gives rise to long-term evolutionary memory, implying strong constraints on network evolution due to the topology of accessible sequence space.
Information transmission in biological signaling circuits has often been described using the metaphor of a noise filter. Cellular systems need accurate, real-time data about their environmental conditions, but the biochemical reaction networks that propagate, amplify, and process signals work with noisy representations of that data. Biology must implement strategies that not only filter the noise, but also predict the current state of the environment based on information delayed due to the finite speed of chemical signaling. The idea of a biochemical noise filter is actually more than just a metaphor: we describe recent work that has made an explicit mathematical connection between signaling fidelity in cellular circuits and the classic theories of optimal noise filtering and prediction that began with Wiener, Kolmogorov, Shannon, and Bode. This theoretical framework provides a versatile tool, allowing us to derive analytical bounds on the maximum mutual information between the environmental signal and the real-time estimate constructed by the system. It helps us understand how the structure of a biological network, and the response times of its components, influences the accuracy of that estimate. The theory also provides insights into how evolution may have tuned enzyme kinetic parameters and populations to optimize information transfer.
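A minimal numerical illustration of the noise-filter picture (assumed signal model and parameter values, not the authors' framework): a slowly varying environmental signal, modeled as an Ornstein-Uhlenbeck process, is observed with added noise and smoothed by an exponential low-pass filter. Too little smoothing passes the noise, too much smoothing lags the signal, so an intermediate filter time constant minimizes the error - the trade-off at the heart of Wiener-Kolmogorov filtering.

import numpy as np

rng = np.random.default_rng(1)

dt, n = 0.01, 20000
tau_signal, sigma_signal = 1.0, 1.0   # correlation time and std of the environment
sigma_noise = 0.5                     # std of the added observation noise

# Ornstein-Uhlenbeck environmental signal
signal = np.zeros(n)
for i in range(1, n):
    signal[i] = (signal[i - 1]
                 - (signal[i - 1] / tau_signal) * dt
                 + sigma_signal * np.sqrt(2 * dt / tau_signal) * rng.normal())

observed = signal + sigma_noise * rng.normal(size=n)

def low_pass(x, tau_filter):
    # Exponential smoothing: dy/dt = (x - y) / tau_filter
    y = np.zeros_like(x)
    for i in range(1, len(x)):
        y[i] = y[i - 1] + (x[i] - y[i - 1]) * dt / tau_filter
    return y

for tau_f in (0.01, 0.1, 1.0):
    mse = np.mean((low_pass(observed, tau_f) - signal) ** 2)
    print(f"filter time constant {tau_f}: mean squared error {mse:.3f}")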
Are turn-on and turn-off functions in protein-protein interaction networks exact opposites of each other? To answer this question, we implement a minimal model for the evolution of functional protein-interaction networks using a sequence-based mutational algorithm, and apply the model to study neutral drift in networks that yield oscillatory dynamics. We study the roles of activators and deactivators, two core components of oscillatory protein interaction networks, and find a striking asymmetry in the roles of activating and deactivating proteins, where activating proteins tend to be synergistic and deactivating proteins tend to be competitive.