
Approximation and inference methods for stochastic biochemical kinetics - a tutorial review

Added by David Schnoerr
Publication date: 2016
Fields: Biology, Physics
Language: English





Stochastic fluctuations of molecule numbers are ubiquitous in biological systems. Important examples include gene expression and enzymatic processes in living cells. Such systems are typically modelled as chemical reaction networks whose dynamics are governed by the Chemical Master Equation. Despite its simple structure, no analytic solutions to the Chemical Master Equation are known for most systems. Moreover, stochastic simulations are computationally expensive, making systematic analysis and statistical inference a challenging task. Consequently, significant effort has been spent in recent decades on the development of efficient approximation and inference methods. This article gives an introduction to basic modelling concepts as well as an overview of state-of-the-art methods. First, we motivate and introduce deterministic and stochastic methods for modelling chemical networks, and give an overview of simulation and exact solution methods. Next, we discuss several approximation methods, including the chemical Langevin equation, the system size expansion, moment closure approximations, time-scale separation approximations and hybrid methods. We discuss their various properties and review recent advances and remaining challenges for these methods. We present a comparison of several of these methods by means of a numerical case study and highlight some of their respective advantages and disadvantages. Finally, we discuss the problem of inference from experimental data in the Bayesian framework and review recent methods developed in the literature. In summary, this review gives a self-contained introduction to modelling, approximations and inference methods for stochastic chemical kinetics.
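Although the Chemical Master Equation rarely admits closed-form solutions, its trajectories can be sampled exactly with Gillespie's stochastic simulation algorithm, one of the simulation methods the review surveys. As a minimal sketch (not the review's own case study), here is the direct method applied to a birth-death model of gene expression; the rate constants `k` and `gamma` are illustrative choices:

```python
import random

def gillespie_birth_death(k=10.0, gamma=1.0, n0=0, t_end=50.0, seed=1):
    """Gillespie direct method for the birth-death process
    0 --k--> X  (constant production),
    X --gamma--> 0  (per-molecule degradation)."""
    rng = random.Random(seed)
    t, n = 0.0, n0
    times, counts = [t], [n]
    while t < t_end:
        a_birth = k            # propensity of production
        a_death = gamma * n    # propensity of degradation
        a_total = a_birth + a_death
        # waiting time to the next reaction is exponential with rate a_total
        t += rng.expovariate(a_total)
        # pick which reaction fires, proportionally to its propensity
        if rng.random() * a_total < a_birth:
            n += 1
        else:
            n -= 1
        times.append(t)
        counts.append(n)
    return times, counts

times, counts = gillespie_birth_death()
```

For this simple network the stationary distribution is known to be Poisson with mean k/gamma, which gives a quick sanity check on the sampled trajectory.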



Related research

Biochemical reaction networks frequently consist of species evolving on multiple timescales. Stochastic simulations of such networks are often computationally challenging, and various methods have therefore been developed to obtain sensible stochastic approximations on the timescale of interest. A rigorous and popular approach is the multiscale approximation method for continuous-time Markov processes. In this approach, by scaling species abundances and reaction rates, a family of processes parameterized by a scaling parameter is defined. The limiting process of this family is then used to approximate the original process. However, we find that such approximations become inaccurate when combinations of species with disparate abundances either constitute conservation laws or form virtual slow auxiliary species. To obtain a more accurate approximation in such cases, we here propose an appropriate modification of the original method.
Stochasticity is an indispensable aspect of biochemical processes at the cellular level. Studies of how noise enters and propagates in biochemical systems have provided nontrivial insights into the origins of stochasticity; taken together, however, they constitute a patchwork of different theoretical analyses. Here we present a flexible and generally applicable noise decomposition tool that allows us to calculate the contributions of individual reactions to the total variability of a system's output. With the package it is therefore possible to quantify how noise enters and propagates in biochemical systems. Using the JAK-STAT signalling pathway, we also demonstrate that noise contributions resulting from individual reactions can be inferred directly from experimental data. This is the first computational tool that allows noise to be decomposed into contributions from individual reactions.
By developing and leveraging an explicit molecular realisation of a measurement-and-feedback-powered Szilard engine, we investigate the extraction of work from complex environments by minimal machines with finite capacity for memory and decision-making. Living systems perform inference to exploit complex structure, or correlations, in their environment, but the physical limits and underlying cost/benefit trade-offs involved in doing so remain unclear. To probe these questions, we consider a minimal model for a structured environment - a correlated sequence of molecules - and explore mechanisms based on extended Szilard engines for extracting the work stored in these non-equilibrium correlations. We consider systems limited to a single bit of memory making binary choices at each step. We demonstrate that increasingly complex environments allow increasingly sophisticated inference strategies to extract more energy than simpler alternatives, and argue that optimal design of such machines should also consider the energy reserves required to ensure robustness against fluctuations due to mistakes.
We study the stochastic kinetics of a signaling module consisting of a two-state stochastic point process with negative feedback. In the active state, a product is synthesized which increases the active-to-inactive transition rate of the process. We analyze this simple autoregulatory module using a path-integral technique based on the temporal statistics of state flips of the process. We develop a systematic framework to calculate averages, autocorrelations, and response functions by treating the feedback as a weak perturbation. Explicit analytical results are obtained to first order in the feedback strength. Monte Carlo simulations are performed to test the analytical results in the weak feedback limit and to investigate the strong feedback regime. We conclude by relating some of our results to experimental observations in the olfactory and visual sensory systems.
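The Monte Carlo simulations mentioned in this abstract can be sketched with a Gillespie simulation of a hypothetical two-state module: while active, a product P is synthesized at rate `beta`, and P raises the active-to-inactive switching rate by `eps` per molecule (the negative feedback). All parameter names and values below are illustrative assumptions, not taken from the paper:

```python
import random

def simulate_feedback_module(k_on=1.0, k_off=1.0, beta=5.0, delta=1.0,
                             eps=0.0, t_end=500.0, seed=2):
    """Gillespie simulation of a two-state (inactive/active) process.
    While active, product P is made at rate beta; each P molecule
    degrades at rate delta and adds eps to the active->inactive rate
    (negative feedback). Returns the time-averaged product level."""
    rng = random.Random(seed)
    t, active, p = 0.0, 0, 0
    p_time = 0.0  # time integral of P, for the time average
    while t < t_end:
        rates = [
            (k_on if not active else 0.0, 'on'),
            ((k_off + eps * p) if active else 0.0, 'off'),
            (beta if active else 0.0, 'make'),
            (delta * p, 'degrade'),
        ]
        total = sum(r for r, _ in rates)
        dt = rng.expovariate(total)
        if t + dt > t_end:
            p_time += p * (t_end - t)
            break
        p_time += p * dt
        t += dt
        u = rng.random() * total
        for r, name in rates:   # pick the reaction the variate falls in
            if u < r:
                break
            u -= r
        if name == 'on':
            active = 1
        elif name == 'off':
            active = 0
        elif name == 'make':
            p += 1
        else:
            p -= 1
    return p_time / t_end

mean_no_fb = simulate_feedback_module(eps=0.0)   # no feedback: mean P near 0.5*beta/delta
mean_fb = simulate_feedback_module(eps=0.5)      # feedback shortens active periods
```

Comparing the two runs shows the qualitative effect the paper analyzes perturbatively: negative feedback lowers the average product level relative to the feedback-free module.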
We develop a theoretical approach that uses physicochemical kinetics modelling to describe cell population dynamics upon progression of viral infection in cell culture, which results in cell apoptosis (programmed cell death) and necrosis (direct cell death). Several model parameters necessary for computer simulation were determined by reviewing and analyzing available published experimental data. By comparing experimental data to computer modelling results, we identify the parameters that are the most sensitive to the measured system properties and allow for the best data fitting. Our model allows extraction of parameters from experimental data and also has predictive power. Using the model we describe interesting time-dependent quantities that were not directly measured in the experiment, and identify correlations among the fitted parameter values. Numerical simulation of viral infection progression is done by a rate-equation approach resulting in a system of stiff equations, which are solved by using a novel variant of the stochastic ensemble modelling approach. The latter was originally developed for coupled chemical reactions.
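The stiffness mentioned above arises when fast and slow reactions coexist: an explicit solver must resolve the fastest timescale, while an implicit one need not. As a generic illustration (assuming a linear two-species chain A -> B -> 0 with k_fast >> k_slow, not the paper's viral-infection model), a backward Euler step solves (I - dt*A) y_{n+1} = y_n and remains stable at step sizes where explicit Euler would blow up:

```python
def backward_euler_2x2(A, y0, dt, n_steps):
    """Implicit (backward) Euler for the linear system y' = A y:
    each step solves (I - dt*A) y_new = y_old via an explicit 2x2
    inverse. Unconditionally stable, so dt need not resolve the
    fast timescale."""
    (a11, a12), (a21, a22) = A
    # M = I - dt*A, inverted once since A is constant
    m11, m12 = 1.0 - dt * a11, -dt * a12
    m21, m22 = -dt * a21, 1.0 - dt * a22
    det = m11 * m22 - m12 * m21
    y1, y2 = y0
    traj = [(y1, y2)]
    for _ in range(n_steps):
        y1, y2 = ((m22 * y1 - m12 * y2) / det,
                  (-m21 * y1 + m11 * y2) / det)
        traj.append((y1, y2))
    return traj

# stiff chain A --(k_fast)--> B --(k_slow)--> 0 with k_fast >> k_slow
k_fast, k_slow = 1000.0, 1.0
A = ((-k_fast, 0.0), (k_fast, -k_slow))
traj = backward_euler_2x2(A, (1.0, 0.0), dt=0.1, n_steps=50)
```

With dt = 0.1 the fast species collapses in a single stable step, whereas explicit Euler would need dt below about 2/k_fast = 0.002 to avoid divergence; that gap is what makes stiff rate equations expensive for naive solvers.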