
Network reconstruction based on quasi-steady state data

Added by Eduardo D. Sontag
Publication date: 2007
Fields: Biology
Language: English

This note discusses a theoretical issue regarding the application of the Modular Response Analysis method to quasi-steady state (rather than steady-state) data.
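In Modular Response Analysis (MRA), local interaction strengths are recovered from the global steady-state response matrix R: with the local response matrix r normalized so that diag(r) = -1, one obtains r_ij = -(R⁻¹)_ij / (R⁻¹)_ii. A minimal pure-Python sketch of this inversion step (the 3-node network and perturbation strengths below are invented for illustration; the note above concerns what changes when R is measured at quasi-steady state rather than steady state):

```python
# Minimal Modular Response Analysis (MRA) sketch: recover the local
# response matrix r (normalized so diag(r) = -1) from the global
# steady-state response matrix R via r_ij = -(R^-1)_ij / (R^-1)_ii.

def mat_inv(M):
    """Gauss-Jordan inverse of a small square matrix (lists of lists)."""
    n = len(M)
    A = [row[:] + [1.0 if i == j else 0.0 for j in range(n)]
         for i, row in enumerate(M)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        p = A[col][col]
        A[col] = [x / p for x in A[col]]
        for r in range(n):
            if r != col and A[r][col] != 0.0:
                f = A[r][col]
                A[r] = [x - f * y for x, y in zip(A[r], A[col])]
    return [row[n:] for row in A]

def mat_mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def mra_local_responses(R):
    """Local response matrix from the global response matrix R."""
    Rinv = mat_inv(R)
    n = len(R)
    return [[-Rinv[i][j] / Rinv[i][i] for j in range(n)] for i in range(n)]

# Hypothetical 3-node local response matrix (diagonal normalized to -1).
r_true = [[-1.0, 0.8, 0.0],
          [0.5, -1.0, 0.3],
          [0.0, -0.4, -1.0]]
# Each perturbation j acts only on node j with unknown strength c_j,
# giving the global response R = -r^-1 C with C diagonal.
C = [[0.1, 0, 0], [0, 0.2, 0], [0, 0, 0.3]]
R = mat_mul([[-x for x in row] for row in mat_inv(r_true)], C)

r_rec = mra_local_responses(R)  # recovers r_true up to numerical error
```

Note that the unknown perturbation strengths C cancel in the row normalization, which is what makes MRA practical when only relative responses are measured.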



Related research

Lu Xie (2012)
Monte-Carlo sampling of the steady-state flux space is a widely used method in metabolic network studies, known for its flexibility and the ease with which it can serve different purposes simply by altering constraints or objective functions, or by appending post-processing steps. Recently, a non-linear constraint based on the second law of thermodynamics, known as the loop law, has posed a challenge to current sampling algorithms, which inevitably give rise to internal loops. A generalized method is proposed here to eliminate internal loops during the sampling process. Based on the Artificial Centered Hit and Run (ACHR) method, each step of the new sampling process avoids entering loop-forming subspaces. The method has been applied to the metabolic network of Helicobacter pylori with three different objective functions: uniform sampling, optimizing biomass synthesis, and optimizing biomass synthesis efficiency relative to the resources ingested. Comparison between results from the new method and the conventional ACHR method shows effective elimination of loop fluxes without affecting non-loop fluxes.
G. Burbidge (2001)
A brief historical account of modern cosmology shows that the standard big bang (BB) model, believed by so many, does not have the strong observational foundations that are frequently claimed for it. The theory of the Quasi-Steady State Cosmology (QSSC) and explosive cosmogony is outlined. Comparisons are made between the two theories in explaining the observed properties of the universe, namely the expansion, chemical composition, CMB, QSO redshifts and explosive events, galaxy formation, and the m-z and theta-z relations. Only two of the observed properties have ever been predicted from the theories: (a) the expansion, predicted from Einstein's theory by Friedmann and Lemaître, and (b) the acceleration, predicted by the classical steady state theory and the QSSC.
The development of single-cell technologies provides the opportunity to identify new cellular states and reconstruct novel cell-to-cell relationships. Applications range from understanding the transcriptional and epigenetic processes involved in metazoan development to characterizing distinct cell types in heterogeneous populations such as cancers or immune cells. However, analysis of the data is impeded by its unknown intrinsic biological and technical variability together with its sparseness; these factors complicate the identification of true biological signals amidst artifact and noise. Here we show that, across technologies, roughly 95% of the eigenvalues derived from each single-cell data set can be described by universal distributions predicted by Random Matrix Theory. Interestingly, 5% of the spectrum shows deviations from these distributions and presents a phenomenon known as eigenvector localization, where information tightly concentrates in groups of cells. Some of the localized eigenvectors reflect underlying biological signal, and some are simply a consequence of the sparsity of single-cell data; roughly 3% is artifactual. Based on the universal distributions and a technique for detecting sparsity-induced localization, we present a strategy to identify the residual 2% of directions that encode biological information and thereby denoise single-cell data. We demonstrate the effectiveness of this approach by comparing with standard single-cell data analysis techniques in a variety of examples with marked cell populations.
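The Random Matrix Theory baseline behind this kind of denoising can be illustrated with the Marchenko-Pastur law: for a p × n matrix of pure noise, the eigenvalues of the sample covariance fall below the edge (1 + √(p/n))², so eigenvalues escaping that edge are candidates for genuine structure. A toy pure-Python sketch (matrix sizes, the signal direction, and the signal strength are all invented; this is not the authors' pipeline):

```python
import math
import random

def top_eigenvalue(C, iters=500):
    """Largest eigenvalue of a symmetric PSD matrix via power iteration."""
    n = len(C)
    v = [1.0 / math.sqrt(n)] * n
    lam = 0.0
    for _ in range(iters):
        w = [sum(C[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = math.sqrt(sum(x * x for x in w))
        v = [x / lam for x in w]
    return lam

def sample_cov_top(X):
    """Top eigenvalue of the sample covariance (1/n) X X^T."""
    p, n = len(X), len(X[0])
    C = [[sum(X[i][k] * X[j][k] for k in range(n)) / n for j in range(p)]
         for i in range(p)]
    return top_eigenvalue(C)

rng = random.Random(0)
p, n = 30, 300                            # aspect ratio gamma = p/n = 0.1
mp_edge = (1 + math.sqrt(p / n)) ** 2     # Marchenko-Pastur upper edge

# Pure noise: the top eigenvalue stays near the MP edge.
X = [[rng.gauss(0.0, 1.0) for _ in range(n)] for _ in range(p)]
noise_top = sample_cov_top(X)

# Planted rank-one "signal" s * u v^T: one direction escapes the edge,
# mimicking an information-carrying (localized) eigenvector.
u = [1.0 / math.sqrt(p)] * p                          # unit signal direction
v = [1.0 if rng.random() < 0.5 else -1.0 for _ in range(n)]
s = 3.0                                               # invented strength
Y = [[X[i][j] + s * u[i] * v[j] for j in range(n)] for i in range(p)]
signal_top = sample_cov_top(Y)
```

With these sizes the noise eigenvalue sits near the MP edge (~1.73) while the planted spike jumps to roughly s² above it, which is the separation such denoising strategies exploit.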
In biochemical networks, reactions often occur on disparate timescales and can be characterized as either fast or slow. The quasi-steady state approximation (QSSA) utilizes timescale separation to project models of biochemical networks onto lower-dimensional slow manifolds. As a result, fast elementary reactions are not modeled explicitly, and their effect is captured by non-elementary reaction rate functions (e.g. Hill functions). The accuracy of the QSSA applied to deterministic systems depends on how well timescales are separated. Recently, it has been proposed to use the non-elementary rate functions obtained via the deterministic QSSA to define propensity functions in stochastic simulations of biochemical networks. In this approach, termed the stochastic QSSA, fast reactions that are part of non-elementary reactions are not simulated, greatly reducing computation time. However, it is unclear when the stochastic QSSA provides an accurate approximation of the original stochastic simulation. We show that, unlike the deterministic QSSA, the validity of the stochastic QSSA does not follow from timescale separation alone, but also depends on the sensitivity of the non-elementary reaction rate functions to changes in the slow species. The stochastic QSSA becomes more accurate when this sensitivity is small. Different types of QSSAs result in non-elementary functions with different sensitivities, and the total QSSA results in less sensitive functions than the standard or the pre-factor QSSA. We prove that, as a result, the stochastic QSSA becomes more accurate when non-elementary reaction functions are obtained using the total QSSA. Our work provides a novel condition for the validity of the QSSA in stochastic simulations of biochemical reaction networks with disparate timescales.
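Operationally, the stochastic QSSA amounts to running Gillespie's algorithm with the fast sub-network replaced by a non-elementary propensity such as a Hill function. A toy birth-death sketch under that scheme (all rate constants are invented; the work above analyzes exactly when such a reduction is accurate):

```python
import random

def gillespie_qssa(x0, t_end, rng, V=20.0, K=10.0, h=2.0, gamma=0.1):
    """Gillespie SSA for one slow species X using a non-elementary
    (Hill-type) production propensity, as in the stochastic QSSA:
        production:  a1(x) = V * K^h / (K^h + x^h)  (fast binding reduced)
        degradation: a2(x) = gamma * x
    Returns the jump times and copy numbers."""
    t, x = 0.0, x0
    times, states = [t], [x]
    while t < t_end:
        a1 = V * K ** h / (K ** h + x ** h)
        a2 = gamma * x
        a0 = a1 + a2
        t += rng.expovariate(a0)       # waiting time to the next reaction
        if rng.random() * a0 < a1:     # pick which reaction fires
            x += 1
        else:
            x -= 1
        times.append(t)
        states.append(x)
    return times, states

times, states = gillespie_qssa(x0=5, t_end=200.0, rng=random.Random(1))
```

Because the binding/unbinding events hidden inside the Hill propensity are never simulated, each step is cheap; the paper's condition says this shortcut is trustworthy when a1 is insensitive to fluctuations in the slow species x.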
Biopolymer gels, such as those made out of fibrin or collagen, are widely used in tissue engineering applications and biomedical research. Moreover, fibrin naturally assembles into gels in vivo during wound healing and thrombus formation. Macroscale biopolymer gel mechanics are dictated by the microscale fiber network. Hence, accurate description of biopolymer gels can be achieved using representative volume elements (RVE) that explicitly model the discrete fiber networks of the microscale. These RVE models, however, cannot be efficiently used to model the macroscale due to the challenges and computational demands of multiscale coupling. Here, we propose the use of an artificial, fully connected neural network (FCNN) to efficiently capture the behavior of the RVE models. The FCNN was trained on 1100 fiber networks subjected to 121 biaxial deformations. The stress data from the RVE, together with the total energy and the condition of incompressibility of the surrounding matrix, were used to determine the derivatives of an unknown strain energy function with respect to the deformation invariants. During training, the loss function was modified to ensure convexity of the strain energy function and symmetry of its Hessian. A general FCNN model was coded into a user material subroutine (UMAT) in the software Abaqus. In this work, the FCNN trained on the discrete fiber network data was used in finite element simulations of fibrin gels using our UMAT. We anticipate that this work will enable further integration of machine learning tools with computational mechanics. It will also improve computational modeling of biological materials characterized by a multiscale structure.