Inferring the input parameters of simulators from observations is a crucial challenge with applications from epidemiology to molecular dynamics. Here we show a simple approach in the regime of sparse data and approximately correct models, which is common when trying to use an existing model to infer latent variables with observed data. This approach is based on the principle of maximum entropy (MaxEnt) and provably makes the smallest change in the latent joint distribution to fit new data. The method requires no likelihood or model derivatives, and its fit is insensitive to prior strength, removing the need to balance observed data fit with prior belief. It requires the ansatz that data are fit in expectation, which holds exactly in some settings and may be a reasonable approximation whenever only a few data points are available. Because the method is based on sample reweighting, its asymptotic run time is independent of prior distribution dimension. We demonstrate this MaxEnt approach and compare it with other likelihood-free inference methods across three systems: a point particle moving in a gravitational field, a compartmental model of epidemic spread, and finally a molecular dynamics simulation of a protein.
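As a rough illustration of the reweighting idea (not the authors' code), the sketch below tilts a set of prior simulator samples so that a weighted average matches an observed value, which is the exponential-family form a MaxEnt solution takes; the observable array and target value are hypothetical placeholders.

```python
import numpy as np
from scipy.optimize import minimize

def maxent_reweight(g_samples, g_target):
    """Reweight prior samples so the weighted mean of observables g matches
    g_target, changing the sampled distribution as little as possible
    (maximum entropy / minimum KL to the prior samples)."""
    n, k = g_samples.shape

    def dual(lam):
        # convex dual: log-partition of the exponential tilt plus linear term
        logits = -g_samples @ lam
        logz = np.log(np.mean(np.exp(logits - logits.max()))) + logits.max()
        return logz + lam @ g_target

    res = minimize(dual, np.zeros(k), method="BFGS")
    logits = -g_samples @ res.x
    w = np.exp(logits - logits.max())
    return w / w.sum()

# usage: hypothetical prior draws of a single observable, target mean 0.5
g_samples = np.random.randn(10_000, 1)
weights = maxent_reweight(g_samples, np.array([0.5]))
print(np.sum(weights[:, None] * g_samples, axis=0))  # ~ [0.5]
```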
Different types of synchronization states are found when non-linear chemical oscillators are embedded into an active medium that interconnects the oscillators but also contributes to the system dynamics. Using different theoretical tools, we approach this problem in order to describe the transition between two such synchronized states. Bifurcation and continuation analysis provide a full description of the parameter space. Phase approximation modeling allows the calculation of the oscillator periods and the bifurcation point.
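As a hedged illustration only: phase approximation replaces each chemical oscillator by a single phase variable, as in the generic reduction below; the specific coupling function and the active medium's contribution used by the authors are not reproduced here.

```latex
% Generic phase-reduction ansatz (illustrative); \Gamma and \varepsilon
% are model-specific and not specified in the abstract.
\dot{\theta}_i = \omega_i + \varepsilon \sum_{j \neq i} \Gamma(\theta_j - \theta_i),
\qquad
T_i = \frac{2\pi}{\langle \dot{\theta}_i \rangle_t}
```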
The ongoing COVID-19 pandemic has created a global crisis of massive scale. Prior research indicates that human mobility is one of the key factors involved in viral spreading. Indeed, in a connected planet, rapid world-wide spread is enabled by long-distance air-, land- and sea-transportation among countries and continents, and subsequently fostered by commuting trips within densely populated cities. While early travel restrictions contribute to delayed disease spread, their utility is much reduced if the disease has a long incubation period or if there is asymptomatic transmission. Given the lack of vaccines, public health officials have mainly relied on non-pharmaceutical interventions, including social distancing measures, curfews, and stay-at-home orders. Here we study the impact of city organization on its susceptibility to disease spread, and amenability to interventions. Cities can be classified according to their mobility in a spectrum between compact-hierarchical and decentralized-sprawled. Our results show that even though hierarchical cities are more susceptible to the rapid spread of epidemics, their organization makes mobility restrictions quite effective. Conversely, sprawled cities are characterized by a much slower initial spread, but are less responsive to mobility restrictions. These findings hold globally across cities in diverse geographical locations and a broad range of sizes. Our empirical measurements are confirmed by a simulation of COVID-19 spread in urban areas through a compartmental model. These results suggest that investing resources in early monitoring and prompt ad-hoc interventions in more vulnerable cities may prove most helpful in containing and reducing the impact of present and future pandemics.
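For intuition about how a mobility structure enters such a compartmental picture, here is a minimal metapopulation SIR sketch; the three-patch matrix, the rates, and the idea of modeling restrictions by scaling off-diagonal flows are illustrative assumptions, not the paper's calibrated model.

```python
import numpy as np

def metapop_sir(M, beta, gamma, I0, steps=200, dt=0.1):
    """Minimal metapopulation SIR: patches coupled by a row-stochastic
    mobility matrix M. Shrinking the off-diagonal entries of M mimics a
    mobility restriction. Illustrative only."""
    n = M.shape[0]
    S, I, R = np.ones(n), np.array(I0, float), np.zeros(n)
    S -= I
    total_infected = []
    for _ in range(steps):
        I_eff = M @ I                       # infection pressure felt in each patch
        new_inf = beta * S * I_eff * dt
        new_rec = gamma * I * dt
        S, I, R = S - new_inf, I + new_inf - new_rec, R + new_rec
        total_infected.append(I.sum())
    return np.array(total_infected)

# hypothetical 3-patch city: a hub strongly connected to two peripheries
M = np.array([[0.6, 0.2, 0.2],
              [0.3, 0.7, 0.0],
              [0.3, 0.0, 0.7]])
print(metapop_sir(M, beta=0.4, gamma=0.1, I0=[0.01, 0.0, 0.0]).max())
```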
To mitigate the SARS-CoV-2 pandemic, officials have employed social distancing and stay-at-home measures, with increased attention to room ventilation emerging only more recently. Distancing practices that are effective in open spaces can be ineffective in poorly ventilated spaces, both of which are commonly filled with turbulent air; this is typical for indoor spaces that use mixing ventilation. While turbulence initially reduces the risk of infection near a virion source, it eventually increases the exposure risk for all occupants in a space without ventilation. To complement detailed models aimed at precision, minimalist frameworks are useful for order-of-magnitude estimates of how much ventilation provides safety, particularly when circumstances require practical decisions with limited options. Applying basic principles of transport and diffusion, we estimate the time-scale for virions injected into a room of turbulent air to infect an occupant, distinguishing cases of low vs. high initial virion mass loads and virion-destroying vs. virion-reflecting walls. We consider the effect of an open window as a proxy for ventilation. When the airflow is dominated by isotropic turbulence, the minimum window area needed to ensure safety depends only on the ratio of total viral load to the threshold load for infection. The minimalist estimates here convey simply that the equivalent of ventilation by a modest-sized open window in classrooms and workplaces significantly improves safety.
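In the same order-of-magnitude spirit, and with placeholder numbers rather than the paper's values, one can compare the time to accumulate an assumed infectious dose in a sealed well-mixed room against a room exchanging air through an open window:

```python
# Order-of-magnitude sketch (all numbers are illustrative assumptions).
V_room   = 8 * 6 * 3      # room volume [m^3]
emission = 10.0           # virions emitted per second by one source (assumed)
breath   = 1e-4           # inhalation rate [m^3/s] (~0.5 L breaths, 12 per minute)
N_inf    = 300.0          # assumed threshold dose for infection [virions]
A_window = 0.5            # open-window area [m^2]
v_air    = 0.2            # mean exchange air speed through the window [m/s]

# sealed, well-mixed room: concentration grows linearly, c(t) = emission * t / V,
# so dose(t) = breath * emission * t^2 / (2 V) and
# t_infect = sqrt(2 V N_inf / (breath * emission))
t_sealed = (2 * V_room * N_inf / (breath * emission)) ** 0.5

# ventilated room: steady-state concentration c* = emission / (A * v),
# dose rate = breath * c*, so t_infect = N_inf * A * v / (breath * emission)
t_vent = N_inf * A_window * v_air / (breath * emission)

print(f"sealed room: ~{t_sealed/3600:.1f} h, open window: ~{t_vent/3600:.1f} h")
```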
Starting from our recent chemical master equation derivation of the model of an autocatalytic reaction-diffusion chemical system with reactions $U+2V \stackrel{\lambda_0}{\rightarrow} 3V$, $V \stackrel{\mu}{\rightarrow} P$, and $U \stackrel{\nu}{\rightarrow} Q$, we determine the effects of intrinsic noise on the momentum-space behavior of its kinetic parameters and chemical concentrations. We demonstrate that the intrinsic noise induces $n \rightarrow n$ molecular interaction processes with $n \geq 4$, where $n$ is the number of participating molecules of type $U$ or $V$. The momentum dependence of the reaction rates is driven by the fact that the autocatalytic reaction (inelastic scattering) is renormalized through the existence of an arbitrary number of intermediate elastic scatterings, which can also be interpreted as the creation and subsequent decay of a three-body composite state $\sigma = \phi_u \phi_v^2$, where $\phi_i$ corresponds to the fields representing the densities of $U$ and $V$. Finally, we discuss the difference between representing $\sigma$ as a composite or as an elementary particle (molecule) with its own kinetic parameters. In one dimension we find that while the two representations show markedly different behavior on short spatio-temporal scales, the high-momentum (UV) limit, they are formally equivalent on large spatio-temporal scales, the low-momentum (IR) regime. In two dimensions and higher, on the other hand, fluctuation effects make it impossible to experimentally distinguish between a fundamental and a composite $\sigma$. Thus in this regime $\sigma$ behaves as an entity unto itself, suggesting that it can be effectively treated as an independent chemical species.
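For orientation, the mean-field (noise-free) rate equations implied by this reaction scheme take the familiar form below; diffusion constants $D_u, D_v$ are assumed, and the intrinsic-noise corrections discussed in the abstract are omitted.

```latex
% Mean-field rate equations for U + 2V -> 3V (rate \lambda_0), V -> P (rate \mu),
% U -> Q (rate \nu), with u, v the concentrations of U and V.
\partial_t u = D_u \nabla^2 u - \lambda_0\, u v^2 - \nu\, u,
\qquad
\partial_t v = D_v \nabla^2 v + \lambda_0\, u v^2 - \mu\, v
```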
Fred Cooper, Gourab Ghoshal, 2013
We give a first-principles derivation of the stochastic partial differential equations that describe the chemical reactions of the Gray-Scott model (GS): $U+2V \stackrel{\lambda}{\rightarrow} 3V$, $V \stackrel{\mu}{\rightarrow} P$, and $U \stackrel{\nu}{\rightarrow} Q$, with a constant feed rate for $U$. We find that the conservation of probability ensured by the chemical master equation leads to a modification of the usual differential equations for the GS model, which now involve two composite fields as well as intrinsic noise terms. One of the composites is $\psi_1 = \phi_v^2$, where $\langle \phi_v \rangle_{\eta} = v$ is the concentration of the species $V$ and the averaging is over the internal noise $\eta_{u,v,\psi_1}$. The second composite field is the product of three fields, $\chi = \lambda \phi_u \phi_v^2$, and requires a noise source to ensure probability conservation. A third composite, $\psi_2 = \phi_u \phi_v$, can also be identified from the noise-induced reactions. The Hamiltonian that governs the time evolution of the many-body wave function associated with the master equation has a broken U(1) symmetry related to particle number conservation. By expanding around the (broken-symmetry) zero-energy solution of the Hamiltonian (by performing a Doi shift), one obtains from our path integral formulation the usual reaction-diffusion equation at the classical level. The Langevin equations derived from the chemical master equation have multiplicative noise sources for the density fields $\phi_u, \phi_v, \chi$ that induce higher-order processes such as $n \rightarrow n$ scattering for $n > 3$. The amplitude of the noise acting on $\phi_v$ is itself stochastic in nature.
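For context, here is a minimal sketch of the classical-level (noise-free) dynamics implied by this reaction scheme, integrated with a simple explicit finite-difference step; the parameter values, constant-feed term, and grid are illustrative placeholders rather than values from the paper.

```python
import numpy as np

def gray_scott(n=128, steps=5000, Du=0.16, Dv=0.08,
               lam=1.0, mu=0.065, nu=0.035, feed=0.035, dt=1.0):
    """Noise-free dynamics of U + 2V -> 3V, V -> P, U -> Q with a constant
    feed of U, on a periodic grid. Parameters are illustrative only."""
    u = np.ones((n, n))
    v = np.zeros((n, n))
    # seed a small square of V (and deplete U) in the centre
    u[n//2-5:n//2+5, n//2-5:n//2+5] = 0.5
    v[n//2-5:n//2+5, n//2-5:n//2+5] = 0.5

    def lap(a):  # 5-point Laplacian with periodic boundaries
        return (np.roll(a, 1, 0) + np.roll(a, -1, 0) +
                np.roll(a, 1, 1) + np.roll(a, -1, 1) - 4 * a)

    for _ in range(steps):
        uvv = lam * u * v * v                       # autocatalytic term u v^2
        u += dt * (Du * lap(u) - uvv - nu * u + feed)
        v += dt * (Dv * lap(v) + uvv - mu * v)
    return u, v

u, v = gray_scott()
print(v.mean(), v.max())
```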
In the last few years we have witnessed the emergence, primarily in on-line communities, of new types of social networks that require for their representation more complex graph structures than have been employed in the past. One example is the folksonomy, a tripartite structure of users, resources, and tags -- labels collaboratively applied by the users to the resources in order to impart meaningful structure on an otherwise undifferentiated database. Here we propose a mathematical model of such tripartite structures which represents them as random hypergraphs. We show that it is possible to calculate many properties of this model exactly in the limit of large network size and we compare the results against observations of a real folksonomy, that of the on-line photography web site Flickr. We show that in some cases the model matches the properties of the observed network well, while in others there are significant differences, which we find to be attributable to the practice of multiple tagging, i.e., the application by a single user of many tags to one resource, or one tag to many resources.
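As a toy illustration of a random tripartite hypergraph of this kind (not the paper's exact null model), one can draw (user, resource, tag) triples uniformly at random and read off degree distributions:

```python
import numpy as np
from collections import Counter

def random_folksonomy(n_users, n_resources, n_tags, n_annotations, seed=0):
    """Toy random tripartite hypergraph: each hyperedge is a distinct
    (user, resource, tag) triple drawn uniformly at random."""
    rng = np.random.default_rng(seed)
    edges = set()
    while len(edges) < n_annotations:
        edges.add((int(rng.integers(n_users)),
                   int(rng.integers(n_resources)),
                   int(rng.integers(n_tags))))
    return list(edges)

edges = random_folksonomy(1000, 500, 200, 5000)
user_degree = Counter(u for u, _, _ in edges)   # annotations per user
print(Counter(user_degree.values()))            # empirical degree distribution
```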
There has been a considerable amount of interest in recent years on the robustness of networks to failures. Many previous studies have concentrated on the effects of node and edge removals on the connectivity structure of a static network; the networks are considered to be static in the sense that no compensatory measures are allowed for recovery of the original structure. Real world networks such as the world wide web, however, are not static and experience a considerable amount of turnover, where nodes and edges are both added and deleted. Considering degree-based node removals, we examine the possibility of preserving networks from these types of disruptions. We recover the original degree distribution by allowing the network to react to the attack by introducing new nodes and attaching their edges via specially tailored schemes. We focus particularly on the case of non-uniform failures, a subject that has received little attention in the context of evolving networks. Using a combination of analytical techniques and numerical simulations, we demonstrate how to preserve the exact degree distribution of the studied networks from various forms of attack.
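A toy version of such a compensatory scheme, assuming the simplest possible repair rule (reconnecting a replacement node to exactly the neighbours that lost an edge), is sketched below with networkx; the authors' tailored attachment schemes for non-uniform failures are not reproduced here.

```python
import networkx as nx

def attack_and_heal(G, n_attacks):
    """Repeatedly remove the highest-degree node, then add a compensating
    node attached to the neighbours that lost an edge, so the degree
    sequence returns to its pre-attack form. Illustrative repair rule only."""
    for _ in range(n_attacks):
        target = max(G.degree, key=lambda kv: kv[1])[0]
        orphans = list(G.neighbors(target))   # nodes about to lose an edge
        G.remove_node(target)
        new = max(G.nodes) + 1
        G.add_node(new)
        G.add_edges_from((new, v) for v in orphans)
    return G

G = nx.barabasi_albert_graph(2000, 3)
degrees_before = sorted(d for _, d in G.degree)
attack_and_heal(G, 50)
degrees_after = sorted(d for _, d in G.degree)
print(degrees_before == degrees_after)        # True: degree sequence preserved
```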