
Making the best of a bad situation: a multiscale approach to free energy calculation

Published by: Michele Invernizzi
Publication date: 2019
Research field: Physics
Paper language: English





Many enhanced sampling techniques rely on the identification of a number of collective variables that describe all the slow modes of the system. By constructing a bias potential in this reduced space, one is then able to sample efficiently and reconstruct the free energy landscape. In methods like metadynamics, the quality of these collective variables plays a key role in convergence efficiency. Unfortunately, in many systems of interest it is not possible to identify an optimal collective variable, and one must deal with the non-ideal situation of a system in which some slow modes are not accelerated. We propose a two-step approach in which, by taking into account the residual multiscale nature of the problem, one is able to significantly speed up convergence. To do so, we combine an exploratory metadynamics run with an optimization of the free energy difference between metastable states, based on the recently proposed variationally enhanced sampling method. This new method is readily parallelizable and, thanks to its simplicity and clear underlying physical picture, is especially suited for complex systems.
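To make the first (exploratory) step concrete, here is a minimal toy sketch of metadynamics on a one-dimensional collective variable: overdamped Langevin dynamics with periodic deposition of repulsive Gaussian hills. All parameters (hill height, width, deposition stride, the double-well potential) are illustrative choices of ours, not values from the paper, and the second VES optimization step is not shown.

```python
import numpy as np

def metadynamics_1d(x0, force, n_steps=5000, dt=5e-3, kT=1.0,
                    height=0.3, sigma=0.2, stride=50, seed=0):
    """Toy 1D metadynamics: overdamped Langevin dynamics plus periodic
    deposition of Gaussian hills along the collective variable.
    Parameters are illustrative, not taken from the paper."""
    rng = np.random.default_rng(seed)
    centers = []  # hill centers deposited so far
    x = x0
    for step in range(n_steps):
        # Bias force: minus the gradient of the sum of deposited Gaussians
        fb = sum(height * (x - c) / sigma**2
                 * np.exp(-(x - c)**2 / (2 * sigma**2)) for c in centers)
        noise = np.sqrt(2 * kT * dt) * rng.standard_normal()
        x += (force(x) + fb) * dt + noise
        if step % stride == 0:
            centers.append(x)
    return x, centers

# Double-well potential U(x) = (x^2 - 1)^2, so force = -dU/dx
force = lambda x: -4 * x * (x**2 - 1)
x_final, hills = metadynamics_1d(x0=-1.0, force=force)
# At long times, minus the sum of hills approximates the free energy surface
```

As the hills fill the starting basin, the walker escapes over the barrier; the negative of the accumulated bias then serves as the rough free energy estimate that the paper's second, variational step refines.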




Read also

We introduce the Destructive Object Handling (DOH) problem, which models aspects of many real-world allocation problems, such as shipping explosive munitions, scheduling processes in a cluster with fragile nodes, re-using passwords across multiple websites, and quarantining patients during a disease outbreak. In these problems, objects must be assigned to handlers, but each object has a probability of destroying itself and all the other objects allocated to the same handler. The goal is to maximize the expected value of the objects handled successfully. We show that finding the optimal allocation is NP-complete, even if all the handlers are identical. We present an FPTAS when the number of handlers is constant. We note in passing that the same technique also yields the first FPTAS for the weapons-target allocation problem [manne_wta] with a constant number of targets. We study the structure of DOH problems and find that they have a sort of phase transition: in some instances it is better to spread risk evenly among the handlers; in others, one handler should be used as a "sacrificial lamb". We show that the problem is solvable in polynomial time if the destruction probabilities depend only on the handler to which an object is assigned; if all the handlers are identical and the objects all have the same value; or if each handler can be assigned at most one object. Finally, we empirically evaluate several heuristics based on a combination of greedy and genetic algorithms. The proposed heuristics return fairly high quality solutions to very large problem instances (up to 250 objects and 100 handlers) in tens of seconds.
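The objective described above is easy to evaluate for a given assignment: an object survives only if nothing at its handler destructs. The following sketch (function names and the toy instance are ours, not from the paper) computes that expected value and brute-forces tiny instances, illustrating the risk-spreading effect mentioned in the abstract.

```python
from itertools import product
from math import prod

def expected_value(allocation, p, v):
    """Expected total value for a DOH allocation: object i self-destructs
    with probability p[i], destroying everything at its handler, so the
    objects at handler h all survive with probability prod(1 - p[i]).
    allocation[i] is the handler index of object i."""
    total = 0.0
    for h in set(allocation):
        objs = [i for i, a in enumerate(allocation) if a == h]
        survive = prod(1 - p[i] for i in objs)  # no object at h destructs
        total += survive * sum(v[i] for i in objs)
    return total

def brute_force_opt(p, v, n_handlers):
    """Exhaustive search over all assignments (exponential; tiny cases only)."""
    best = max(product(range(n_handlers), repeat=len(p)),
               key=lambda a: expected_value(a, p, v))
    return best, expected_value(best, p, v)

# Two equally risky objects, two handlers: spreading risk beats co-locating.
p, v = [0.5, 0.5], [1.0, 1.0]
together = expected_value([0, 0], p, v)  # 0.25 * 2 = 0.5
apart    = expected_value([0, 1], p, v)  # 0.5 + 0.5 = 1.0
```

With asymmetric probabilities and values, the same brute force can instead favor dumping risky objects on one "sacrificial lamb" handler, which is the phase-transition behavior the abstract describes.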
David Wilman, 2010
Physical processes influencing the properties of galaxies can be traced by the dependence and evolution of galaxy properties on their environment. A detailed understanding of this dependence can only be gained through comparison of observations with models, with an appropriate quantification of the rich parameter space describing the environment of the galaxy. We present a new, multiscale parameterization of galaxy environment which retains an observationally motivated simplicity whilst utilizing the information present on different scales. We examine how the distribution of galaxy (u-r) colours in the Sloan Digital Sky Survey (SDSS), parameterized using a double Gaussian (red plus blue peak) fit, depends upon multiscale density. This allows us to probe the detailed dependence of galaxy properties on environment in a way which is independent of the halo model. Nonetheless, cross-correlation with the group catalogue constructed by Yang et al. (2007) shows that galaxy properties trace environment on different scales in a way which mimics that expected within the halo model. This provides independent support for the existence of virialized haloes, and important additional clues to the role played by environment in the evolution of the galaxy population. This work is described in full by Wilman et al. (2010), MNRAS, accepted.
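The "double Gaussian (red plus blue peak)" parameterization above is a two-component Gaussian mixture fit to the colour distribution. A minimal expectation-maximization sketch of such a fit is shown below on synthetic data; this is our own illustration, not the SDSS analysis pipeline, and all numbers are made up.

```python
import numpy as np

def fit_two_gaussians(x, n_iter=200):
    """Minimal EM fit of a two-component Gaussian mixture, illustrating a
    'red plus blue peak' double-Gaussian parameterization (a sketch only)."""
    mu = np.percentile(x, [25, 75])          # spread-out initial means
    sig = np.array([x.std(), x.std()])
    w = np.array([0.5, 0.5])                 # mixture weights
    for _ in range(n_iter):
        # E-step: responsibility of each component for each point
        pdf = w * np.exp(-(x[:, None] - mu)**2 / (2 * sig**2)) \
              / (sig * np.sqrt(2 * np.pi))
        r = pdf / pdf.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and widths
        n = r.sum(axis=0)
        w = n / len(x)
        mu = (r * x[:, None]).sum(axis=0) / n
        sig = np.sqrt((r * (x[:, None] - mu)**2).sum(axis=0) / n)
    return w, mu, sig

# Synthetic "blue cloud" and "red sequence" peaks in (u-r) colour
rng = np.random.default_rng(1)
colours = np.concatenate([rng.normal(1.5, 0.3, 2000),   # blue cloud
                          rng.normal(2.5, 0.2, 3000)])  # red sequence
w, mu, sig = fit_two_gaussians(colours)
```

The fitted weights then play the role of the red/blue fraction whose dependence on multiscale density the paper studies.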
A physical model is presented for a semiconductor electrode of a photoelectrochemical (PEC) cell, accounting for the potential drop in the Helmholtz layer. Hence both band edge pinning and unpinning are naturally included in our description. The model is based on the continuity equations for charge carriers and direct charge transfer from the energy bands to the electrolyte. A quantitative calculation of the position of the energy bands and the variation of the quasi-Fermi levels in the semiconductor with respect to the water reduction and oxidation potentials is presented. Calculated current-voltage curves are compared with established analytical models and measurement. Our model calculations are suitable to enhance understanding and improve properties of semiconductors for photoelectrochemical water splitting.
We introduce an approach to exploit the existence of multiple levels of description of a physical system to radically accelerate the determination of thermodynamic quantities. We first give a proof of principle of the method using two empirical interatomic potential functions. We then apply the technique to feed information from an interatomic potential into otherwise inaccessible quantum mechanical tight-binding calculations of the reconstruction of partial dislocations in silicon at finite temperature. With this approach, comprehensive ab initio studies at finite temperature will now be possible.
Large-scale natural language inference (NLI) datasets such as SNLI or MNLI have been created by asking crowdworkers to read a premise and write three new hypotheses, one for each possible semantic relationship (entailment, contradiction, and neutral). While this protocol has been used to create useful benchmark data, it remains unclear whether the writing-based annotation protocol is optimal for any purpose, since it has not been evaluated directly. Furthermore, there is ample evidence that crowdworker writing can introduce artifacts in the data. We investigate two alternative protocols which automatically create candidate (premise, hypothesis) pairs for annotators to label. Using these protocols and a writing-based baseline, we collect several new English NLI datasets of over 3k examples each, each using a fixed amount of annotator time, but a varying number of examples to fit that time budget. Our experiments on NLI and transfer learning show negative results: none of the alternative protocols outperforms the baseline in evaluations of generalization within NLI or on transfer to outside target tasks. We conclude that crowdworker writing is still the best known option for entailment data, highlighting the need for further data collection work to focus on improving writing-based annotation processes.