
NeuTra-lizing Bad Geometry in Hamiltonian Monte Carlo Using Neural Transport

Published by: Pavel Sountsov
Publication date: 2019
Research field: Mathematical statistics
Paper language: English





Hamiltonian Monte Carlo is a powerful algorithm for sampling from difficult-to-normalize posterior distributions. However, when the geometry of the posterior is unfavorable, it may take many expensive evaluations of the target distribution and its gradient to converge and mix. We propose neural transport (NeuTra) HMC, a technique for learning to correct this sort of unfavorable geometry using inverse autoregressive flows (IAF), a powerful neural variational inference technique. The IAF is trained to minimize the KL divergence from an isotropic Gaussian to the warped posterior, and then HMC sampling is performed in the warped space. We evaluate NeuTra HMC on a variety of synthetic and real problems, and find that it significantly outperforms vanilla HMC both in time to reach the stationary distribution and asymptotic effective-sample-size rates.
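The core idea above can be sketched in a few lines. The sketch below is a hypothetical, simplified stand-in: instead of a trained IAF, it uses a fixed affine map (the Cholesky factor of a correlated Gaussian target) as the transport, so the warped density is isotropic and HMC mixes easily; samples are then pushed back through the map. None of the names below come from the paper's code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target: a strongly correlated 2-D Gaussian (bad geometry for vanilla HMC).
cov = np.array([[1.0, 0.99], [0.99, 1.0]])
prec = np.linalg.inv(cov)
L = np.linalg.cholesky(cov)  # stand-in transport map: x = L @ z

def warped_logp(z):
    # log p(x(z)); for an affine map the log-Jacobian is constant and drops out.
    x = L @ z
    return -0.5 * x @ prec @ x

def warped_grad(z):
    # Chain rule: d/dz log p(L z) = -L^T P (L z).
    x = L @ z
    return -L.T @ (prec @ x)

def hmc_step(z, step=0.5, n_leap=10):
    p = rng.standard_normal(z.shape)
    z_new, p_new = z.copy(), p.copy()
    # Leapfrog integration in the warped (well-conditioned) space.
    p_new = p_new + 0.5 * step * warped_grad(z_new)
    for _ in range(n_leap - 1):
        z_new = z_new + step * p_new
        p_new = p_new + step * warped_grad(z_new)
    z_new = z_new + step * p_new
    p_new = p_new + 0.5 * step * warped_grad(z_new)
    # Metropolis correction on the Hamiltonian.
    h_old = -warped_logp(z) + 0.5 * p @ p
    h_new = -warped_logp(z_new) + 0.5 * p_new @ p_new
    if np.log(rng.uniform()) < h_old - h_new:
        return z_new, True
    return z, False

z = np.zeros(2)
accepts, xs = 0, []
for _ in range(2000):
    z, ok = hmc_step(z)
    accepts += ok
    xs.append(L @ z)  # push samples back to the original space
xs = np.array(xs)
```

In the actual method, the affine map is replaced by an IAF fitted by variational inference, which can also correct non-linear geometry such as funnels.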




Read also

We present a method for performing Hamiltonian Monte Carlo that largely eliminates sample rejection for typical hyperparameters. In situations that would normally lead to rejection, instead a longer trajectory is computed until a new state is reached that can be accepted. This is achieved using Markov chain transitions that satisfy the fixed point equation, but do not satisfy detailed balance. The resulting algorithm significantly suppresses the random walk behavior and wasted function evaluations that are typically the consequence of update rejection. We demonstrate a greater than factor of two improvement in mixing time on three test problems. We release the source code as Python and MATLAB packages.
Hamiltonian Monte Carlo (HMC) has been widely adopted in the statistics community because of its ability to sample high-dimensional distributions much more efficiently than other Metropolis-based methods. Despite this, HMC often performs sub-optimally on distributions with high correlations or marginal variances on multiple scales because the resulting stiffness forces the leapfrog integrator in HMC to take an unreasonably small stepsize. We provide intuition as well as a formal analysis showing how these multiscale distributions limit the stepsize of leapfrog and we show how the implicit midpoint method can be used, together with Newton-Krylov iteration, to circumvent this limitation and achieve major efficiency gains. Furthermore, we offer practical guidelines for when to choose between implicit midpoint and leapfrog and what stepsize to use for each method, depending on the distribution being sampled. Unlike previous modifications to HMC, our method is generally applicable to highly non-Gaussian distributions exhibiting multiple scales. We illustrate how our method can provide a dramatic speedup over leapfrog in the context of the No-U-Turn sampler (NUTS) applied to several examples.
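The stiffness limit described in this abstract is easy to demonstrate. The sketch below (a hypothetical illustration, not the paper's code) integrates a diagonal multiscale Gaussian with leapfrog: the integrator is stable only when the stepsize stays below 2/sqrt(lambda_max), where lambda_max is the largest precision eigenvalue, so the stiffest direction dictates the stepsize for the whole system.

```python
import numpy as np

variances = np.array([1.0, 1e-4])              # multiscale marginal variances
prec_diag = 1.0 / variances                    # precision eigenvalues
stability_limit = 2.0 / np.sqrt(prec_diag.max())  # leapfrog stability bound

def leapfrog_energy_error(step, n=100):
    """Integrate n leapfrog steps and return the Hamiltonian error."""
    q = np.sqrt(variances)   # start at 1 sigma in each dimension
    p = np.zeros(2)
    grad = lambda q: -prec_diag * q            # gradient of log density
    h0 = 0.5 * np.sum(prec_diag * q**2) + 0.5 * p @ p
    p = p + 0.5 * step * grad(q)
    for _ in range(n - 1):
        q = q + step * p
        p = p + step * grad(q)
    q = q + step * p
    p = p + 0.5 * step * grad(q)
    h1 = 0.5 * np.sum(prec_diag * q**2) + 0.5 * p @ p
    return abs(h1 - h0)

small = leapfrog_energy_error(0.5 * stability_limit)  # stable: bounded error
large = leapfrog_energy_error(2.0 * stability_limit)  # unstable: blows up
```

The implicit midpoint method proposed in the abstract is unconditionally stable for such quadratic Hamiltonians, which is what allows the much larger stepsizes.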
Changye Wu, 2018
Hamiltonian Monte Carlo samplers have become standard algorithms for MCMC implementations, as opposed to more bas
In this paper, we develop Bayesian Hamiltonian Monte Carlo methods for inference in asymmetric GARCH models under different distributions for the error term. We implemented Zero-variance and Hamiltonian Monte Carlo schemes for parameter estimation to reduce the standard errors of the estimates, thus obtaining more efficient results at the price of a small extra computational cost.
This paper studies a non-random-walk Markov Chain Monte Carlo method, namely the Hamiltonian Monte Carlo (HMC) method in the context of Subset Simulation used for structural reliability analysis. The HMC method relies on a deterministic mechanism inspired by Hamiltonian dynamics to propose samples following a target probability distribution. The method alleviates the random walk behavior to achieve a more effective and consistent exploration of the probability space compared to standard Gibbs or Metropolis-Hastings techniques. After a brief review of the basic concepts of the HMC method and its computational details, two algorithms are proposed to facilitate the application of the HMC method to Subset Simulation in structural reliability analysis. Next, the behavior of the two HMC algorithms is illustrated using simple probability distribution models. Finally, the accuracy and efficiency of Subset Simulation employing the two HMC algorithms are tested using various reliability examples. The supporting source code and data are available for download at (the URL that will become available once the paper is accepted).