
The FMRIB Variational Bayesian Inference Tutorial II: Stochastic Variational Bayes

Posted by Michael Chappell
Publication date: 2020
Research language: English


Bayesian methods have proved powerful in many applications for the inference of model parameters from data. These methods are based on Bayes' theorem, which itself is deceptively simple. However, in practice the computations required are intractable even for simple cases. Hence methods for Bayesian inference have historically either been significantly approximate, e.g., the Laplace approximation, or achieved samples from the exact solution only at significant computational expense, e.g., Markov Chain Monte Carlo methods. Since around the year 2000, so-called variational approaches to Bayesian inference have been increasingly deployed. In its most general form, Variational Bayes (VB) involves approximating the true posterior probability distribution via another, more manageable distribution, the aim being to achieve as good an approximation as possible. In the original FMRIB Variational Bayes tutorial we documented an approach to VB that took a mean field approach to forming the approximate posterior, required the conjugacy of prior and likelihood, and exploited the calculus of variations to derive an iterative series of update equations, akin to Expectation Maximisation. In this tutorial we revisit VB, but now take a stochastic approach to the problem that potentially circumvents some of the limitations imposed by the earlier methodology. This new approach bears a lot of similarity to, and has benefited from, computational methods applied to machine learning algorithms. What we document here is, however, still recognisably Bayesian inference in the classic sense, and not an attempt to use machine learning as a black box to solve the inference problem.
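To make the stochastic approach concrete, here is a minimal sketch of stochastic VB using the reparameterisation trick, written in PyTorch as an illustrative assumption (the tutorial itself does not prescribe a library). It fits a Gaussian approximate posterior to the slope of a toy linear model by gradient ascent on a Monte Carlo estimate of the free energy (ELBO); the model, priors, and optimiser settings are all illustrative choices, not values from the tutorial.

```python
# Minimal sketch of stochastic VB with the reparameterisation trick (PyTorch).
# Approximate posterior q(theta) = N(mu, sigma^2) over the slope of
# y = theta * x + noise, fitted by maximising a Monte Carlo ELBO estimate.
import torch

torch.manual_seed(0)

# Synthetic data: y = 2.5 * x + Gaussian noise (sd = 0.5, assumed known here)
x = torch.linspace(0, 1, 50)
y = 2.5 * x + 0.5 * torch.randn(50)
noise_sd = 0.5

# Weakly informative prior on theta: N(0, 10^2)
prior = torch.distributions.Normal(0.0, 10.0)

# Variational parameters of q(theta); log_sigma keeps sigma positive
# during unconstrained optimisation
mu = torch.zeros(1, requires_grad=True)
log_sigma = torch.zeros(1, requires_grad=True)

opt = torch.optim.Adam([mu, log_sigma], lr=0.05)
n_samples = 10                       # Monte Carlo samples per update

for step in range(2000):
    opt.zero_grad()
    sigma = log_sigma.exp()
    # Reparameterise: theta = mu + sigma * eps, eps ~ N(0, 1), so gradients
    # flow through the sampling step to the variational parameters
    eps = torch.randn(n_samples, 1)
    theta = mu + sigma * eps
    # Monte Carlo ELBO: E_q[log p(y|theta) + log p(theta) - log q(theta)]
    log_lik = torch.distributions.Normal(theta * x, noise_sd).log_prob(y).sum(dim=1)
    log_prior = prior.log_prob(theta).squeeze(-1)
    log_q = torch.distributions.Normal(mu, sigma).log_prob(theta).squeeze(-1)
    elbo = (log_lik + log_prior - log_q).mean()
    (-elbo).backward()               # maximise ELBO = minimise -ELBO
    opt.step()

print(f"q(theta): mean ~ {mu.item():.3f}, sd ~ {log_sigma.exp().item():.3f}")
```

Because the sampling step is reparameterised, stochastic gradient ascent on the noisy ELBO estimate is all that is required; no conjugacy between prior and likelihood is assumed, which is the key freedom relative to the analytical scheme of the original tutorial.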


Read also

Jean Daunizeau, 2017
Variational approaches to approximate Bayesian inference provide very efficient means of performing parameter estimation and model selection. Among these, so-called variational-Laplace or VL schemes rely on Gaussian approximations to posterior densities on model parameters. In this note, we review the main variants of VL approaches that follow from considering nonlinear models of continuous and/or categorical data. En passant, we also derive a few novel theoretical results that complete the portfolio of existing analyses of variational Bayesian approaches, including investigations of their asymptotic convergence. We also suggest practical ways of extending existing VL approaches to hierarchical generative models that include (e.g., precision) hyperparameters.
We develop a variational Bayesian (VB) approach for estimating large-scale dynamic network models in the network autoregression framework. The VB approach allows for the automatic identification of the dynamic structure of such a model and obtains a direct approximation of the posterior density. Compared to Markov Chain Monte Carlo (MCMC) based sampling approaches, the VB approach achieves enhanced computational efficiency without sacrificing estimation accuracy. In the simulation study conducted here, the proposed VB approach correctly detects various types of active structure in dynamic network models. Compared to the alternative approach, the proposed method achieves similar or better accuracy, and its computational time is halved. In a real data analysis of day-ahead natural gas flow prediction in the German gas transmission network, with 51 nodes, between October 2013 and September 2015, the VB approach delivers promising forecasting accuracy along with clearly detected structures in terms of dynamic dependence.
This paper presents studies on a deterministic annealing algorithm based on quantum annealing for variational Bayes (QAVB) inference, which can be seen as an extension of the simulated annealing for variational Bayes (SAVB) inference. QAVB is as easy as SAVB to implement. Experiments revealed QAVB finds a better local optimum than SAVB in terms of the variational free energy in latent Dirichlet allocation (LDA).
Variational Bayes (VB) has been used to facilitate the calculation of the posterior distribution in the context of Bayesian inference of the parameters of nonlinear models from data. Previously, an analytical formulation of VB was derived for nonlinear model inference on data with additive Gaussian noise, as an alternative to nonlinear least squares. Here a stochastic solution is derived that avoids some of the approximations required by the analytical formulation, offering a solution that can be more flexibly deployed for nonlinear model inference problems. The stochastic VB solution was used for inference on a biexponential toy case, and the algorithmic parameter space was explored, before the method was deployed on real data from a magnetic resonance imaging study of perfusion. The new method was found to achieve parameter recovery comparable to the analytic solution and to be competitive in terms of computational speed despite being reliant on sampling.
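As a companion to the abstract above, the following sketch applies the same reparameterised ELBO scheme as the earlier example to a biexponential toy model. PyTorch, the true parameter values, the known noise level, and the log-space parameterisation are all assumptions made for illustration, not details taken from the paper.

```python
# Sketch of stochastic VB on a biexponential toy model:
# y(t) = A1*exp(-R1*t) + A2*exp(-R2*t) + noise, with a factorised Gaussian q
# over the four log-parameters (log space keeps amplitudes and rates positive).
import torch

torch.manual_seed(0)
t = torch.linspace(0, 5, 100)
# Illustrative ground truth: A1=10, R1=1, A2=10, R2=10; noise sd = 1 (known)
y = (10.0 * torch.exp(-1.0 * t)
     + 10.0 * torch.exp(-10.0 * t)
     + 1.0 * torch.randn(100))

# q(log theta) = N(mu, diag(sigma^2)) over [log A1, log R1, log A2, log R2];
# rough asymmetric log-space initialisation breaks the R1/R2 symmetry
mu = torch.tensor([1.0, -0.5, 1.0, 1.5], requires_grad=True)
log_sigma = torch.full((4,), -2.0, requires_grad=True)
prior = torch.distributions.Normal(torch.zeros(4), 2.0 * torch.ones(4))
opt = torch.optim.Adam([mu, log_sigma], lr=0.02)

for step in range(5000):
    opt.zero_grad()
    sigma = log_sigma.exp()
    eps = torch.randn(16, 4)                     # 16 MC samples per step
    log_theta = mu + sigma * eps                 # reparameterisation trick
    A1, R1, A2, R2 = log_theta.exp().unbind(dim=1)
    pred = (A1[:, None] * torch.exp(-R1[:, None] * t)
            + A2[:, None] * torch.exp(-R2[:, None] * t))
    log_lik = torch.distributions.Normal(pred, 1.0).log_prob(y).sum(dim=1)
    log_prior = prior.log_prob(log_theta).sum(dim=1)
    log_q = torch.distributions.Normal(mu, sigma).log_prob(log_theta).sum(dim=1)
    elbo = (log_lik + log_prior - log_q).mean()
    (-elbo).backward()
    opt.step()

print("approx. posterior medians (A1, R1, A2, R2):",
      mu.exp().detach().numpy().round(2))
```

Only the forward model changes relative to the linear example; the inference loop is identical, which illustrates the flexibility for nonlinear models that the abstract describes.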
