For large-scale online inference problems, the update strategy is critical for performance. We derive an adaptive-scan Gibbs sampler that optimizes the update frequency by selecting an optimal mini-batch size. We demonstrate the performance of our adaptive batch-size Gibbs sampler by comparing it against the collapsed Gibbs sampler for the Bayesian Lasso, Dirichlet Process Mixture Models (DPMM), and Latent Dirichlet Allocation (LDA) graphical models.
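The abstract does not spell out the adaptation rule, so the following Python sketch only illustrates the general shape of such a sampler: a random-scan Gibbs sampler for a Gaussian target in which the mini-batch size is tuned online. The Gaussian target, the candidate batch sizes, and the squared-jump-distance criterion are all illustrative assumptions, not the paper's method.

```python
# Minimal sketch of a random-scan Gibbs sampler with an adaptive
# mini-batch size, for a zero-mean Gaussian target with precision Q.
# The adaptation rule (squared jump distance per coordinate update,
# chosen epsilon-greedily) is a hypothetical stand-in for the paper's
# criterion.
import numpy as np

rng = np.random.default_rng(0)

d = 10
A = rng.normal(size=(d, d))
Q = A @ A.T + d * np.eye(d)  # a well-conditioned precision matrix

def gibbs_update(x, idx):
    """Resample the coordinates in `idx` from their Gaussian full conditionals."""
    for i in idx:
        cond_var = 1.0 / Q[i, i]
        # conditional mean of x_i given the remaining coordinates
        cond_mean = -cond_var * (Q[i] @ x - Q[i, i] * x[i])
        x[i] = cond_mean + np.sqrt(cond_var) * rng.normal()
    return x

batch_sizes = [1, 2, 5, 10]
score = {b: 1.0 for b in batch_sizes}  # running ESJD-per-update estimates
x = rng.normal(size=d)

for t in range(5000):
    # epsilon-greedy selection of the mini-batch size (hypothetical rule)
    if rng.random() < 0.1:
        b = int(rng.choice(batch_sizes))
    else:
        b = max(score, key=score.get)
    idx = rng.choice(d, size=b, replace=False)
    x_old = x.copy()
    x = gibbs_update(x, idx)
    # reward: squared jump distance per conditional update, smoothed online
    jump = float(np.sum((x - x_old) ** 2)) / b
    score[b] = 0.99 * score[b] + 0.01 * jump

print("preferred mini-batch size:", max(score, key=score.get))
```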
Estimators computed from adaptively collected data do not behave like their non-adaptive brethren. Rather, the sequential dependence of the collection policy can lead to severe distributional biases that persist even in the infinite data limit.
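As a concrete illustration of this phenomenon, a short simulation shows that under an adaptive collection rule the plug-in sample mean of each arm is systematically biased. The greedy two-armed bandit below is our own toy example, not the paper's setting.

```python
# Toy simulation: under a greedy (adaptive) collection rule, the sample
# mean of each arm is negatively biased, even though both arms have the
# same true mean and the estimator is unbiased under a fixed design.
import numpy as np

rng = np.random.default_rng(1)
n_steps, n_reps = 50, 5000
errors = np.zeros(2)

for _ in range(n_reps):
    sums, counts = np.zeros(2), np.zeros(2)
    for t in range(n_steps):
        # pull each arm once, then always exploit the current leader
        a = t if t < 2 else int(np.argmax(sums / counts))
        sums[a] += rng.normal()       # both true arm means are 0
        counts[a] += 1
    errors += sums / counts           # error = estimate - true mean (0)

print("average error per arm:", errors / n_reps)
# both entries come out clearly negative: after unlucky early draws an
# arm is abandoned, so its underestimate is never corrected
```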
In this article, we derive a novel non-reversible, continuous-time Markov chain Monte Carlo (MCMC) sampler, called Coordinate Sampler, based on a piecewise deterministic Markov process (PDMP), which can be seen as a variant of the Zigzag sampler.
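The Coordinate Sampler itself is not specified by this excerpt, but a one-dimensional Zigzag sampler for a standard Gaussian target gives the flavour of this family of PDMP methods; the grid-based extraction of samples from the continuous trajectory is an illustrative choice.

```python
# Minimal 1D Zigzag sketch for a standard Gaussian target. The state is
# (position, velocity) with velocity in {-1, +1}; the position moves
# linearly until a random bounce event flips the velocity.
import numpy as np

rng = np.random.default_rng(2)
x, v = 0.0, 1.0          # position and unit velocity
T, t = 10_000.0, 0.0     # total trajectory length, current clock
samples, next_grid = [], 0.0

while t < T:
    # for pi = N(0,1) the bounce rate s units ahead is max(0, v*x + s),
    # so the event time comes from inverting the integrated rate
    # against an Exp(1) draw: tau = -a + sqrt(max(a,0)^2 + 2E), a = v*x
    a = v * x
    tau = -a + np.sqrt(max(a, 0.0) ** 2 + 2.0 * rng.exponential())
    # record the deterministic linear trajectory on a unit time grid
    while next_grid < t + tau and next_grid < T:
        samples.append(x + v * (next_grid - t))
        next_grid += 1.0
    x += v * tau         # move to the bounce point
    v = -v               # flip the velocity
    t += tau

samples = np.array(samples)
print("mean ~ 0:", samples.mean(), " var ~ 1:", samples.var())
```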
As well as primary fluctuations, CMB temperature maps contain a wealth of additional information in the form of secondary anisotropies. Secondary effects that can be identified with individual objects, such as the thermal and kinetic Sunyaev-Zel'dovich
In many fields, researchers are interested in discovering features with a substantial effect on the response from a large number of features while controlling the proportion of false discoveries. By incorporating the knockoff procedure in the Bayesian framework
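The Bayesian variant is not reproduced here, but the frequentist model-X knockoff filter it builds on can be sketched in the easy special case of independent standard-normal features, where a valid knockoff copy is simply a fresh Gaussian draw. The lasso-based statistics and the target FDR level are illustrative choices.

```python
# Sketch of the model-X knockoff filter of Candes et al. for
# independent N(0,1) features (Sigma = I), in which case an independent
# Gaussian copy is a valid knockoff matrix. A Bayesian variant would
# replace the lasso statistics W below with posterior-based ones.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(3)
n, p, k = 500, 100, 10
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:k] = 1.5                      # first k features are truly active
y = X @ beta + rng.normal(size=n)

X_knock = rng.normal(size=(n, p))   # exchangeable copies since Sigma = I
coef = Lasso(alpha=0.05).fit(np.hstack([X, X_knock]), y).coef_
W = np.abs(coef[:p]) - np.abs(coef[p:])   # knockoff statistics

# knockoff+ threshold at target FDR q
q = 0.1
tau = np.inf
for t in np.sort(np.abs(W[W != 0])):
    fdp_hat = (1 + np.sum(W <= -t)) / max(1, np.sum(W >= t))
    if fdp_hat <= q:
        tau = t
        break
selected = np.where(W >= tau)[0]
print("selected features:", selected)
```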
Discrete choice models describe the choices made by decision makers among alternatives and play an important role in transportation planning, marketing research, and other applications. The mixed multinomial logit (MMNL) model is a popular discrete choice model.
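For concreteness, a minimal simulated log-likelihood for an MMNL model with normally distributed random coefficients is sketched below; the dimensions, the diagonal normal mixing distribution, and the use of fresh Monte Carlo draws per evaluation are all simplifying assumptions.

```python
# Simulated log-likelihood of a mixed multinomial logit (MMNL) model:
# random coefficients beta ~ N(mu, diag(exp(log_sig)^2)) are integrated
# out by Monte Carlo. In practice the draws are fixed across evaluations
# (e.g. Halton sequences) before handing this to an optimiser.
import numpy as np

rng = np.random.default_rng(4)
N, J, K, R = 200, 3, 2, 500        # decision makers, alternatives, attributes, draws

# synthetic data generated from the model itself
X = rng.normal(size=(N, J, K))     # attributes of each alternative
beta_n = rng.normal(loc=[1.0, -1.0], scale=0.5, size=(N, K))
util = np.einsum('njk,nk->nj', X, beta_n) + rng.gumbel(size=(N, J))
y = util.argmax(axis=1)            # observed choices

def sim_loglik(theta):
    mu, log_sig = theta[:K], theta[K:]
    draws = mu + np.exp(log_sig) * rng.standard_normal((R, K))
    v = np.einsum('njk,rk->rnj', X, draws)        # utilities for each draw
    v -= v.max(axis=2, keepdims=True)             # stabilise the softmax
    p = np.exp(v) / np.exp(v).sum(axis=2, keepdims=True)
    p_choice = p[:, np.arange(N), y].mean(axis=0) # average over draws
    return float(np.log(p_choice).sum())

print(sim_loglik(np.array([1.0, -1.0, np.log(0.5), np.log(0.5)])))
```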