Advances in sampling schemes for Markov jump processes have recently enabled multiple inferential tasks. However, in statistical and machine learning applications, these continuous-time models often need to find support on structured and infinite state spaces. In such cases, exact sampling may only be achievable through often-inefficient particle filtering procedures, and rapidly augmenting observed datasets remains a significant challenge. Here, we build on the principles of uniformization and present a tractable framework that addresses this problem: it greatly improves the efficiency of existing state-of-the-art methods commonly used in small finite-state systems and further scales their use to infinite-state scenarios. We exploit the marginal role that subsets of variables in a model hierarchy play during the process jumps, and we describe an algorithm that relies on measurable mappings between pairs of states and on carefully designed sets of synthetic jump observations. The proposed method enables the efficient integration of slice sampling techniques and can overcome the existing computational bottleneck. We provide supporting evidence through experiments on inference and clustering tasks with both simulated and real data sets.
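For context, the sketch below illustrates the uniformization principle that the framework builds on: trajectories of a Markov jump process with generator Q are simulated by drawing candidate jump times from a dominating Poisson process with rate Omega >= max_i |q_ii| and resampling states from the kernel B = I + Q/Omega, so that self-transitions act as virtual jumps. The function name, the toy generator, and the choice Omega = 2 max_i |q_ii| are illustrative assumptions and do not reproduce the algorithm proposed here.

```python
import numpy as np

def simulate_mjp_uniformization(Q, x0, T, rng=None):
    """Minimal sketch: draw one MJP trajectory on [0, T] via uniformization.

    Q is a finite-state generator matrix and x0 the initial state; both are
    illustrative placeholders rather than quantities from the paper.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = Q.shape[0]
    omega = 2.0 * np.max(-np.diag(Q))      # dominating rate, Omega >= max_i |q_ii|
    B = np.eye(n) + Q / omega              # transition kernel of the embedded chain
    # Candidate jump times come from a homogeneous Poisson process with rate Omega.
    num_candidates = rng.poisson(omega * T)
    times = np.sort(rng.uniform(0.0, T, size=num_candidates))
    states, x = [x0], x0
    for _ in times:
        x = rng.choice(n, p=B[x])          # staying in the same state is a virtual jump
        states.append(x)
    return np.concatenate(([0.0], times)), np.array(states)

# Toy usage on a three-state generator.
Q = np.array([[-1.0, 0.6, 0.4],
              [0.3, -0.8, 0.5],
              [0.2, 0.7, -0.9]])
jump_times, path = simulate_mjp_uniformization(Q, x0=0, T=5.0)
```

In infinite-state or hierarchical settings, the point made above is that such a construction must be combined with additional machinery, such as slice sampling over the relevant variable subsets, to remain tractable.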
We introduce the UPG package for highly efficient Bayesian inference in probit, logit, multinomial logit and binomial logit models. UPG offers a convenient estimation framework for balanced and imbalanced data settings where sampling efficiency is …
We consider Particle Gibbs (PG) as a tool for Bayesian analysis of non-linear non-Gaussian state-space models. PG is a Monte Carlo (MC) approximation of the standard Gibbs procedure which uses sequential MC (SMC) importance sampling inside the Gibbs …
This survey covers state-of-the-art Bayesian techniques for the estimation of mixtures. It complements the earlier Marin, Mengersen and Robert (2005) by studying new types of distributions, namely the multinomial, latent class and t distributions. It also …
We study the class of state-space models and perform maximum likelihood estimation for the model parameters. We consider a stochastic approximation expectation-maximization (SAEM) algorithm to maximize the likelihood function with the novelty of using …
Modeling binary and categorical data is one of the most commonly encountered tasks of applied statisticians and econometricians. While Bayesian methods in this context have been available for decades now, they often require a high level of familiarity …