On Bayesian inferential tasks with infinite-state jump processes: efficient data augmentation


Abstract

Advances in sampling schemes for Markov jump processes have recently enabled a range of inferential tasks. However, in statistical and machine learning applications, these continuous-time models are often required to be supported on structured, infinite state spaces. In such cases, exact sampling may only be achievable through particle filtering procedures that are often inefficient, and rapidly augmenting observed datasets remains a significant challenge. Here, we build on the principles of uniformization and present a tractable framework to address this problem; it greatly improves the efficiency of existing state-of-the-art methods commonly applied to small finite-state systems and further scales their use to infinite-state scenarios. We exploit the fact that subsets of variables in a model hierarchy play only a marginal role during the process jumps, and we describe an algorithm that relies on measurable mappings between pairs of states and carefully designed sets of synthetic jump observations. The proposed method enables the efficient integration of slice sampling techniques and can overcome the existing computational bottleneck. We provide supporting evidence through experiments on inference and clustering tasks, using both simulated and real data sets.
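As background for the uniformization principle the abstract refers to, the sketch below illustrates (under assumed inputs, not the authors' method) how uniformization turns a finite-state Markov jump process with generator Q into a discrete-time chain subordinated to a Poisson process with a dominating rate Omega. The generator Q, the choice Omega = 2 max_i |Q_ii|, and the helper name uniformized_path are illustrative assumptions; the paper's contribution, extending such augmentation to structured infinite state spaces with synthetic jump observations and slice sampling, is not shown here.

    import numpy as np

    def uniformized_path(Q, pi0, T, rng=None):
        """Sample a finite-state Markov jump process path on [0, T] via uniformization.

        Q   : (S, S) generator matrix (off-diagonals >= 0, rows sum to zero) -- assumed input.
        pi0 : initial distribution over the S states.
        T   : time horizon.
        Returns a list of (time, state) pairs starting at time 0.0.
        """
        rng = np.random.default_rng() if rng is None else rng
        S = Q.shape[0]
        # Dominating rate: any Omega >= max_i |Q_ii| works; 2x the max is a common choice.
        omega = 2.0 * np.max(-np.diag(Q))
        # Transition matrix of the embedded discrete-time chain (allows virtual self-jumps).
        B = np.eye(S) + Q / omega
        # Candidate jump times from a homogeneous Poisson process with rate Omega on [0, T].
        n_jumps = rng.poisson(omega * T)
        times = np.sort(rng.uniform(0.0, T, size=n_jumps))
        state = rng.choice(S, p=pi0)
        path = [(0.0, state)]
        for t in times:
            new_state = rng.choice(S, p=B[state])
            if new_state != state:  # keep only real transitions, discard virtual self-jumps
                path.append((t, new_state))
                state = new_state
        return path

    # Example usage with a hypothetical 3-state generator.
    Q = np.array([[-1.0, 0.6, 0.4],
                  [0.5, -1.5, 1.0],
                  [0.2, 0.8, -1.0]])
    path = uniformized_path(Q, pi0=np.array([1.0, 0.0, 0.0]), T=5.0)

The virtual self-jumps discarded above are precisely the auxiliary variables that uniformization-based data augmentation schemes retain and resample; handling them efficiently on infinite state spaces is the bottleneck the abstract targets.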
