
Uncertainty and filtering of hidden Markov models in discrete time

Posted by: Samuel Cohen
Publication date: 2016
Research field: Mathematical Statistics
Language: English
Author: Samuel N. Cohen





We consider the problem of filtering an unseen Markov chain from noisy observations, in the presence of uncertainty regarding the parameters of the processes involved. Using the theory of nonlinear expectations, we describe the uncertainty in terms of a penalty function, which can be propagated forward in time in the place of the filter.
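For context, the recursion that the penalty-based description generalizes is the classical discrete-time filter, which propagates a probability vector over the hidden states by a predict/correct step. The sketch below is not taken from the paper and uses an invented two-state toy model; it assumes the transition matrix and emission law are known, whereas the paper's point is precisely that they are uncertain, so a penalty function over candidate models is propagated in place of this single probability vector.

```python
import numpy as np

def hmm_filter_step(p, A, lik):
    """One step of the classical discrete-time HMM filter (Bayes recursion).

    p   : current filtered distribution over hidden states, shape (d,)
    A   : transition matrix, A[i, j] = P(X_{t+1}=j | X_t=i), shape (d, d)
    lik : likelihood of the new observation under each hidden state, shape (d,)
    """
    pred = A.T @ p               # predict: one-step-ahead state distribution
    post = lik * pred            # correct: weight by the observation likelihood
    return post / post.sum()     # normalise (Bayes' rule)

# Toy example: a 2-state chain observed in Gaussian noise.
A = np.array([[0.9, 0.1],
              [0.2, 0.8]])
means, sigma = np.array([0.0, 1.0]), 0.5
rng = np.random.default_rng(0)

p, x = np.array([0.5, 0.5]), 0
for t in range(20):
    x = rng.choice(2, p=A[x])                         # hidden chain step
    y = rng.normal(means[x], sigma)                   # noisy observation
    lik = np.exp(-0.5 * ((y - means) / sigma) ** 2)   # Gaussian likelihood per state (up to a constant)
    p = hmm_filter_step(p, A, lik)
print("filtered state probabilities:", p)
```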




Read also

Andrew L. Allan (2020)
We consider the filtering of continuous-time finite-state hidden Markov models, where the rate and observation matrices depend on unknown time-dependent parameters, for which no prior or stochastic model is available. We quantify and analyze how the induced uncertainty may be propagated through time as we collect new observations, and used to simultaneously provide robust estimates of the hidden signal and to learn the unknown parameters, via techniques based on pathwise filtering and new results on the optimal control of rough differential equations.
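The pathwise and rough-path machinery of this paper is not sketched here; the snippet below only illustrates, naively, how uncertainty about an unknown rate parameter can be carried through time by running the same discretised filter under several candidate rate matrices. The two-state model, the observation model, and the grid `candidate_rates` are all invented for illustration.

```python
import numpy as np
from scipy.linalg import expm

d, dt, sigma = 2, 0.1, 0.4
means = np.array([0.0, 1.0])          # state-dependent observation means

def Q(rate):
    """Rate matrix of a symmetric 2-state chain with switching rate `rate`."""
    return np.array([[-rate, rate],
                     [rate, -rate]])

def filter_path(ys, rate):
    """Discrete-time filter for observations at spacing dt, under a candidate rate."""
    P = expm(Q(rate) * dt)            # transition matrix over one observation interval
    p = np.full(d, 1.0 / d)
    for y in ys:
        lik = np.exp(-0.5 * ((y - means) / sigma) ** 2)
        p = lik * (P.T @ p)
        p /= p.sum()
    return p

# Simulate data from the "true" rate, then filter under several candidates.
rng = np.random.default_rng(1)
true_rate, n = 1.0, 200
P_true = expm(Q(true_rate) * dt)
x, ys = 0, []
for _ in range(n):
    x = rng.choice(d, p=P_true[x])
    ys.append(rng.normal(means[x], sigma))

candidate_rates = [0.5, 1.0, 2.0]     # hypothetical parameter grid (no prior assumed)
for r in candidate_rates:
    print(r, filter_path(ys, r))      # terminal filtered distribution under each candidate
```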
We develop clustering procedures for longitudinal trajectories based on a continuous-time hidden Markov model (CTHMM) and a generalized linear observation model. Specifically, we carry out finite and infinite mixture model-based clustering for a CTHMM and achieve inference using Markov chain Monte Carlo (MCMC). For a finite mixture model with a prior on the number of components, we implement reversible-jump MCMC to facilitate trans-dimensional moves between different numbers of clusters. For a Dirichlet process mixture model, we utilize restricted Gibbs sampling split-merge proposals to expedite the MCMC algorithm. We apply the proposed algorithms to simulated data as well as a real data example, and the results demonstrate the desired performance of the new sampler.
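As a companion to the model description (the MCMC samplers themselves are not sketched), here is a minimal simulation of the kind of data a CTHMM with a generalized linear observation model produces. The two-state rate matrix, the Poisson emission with a log link, and all constants are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical 2-state CTHMM: rate matrix of the latent continuous-time jump process.
Q = np.array([[-0.5,  0.5],
              [ 0.3, -0.3]])
# GLM-style emission: Poisson counts with a state-dependent log-rate (log link).
log_rates = np.array([0.0, 1.5])

def simulate_cthmm(T, obs_times):
    """Simulate the latent jump process on [0, T] and Poisson observations
    at the given observation times."""
    t, state, jumps = 0.0, 0, [(0.0, 0)]
    while True:
        t += rng.exponential(1.0 / -Q[state, state])     # exponential holding time
        if t >= T:
            break
        probs = Q[state].copy()
        probs[state] = 0.0
        probs /= probs.sum()                             # jump distribution
        state = rng.choice(len(Q), p=probs)
        jumps.append((t, state))
    obs = []
    for s in obs_times:
        cur = [st for (tt, st) in jumps if tt <= s][-1]  # latent state at time s
        obs.append(rng.poisson(np.exp(log_rates[cur])))
    return jumps, obs

jumps, obs = simulate_cthmm(T=10.0, obs_times=np.linspace(0.5, 9.5, 19))
print("first jumps:", jumps[:3], " first counts:", obs[:5])
```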
We consider the problem of flexible modeling of higher order hidden Markov models when the number of latent states and the nature of the serial dependence, including the true order, are unknown. We propose Bayesian nonparametric methodology based on tensor factorization techniques that can characterize any transition probability with a specified maximal order, allowing automated selection of the important lags and capturing higher order interactions among the lags. Theoretical results provide insights into identifiability of the emission distributions and asymptotic behavior of the posterior. We design efficient Markov chain Monte Carlo algorithms for posterior computation. In simulation experiments, the method vastly outperformed its first and higher order competitors not just in higher order settings, but, remarkably, also in first order cases. Practical utility is illustrated using real world applications.
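The Bayesian tensor-factorization prior of this paper is not reproduced here. The sketch below uses a simpler, related way of parameterising a higher-order transition tensor compactly, a mixture-of-lags (MTD-type) construction, only to make concrete what a transition probability depending on several lags looks like; all matrices and weights are invented.

```python
import numpy as np

rng = np.random.default_rng(3)
d, order = 3, 2                       # 3 states, second-order dependence

# Mixture-of-lags parameterisation of a higher-order transition tensor:
#   P(x_t | x_{t-1}, x_{t-2}) = sum_j lam[j] * Q[j][x_{t-j-1}, x_t]
# (a related low-dimensional construction, not the paper's tensor factorisation).
lam = np.array([0.7, 0.3])            # weight of each lag
Q = [np.array([[0.8, 0.1, 0.1],
               [0.1, 0.8, 0.1],
               [0.1, 0.1, 0.8]]),
     np.array([[0.4, 0.3, 0.3],
               [0.3, 0.4, 0.3],
               [0.3, 0.3, 0.4]])]

def transition(history):
    """Transition distribution given the last `order` states (most recent first)."""
    return sum(lam[j] * Q[j][history[j]] for j in range(order))

# Simulate a short chain from this higher-order model.
hist, xs = [0, 0], []
for _ in range(10):
    x = rng.choice(d, p=transition(hist))
    xs.append(x)
    hist = [x, hist[0]]
print(xs)
```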
We consider the modeling of data generated by a latent continuous-time Markov jump process with a state space of finite but unknown dimension. Typically in such models, the number of states has to be pre-specified, and Bayesian inference for a fixed number of states has not been studied until recently. In addition, although approaches to address the problem for discrete-time models have been developed, no method has been successfully implemented for the continuous-time case. We focus on reversible-jump Markov chain Monte Carlo, which allows trans-dimensional moves among different numbers of states in order to perform Bayesian inference for the unknown number of states. Specifically, we propose an efficient split-combine move which can facilitate the exploration of the parameter space, and demonstrate that it can be implemented effectively at scale. Subsequently, we extend this algorithm to the context of model-based clustering, allowing the numbers of states and clusters to both be determined during the analysis. The model formulation, inference methodology, and associated algorithm are illustrated by simulation studies. Finally, we apply this method to real data from a Canadian healthcare system in Quebec.
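The specific split-combine proposal of this paper is not reconstructed here; the snippet only records the generic Metropolis-Hastings-Green acceptance ratio that any reversible-jump move between models with different numbers of states must satisfy. All argument names are illustrative.

```python
import numpy as np

def rj_accept_prob(log_post_new, log_post_old,
                   log_q_reverse, log_q_forward,
                   log_abs_det_jacobian):
    """Generic reversible-jump (Metropolis-Hastings-Green) acceptance probability.

    log_post_*           : log posterior of the proposed / current state,
                           including the prior on the number of states
    log_q_reverse        : log density of proposing the reverse move (e.g. combine)
    log_q_forward        : log density of the forward move (e.g. split),
                           including the auxiliary variables that raise the dimension
    log_abs_det_jacobian : log |det| of the dimension-matching transformation
    """
    log_ratio = (log_post_new - log_post_old
                 + log_q_reverse - log_q_forward
                 + log_abs_det_jacobian)
    return min(1.0, float(np.exp(log_ratio)))

# Usage inside a sampler:  accept = rng.uniform() < rj_accept_prob(...)
```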
This paper gives a method for computing distributions associated with patterns in the state sequence of a hidden Markov model, conditional on observing all or part of the observation sequence. Probabilities are computed for very general classes of patterns (competing patterns and generalized later patterns), and thus, the theory includes as special cases results for a large class of problems that have wide application. The unobserved state sequence is assumed to be Markovian with a general order of dependence. An auxiliary Markov chain is associated with the state sequence and is used to simplify the computations. Two examples are given to illustrate the use of the methodology. Whereas the first application is more to illustrate the basic steps in applying the theory, the second is a more detailed application to DNA sequences, and shows that the methods can be adapted to include restrictions related to biological knowledge.
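To make the auxiliary-chain idea concrete (ignoring the conditioning on observations that the paper handles), here is a minimal Markov chain embedding that computes the probability of a simple pattern, two consecutive visits to state 0, within the first n steps of a two-state chain. The chain, the pattern, and n are invented for illustration.

```python
import numpy as np

# Hypothetical 2-state Markov chain for the (unconditioned) state sequence.
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])
pi0 = np.array([0.5, 0.5])     # initial distribution

# Auxiliary chain tracking progress toward the pattern "state 0 twice in a row":
#   aux 0: last step was state 1 (no progress)
#   aux 1: last step was state 0 (one symbol matched)
#   aux 2: pattern already completed (absorbing)
T = np.array([[A[1, 1], A[1, 0], 0.0],
              [A[0, 1], 0.0,     A[0, 0]],
              [0.0,     0.0,     1.0]])

aux = np.array([pi0[1], pi0[0], 0.0])   # progress distribution after the first step
n = 10
for _ in range(n - 1):
    aux = aux @ T                       # propagate the auxiliary chain
print(f"P(pattern '0,0' occurs within {n} steps) = {aux[2]:.4f}")
```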