
Irreversible Samplers from Jump and Continuous Markov Processes

Added by Yi-An Ma
Publication date: 2016
Language: English





In this paper, we propose irreversible …



Related research

This paper describes the structure of solutions to Kolmogorov's equations for nonhomogeneous jump Markov processes and applications of these results to the control of jump stochastic systems. These equations were studied by Feller (1940), who clarified in a 1945 erratum to that paper that some of its results covered only nonexplosive Markov processes. We present results for possibly explosive Markov processes. The paper is based on the invited talk given by the authors at the International Conference dedicated to the 200th anniversary of the birth of P. L. Chebyshev.
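
For orientation, the equations in question can be written in one common notation (not necessarily the paper's own): let P(s,x;t,B) be the transition function, let q(x,s,\Gamma) be the rate of jumping from x at time s into a set \Gamma \subseteq X \setminus \{x\}, and let q(x,s) = q(x,s, X \setminus \{x\}) be the total jump rate. Then the backward and forward Kolmogorov equations read

  \[ \frac{\partial}{\partial s} P(s,x;t,B) = q(x,s)\,P(s,x;t,B) - \int_{X \setminus \{x\}} P(s,y;t,B)\, q(x,s,dy), \]
  \[ \frac{\partial}{\partial t} P(s,x;t,B) = \int_{X} q\big(y,t,B \setminus \{y\}\big)\, P(s,x;t,dy) - \int_{B} q(y,t)\, P(s,x;t,dy), \]

with boundary condition P(t,x;t,B) = \mathbf{1}_B(x). For explosive processes these equations may admit several solutions, which is why the structure of the solution set matters.
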
This paper extends to Continuous-Time Jump Markov Decision Processes (CTJMDPs) the classic result for Markov Decision Processes stating that, for a given initial state distribution and for every policy, there is a (randomized) Markov policy, defined in a natural way, such that at each time instant the marginal distributions of state-action pairs under the two policies coincide. It is shown in this paper that this equality holds for a CTJMDP if the corresponding Markov policy defines a nonexplosive jump Markov process. If this Markov process is explosive, then at each time instant the marginal probability that a state-action pair belongs to a measurable set of state-action pairs is not greater under the described Markov policy than under the original policy. These results are used to prove that, for expected discounted total costs and for average costs per unit time, and for a given initial state distribution, the described Markov policy performs at least as well as the original policy for a CTJMDP.
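
In symbols (this notation is mine, not the paper's): writing \pi for an arbitrary policy and \sigma for the induced Markov policy, the nonexplosive case gives, for every time t and all measurable sets of states A and actions C,

  \[ P^{\sigma}\big(x_t \in A,\ a_t \in C\big) = P^{\pi}\big(x_t \in A,\ a_t \in C\big), \]

while in the explosive case only the inequality

  \[ P^{\sigma}\big(x_t \in A,\ a_t \in C\big) \le P^{\pi}\big(x_t \in A,\ a_t \in C\big) \]

is guaranteed; this equality/inequality on marginals is what drives the cost comparison stated in the abstract.
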
This paper introduces a novel methodology for identifying the switching dynamics of switched autoregressive linear models. The switching behavior is assumed to follow a Markov model. The system's outputs are contaminated by possibly large measurement noise. Although the procedure can handle other noise distributions, for simplicity the noise is assumed to be Normal with unknown variance. Given noisy input-output data, we aim to identify the switched system coefficients, the parameters of the noise distribution, the switching dynamics, and the transition probability matrix of the Markov model. The system dynamics are estimated using previous results that exploit algebraic constraints the system trajectories must satisfy. The switching dynamics are computed by solving a maximum likelihood estimation problem. The efficiency of the proposed approach is demonstrated on several academic examples. Although the noise-to-output ratio can be high, the method is shown to be highly effective when a large number of measurements is available.
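
As a generic instance of this model class (the paper's exact model orders and parameterization may differ), a Markov-switched autoregressive model with exogenous input can be written as

  \[ y_t = \sum_{i=1}^{n_a} a_i(\sigma_t)\, y_{t-i} + \sum_{j=1}^{n_b} b_j(\sigma_t)\, u_{t-j} + e_t, \qquad e_t \sim \mathcal{N}(0,\sigma^2), \]

where the discrete mode \sigma_t \in \{1,\dots,K\} evolves as a Markov chain with transition matrix \Pi, \Pi_{k\ell} = P(\sigma_{t+1} = \ell \mid \sigma_t = k). The quantities to be identified are then the coefficients a_i(k) and b_j(k), the noise variance \sigma^2, the mode sequence, and \Pi.
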
We develop clustering procedures for longitudinal trajectories based on a continuous-time hidden Markov model (CTHMM) and a generalized linear observation model. Specifically, we carry out finite and infinite mixture model-based clustering for a CTHMM and perform inference using Markov chain Monte Carlo (MCMC). For a finite mixture model with a prior on the number of components, we implement reversible-jump MCMC to facilitate trans-dimensional moves between different numbers of clusters. For a Dirichlet process mixture model, we use restricted Gibbs sampling with split-merge proposals to expedite the MCMC algorithm. We apply the proposed algorithms to simulated data as well as a real data example, and the results demonstrate the desired performance of the new sampler.
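
Schematically (the paper's exact priors and parameterization may differ), the finite-mixture variant can be read as the hierarchy

  \[ K \sim p(K), \qquad w \mid K \sim \mathrm{Dirichlet}(\gamma,\dots,\gamma), \qquad z_i \mid w \sim \mathrm{Categorical}(w), \qquad \text{trajectory}_i \mid z_i = k \sim \mathrm{CTHMM}(\theta_k), \]

with a generalized linear model mapping hidden states to observations. Reversible-jump moves then change K, while in the Dirichlet process version the restricted Gibbs split-merge proposals play the analogous role of reallocating trajectories across clusters.
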
Time series datasets often contain heterogeneous signals composed of both continuously changing quantities and discretely occurring events. The coupling between these measurements may provide insight into key underlying mechanisms of the systems under study. To better extract this information, we investigate the asymptotic statistical properties of coupling measures between continuous signals and point processes. We first introduce martingale stochastic integration theory as a mathematical model for a family of statistical quantities that includes the Phase Locking Value, a classical coupling measure used to characterize complex dynamics. Based on the martingale Central Limit Theorem, we derive the asymptotic Gaussian distribution of estimates of such coupling measures, which can be exploited for statistical testing. Second, based on multivariate extensions of this result and on Random Matrix Theory, we establish a principled way to analyze the low-rank coupling between a large number of point processes and continuous signals. Under a null hypothesis of no coupling, we establish sufficient conditions for the empirical distribution of the squared singular values of the coupling matrix to converge, as the number of measured signals increases, to the well-known Marchenko-Pastur (MP) law, and for the largest squared singular value to converge to the upper edge of the MP law's support. This justifies a simple thresholding approach for assessing the significance of multivariate coupling. Finally, we illustrate with simulations the relevance of our univariate and multivariate results in the context of neural time series, addressing how to reliably quantify the interplay between multi-channel Local Field Potential signals and the spiking activity of a large population of neurons.
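
The MP-edge thresholding idea can be illustrated with a short, self-contained sketch. This is not the authors' code or estimator; the sizes, firing rate, phase model, and normalizations below are assumptions chosen so that the coupling-matrix entries have roughly unit variance under the null of no coupling.

  import numpy as np

  rng = np.random.default_rng(0)
  T, n_lfp, n_units = 20000, 32, 64                # time bins, continuous channels, point processes (hypothetical sizes)

  # Null model: channels reduced to random phases, independent Bernoulli spike trains (no coupling).
  phases = rng.uniform(0.0, 2.0 * np.pi, size=(T, n_lfp))
  Z = np.exp(1j * phases)                          # unit-modulus phase factor per channel and time bin
  spikes = (rng.random((T, n_units)) < 0.02).astype(float)

  # PLV-style coupling matrix: sum of phase factors at spike times for each (channel, unit) pair,
  # scaled by sqrt(spike count) so entries are O(1) under the null.
  counts = np.maximum(spikes.sum(axis=0), 1.0)
  C = (Z.conj().T @ spikes) / np.sqrt(counts)

  # Under the null, the squared singular values of C / sqrt(n_units) approximately follow the
  # Marchenko-Pastur law with aspect ratio gamma = n_lfp / n_units and upper edge (1 + sqrt(gamma))^2
  # (unit entry variance assumed).
  s2 = np.linalg.svd(C, compute_uv=False) ** 2 / n_units
  gamma = n_lfp / n_units
  mp_upper = (1.0 + np.sqrt(gamma)) ** 2

  print(f"largest squared singular value: {s2.max():.3f}")
  print(f"MP upper edge:                  {mp_upper:.3f}")
  print("significant low-rank coupling:", bool(s2.max() > mp_upper))

With coupled data the largest squared singular value would overshoot the MP edge, which is exactly the thresholding decision the abstract describes.
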
