Separating effect from significance in Markov chain tests

Posted by Wesley Pegden
Publication date: 2019
Paper language: English

We give qualitative and quantitative improvements to theorems which enable significance testing in Markov Chains, with a particular eye toward the goal of enabling strong, interpretable, and statistically rigorous claims of political gerrymandering. Our results can be used to demonstrate at a desired significance level that a given Markov Chain state (e.g., a districting) is extremely unusual (rather than just atypical) with respect to the fragility of its characteristics in the chain. We also provide theorems specialized to leverage quantitative improvements when there is a product structure in the underlying probability space, as can occur due to geographical constraints on districtings.
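The style of test these theorems sharpen runs the reversible chain from the observed state and asks how extreme that state is among the states the trajectory visits; under the earlier outlier theorems in this line of work, an epsilon-outlier can be reported as significant at roughly the sqrt(2*epsilon) level. Below is a minimal Python sketch of that local outlier statistic, offered only as an illustration of the setup and not as code from the paper; the step function `step`, the label function `label`, and the chain length are hypothetical placeholders.

import random

def outlier_rank(state0, step, label, n_steps, seed=0):
    """Run a reversible chain from state0 and return the fraction of visited
    states (including state0) whose label is at least as extreme as label(state0).
    A very small fraction epsilon is the kind of outlier event that the
    significance theorems discussed above turn into a p-value of order sqrt(epsilon)."""
    rng = random.Random(seed)
    x0 = label(state0)
    state = state0
    at_least_as_extreme = 1  # count the starting state itself
    for _ in range(n_steps):
        state = step(state, rng)   # one step of the reversible Markov chain
        if label(state) >= x0:     # convention here: larger label = more extreme
            at_least_as_extreme += 1
    return at_least_as_extreme / (n_steps + 1)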



Read also

We consider the connections among 'clumped' residual allocation models (RAMs), a general class of stick-breaking processes including Dirichlet processes, and the occupation laws of certain discrete-space time-inhomogeneous Markov chains related to simulated annealing and other applications. An intermediate structure is introduced in a given RAM, where proportions between successive indices in a list are added or clumped together to form another RAM. In particular, when the initial RAM is a Griffiths-Engen-McCloskey (GEM) sequence and the indices are given by the random times that an auxiliary Markov chain jumps away from its current state, the joint law of the intermediate RAM and the locations visited in the sojourns is given in terms of a 'disordered' GEM sequence and an induced Markov chain. Through this joint law, we identify a large class of 'stick-breaking' processes as the limits of empirical occupation measures for associated time-inhomogeneous Markov chains.
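As a concrete reference point for the stick-breaking terminology above, the following Python sketch draws a truncated GEM(theta) sequence, the kind of initial RAM the abstract starts from, via the standard stick-breaking recursion p_i = v_i * prod_{j<i}(1 - v_j) with v_j ~ Beta(1, theta); the parameter theta and the truncation level n are illustrative choices, not values from the paper.

import random

def gem_weights(theta, n, seed=0):
    """Truncated stick-breaking sample from a GEM(theta) sequence."""
    rng = random.Random(seed)
    weights, remaining = [], 1.0
    for _ in range(n):
        v = rng.betavariate(1.0, theta)   # v ~ Beta(1, theta)
        weights.append(remaining * v)     # break off a v-fraction of the remaining stick
        remaining *= (1.0 - v)
    return weights  # mass left after n breaks is discarded by the truncation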
This review paper provides an introduction to Markov chains and their convergence rates, an important and interesting mathematical topic that also has important applications for the widely used Markov chain Monte Carlo (MCMC) algorithms. We first discuss eigenvalue analysis for Markov chains on finite state spaces. Then, using the coupling construction, we prove two quantitative bounds based on minorization and drift conditions, and provide descriptive and intuitive examples to showcase how these theorems can be implemented in practice. This paper is meant to provide a general overview of the subject and spark interest in new Markov chain research areas.
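As a toy illustration of the eigenvalue analysis mentioned in this review (not code from the paper), the Python snippet below builds a small reversible chain, computes its stationary distribution and second-largest eigenvalue modulus, and compares the exact total-variation distance to stationarity with the geometric decay rate that eigenvalue predicts; the 3-state transition matrix is an arbitrary example.

import numpy as np

# An arbitrary small reversible (birth-death) transition matrix.
P = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])

# Stationary distribution: normalized left eigenvector of P for eigenvalue 1.
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi = pi / pi.sum()

slem = sorted(abs(np.linalg.eigvals(P)))[-2]  # second-largest eigenvalue modulus
mu = np.array([1.0, 0.0, 0.0])                # start concentrated in state 0
for n in (1, 5, 10, 20):
    tv = 0.5 * np.abs(mu @ np.linalg.matrix_power(P, n) - pi).sum()
    print(n, tv, slem ** n)                   # exact TV distance vs. slem^n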
We prove that moderate deviations for empirical measures of countable nonhomogeneous Markov chains hold under the assumption of uniform convergence of the transition probability matrices in the Cesàro sense.
The approximation of integral functionals with respect to a stationary Markov process by a Riemann-sum estimator is studied. Stationarity and the functional calculus of the infinitesimal generator of the process are used to get a better understanding of the estimation error and to prove a general error bound. The presented approach admits general integrands and gives a unifying explanation for different rates obtained in the literature. Several examples demonstrate how the general bound can be related to well-known function spaces.
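For concreteness only, and under assumptions that are not those of the paper, the sketch below approximates the integral functional int_0^T f(X_t) dt of a stationary Ornstein-Uhlenbeck process by the Riemann-sum estimator delta * sum_k f(X_{k*delta}), with f(x) = x^2 and an Euler-Maruyama discretization standing in for observations of the process.

import numpy as np

rng = np.random.default_rng(0)
T, delta = 100.0, 0.01                 # time horizon and Riemann-sum mesh
n = int(T / delta)

# Stationary Ornstein-Uhlenbeck path dX = -X dt + dW (Euler-Maruyama),
# started from its stationary law N(0, 1/2).
x = np.empty(n)
x[0] = rng.normal(0.0, np.sqrt(0.5))
for k in range(1, n):
    x[k] = x[k - 1] - x[k - 1] * delta + np.sqrt(delta) * rng.normal()

f = lambda y: y ** 2                   # integrand
riemann_estimate = delta * f(x).sum()  # approximates int_0^T f(X_t) dt
print(riemann_estimate / T)            # time average, close to E[f(X_0)] = 1/2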
Our purpose is to prove a central limit theorem for countable nonhomogeneous Markov chains under the condition of uniform convergence of the transition probability matrices in the Cesàro sense. Furthermore, we obtain a corresponding moderate deviation theorem for countable nonhomogeneous Markov chains via the Gärtner-Ellis theorem and the exponential equivalence method.