
Markovian Statistics on Evolving Systems

Posted by: Ulrich Faigle
Publication date: 2017
Research field: Mathematical Statistics
Paper language: English





A novel framework for the analysis of observation statistics on time-discrete linear evolutions in Banach space is presented. The model differs from traditional models for stochastic processes and, in particular, clearly distinguishes between the deterministic evolution of a system and the stochastic nature of observations on the evolving system. General Markov chains are defined in this context, and it is shown how typical traditional models of classical or quantum random walks and Markov processes fit into the framework, and how a theory of quantum statistics (sensu Barndorff-Nielsen, Gill and Jupp) may be developed from it. The framework permits a general theory of joint observability of two or more observation variables, which may be viewed as an extension of the Heisenberg uncertainty principle and, in particular, offers a novel mathematical perspective on the violation of Bell's inequalities in quantum models. Main results include a general sampling theorem relative to Riesz evolution operators in the spirit of von Neumann's mean ergodic theorem for normal operators in Hilbert space.
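The separation the abstract describes, a deterministic linear evolution of the system state versus stochastic observations of that state, can be illustrated in the simplest finite-dimensional setting. The following sketch is hypothetical (a two-state classical Markov chain viewed as a linear evolution of probability vectors) and is not the paper's construction; the matrix `T` and the sampling scheme are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch: the system state x_t evolves deterministically under a
# linear operator T (here a column-stochastic matrix, so x_t remains a
# probability vector), while each observation is a random draw from the
# distribution that x_t encodes. T and the initial state are assumptions.
rng = np.random.default_rng(0)

T = np.array([[0.9, 0.2],
              [0.1, 0.8]])          # column-stochastic evolution operator
x = np.array([1.0, 0.0])            # initial state (deterministic datum)

samples = []
for t in range(1000):
    x = T @ x                       # deterministic linear evolution
    samples.append(rng.choice(2, p=x))  # stochastic observation of the state

# In the mean-ergodic spirit, time averages of the observations settle near
# the expectation under the stationary vector of T (here (2/3, 1/3)).
print(np.mean(samples))
```

The point of the sketch is only the division of labour: nothing random happens to `x` itself; randomness enters solely through the observation step.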




Read also

In [10], a 'Markovian stick-breaking process' generalizing the Dirichlet process $(\mu, \theta)$ with respect to a discrete base space ${\mathfrak X}$ was introduced. In particular, a sample from the 'Markovian stick-breaking process' may be represented in stick-breaking form $\sum_{i\geq 1} P_i \delta_{T_i}$, where $\{T_i\}$ is a stationary, irreducible Markov chain on ${\mathfrak X}$ with stationary distribution $\mu$, instead of i.i.d. $\{T_i\}$ each distributed as $\mu$ as in the Dirichlet case, and $\{P_i\}$ is a GEM$(\theta)$ residual allocation sequence. Although the motivation in [10] was to relate these Markovian stick-breaking processes to empirical distributional limits of types of simulated annealing chains, these processes may also be thought of as a class of priors in statistical problems. The aim of this work in this context is to identify the posterior distribution and to explore the role of the Markovian structure of $\{T_i\}$ in some inference test cases.
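The stick-breaking representation above is concrete enough to sample directly. The following sketch draws one (truncated) realization of $\sum_{i\geq 1} P_i \delta_{T_i}$ on a two-point base space; the transition kernel `K`, the truncation level, and $\theta$ are illustrative assumptions, not values from [10].

```python
import numpy as np

# Hypothetical sketch of one draw from a Markovian stick-breaking process:
# GEM(theta) weights P_i via residual allocation, with atom locations T_i
# from a stationary Markov chain on {0, 1} instead of i.i.d. draws from mu.
rng = np.random.default_rng(1)

theta = 2.0
n_atoms = 200                        # truncation level for this sketch
K = np.array([[0.7, 0.3],            # row-stochastic transition kernel
              [0.3, 0.7]])
mu = np.array([0.5, 0.5])            # its stationary distribution

# GEM(theta): P_i = V_i * prod_{j<i} (1 - V_j), with V_j ~ Beta(1, theta)
V = rng.beta(1.0, theta, size=n_atoms)
P = V * np.concatenate(([1.0], np.cumprod(1.0 - V)[:-1]))

# Stationary Markov chain of atom locations, started from mu
T = np.empty(n_atoms, dtype=int)
T[0] = rng.choice(2, p=mu)
for i in range(1, n_atoms):
    T[i] = rng.choice(2, p=K[T[i - 1]])

# The random measure sum_i P_i delta_{T_i}, as a vector on {0, 1}
measure = np.bincount(T, weights=P, minlength=2)
measure /= measure.sum()             # renormalise the truncated tail
print(measure)
```

Setting `K` to have identical rows equal to `mu` recovers i.i.d. atoms, i.e. a (truncated) draw from the ordinary Dirichlet process.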
Karl Mosler, 2012
In 1975 John Tukey proposed a multivariate median which is the deepest point in a given data cloud in R^d. Later, in measuring the depth of an arbitrary point z with respect to the data, David Donoho and Miriam Gasko considered hyperplanes through z and determined its depth by the smallest portion of data that are separated by such a hyperplane. Since then, these ideas have proved extremely fruitful. A rich statistical methodology has developed that is based on data depth and, more generally, nonparametric depth statistics. General notions of data depth have been introduced as well as many special ones. These notions vary regarding their computability and robustness and their sensitivity to reflect asymmetric shapes of the data. According to their different properties they fit particular applications. The upper level sets of a depth statistic provide a family of set-valued statistics, named depth-trimmed or central regions. They describe the distribution regarding its location, scale and shape. The most central region serves as a median. The notion of depth has been extended from data clouds, that is empirical distributions, to general probability distributions on R^d, thus allowing for laws of large numbers and consistency results. It has also been extended from d-variate data to data in functional spaces.
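The Donoho–Gasko definition above can be sketched directly: the halfspace (Tukey) depth of a point z is the smallest fraction of data points lying on one side of a hyperplane through z. The following approximation scans a finite grid of directions in R^2; it is an illustration of the definition, not a production depth algorithm.

```python
import numpy as np

# Approximate halfspace (Tukey) depth in R^2: for each direction, project
# the data onto it relative to z and count points on each closed side; the
# depth is the minimum such count over all directions, as a fraction of n.
def halfspace_depth(z, data, n_dirs=360):
    angles = np.linspace(0.0, np.pi, n_dirs, endpoint=False)
    dirs = np.column_stack((np.cos(angles), np.sin(angles)))
    proj = (data - z) @ dirs.T        # signed positions along each direction
    side = np.minimum((proj >= 0).sum(axis=0), (proj <= 0).sum(axis=0))
    return side.min() / len(data)

rng = np.random.default_rng(2)
data = rng.normal(size=(500, 2))
print(halfspace_depth(np.zeros(2), data))          # near the centre: close to 0.5
print(halfspace_depth(np.array([5.0, 5.0]), data)) # outlying point: close to 0
```

The deepest point of the cloud (the Tukey median) is the z maximising this quantity; the upper level sets of the function are exactly the central regions mentioned above.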
The last decades have seen an unprecedented increase in the availability of data sets that are inherently global and temporally evolving, from remotely sensed networks to climate model ensembles. This paper provides a view of statistical modeling techniques for space-time processes, where space is the sphere representing our planet. In particular, we make a distinction between (a) second-order-based, and (b) practical approaches to model temporally evolving global processes. The former are based on the specification of a class of space-time covariance functions, with space being the two-dimensional sphere. The latter are based on explicit description of the dynamics of the space-time process, i.e., by specifying its evolution as a function of its past history with added spatially dependent noise. We especially focus on approach (a), where the literature has been sparse. We provide new models of space-time covariance functions for random fields defined on spheres cross time. Practical approaches, (b), are also discussed, with special emphasis on models built directly on the sphere, without projecting the spherical coordinates on the plane. We present a case study focused on the analysis of air pollution from the 2015 wildfires in Equatorial Asia, an event which was classified as the year's worst environmental disaster. The paper finishes with a list of the main theoretical and applied research problems in the area, where we expect the statistical community to engage over the next decade.
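A minimal example of approach (a) is a separable space-time covariance built directly on the sphere, using great-circle rather than projected distance. The sketch below is not one of the paper's new models; it uses the exponential covariance (valid with geodesic distance on spheres) times an exponential temporal factor, with hypothetical range parameters `a` and `b`.

```python
import numpy as np

def great_circle(lat1, lon1, lat2, lon2):
    """Great-circle (geodesic) distance on the unit sphere, in radians."""
    return np.arccos(np.clip(
        np.sin(lat1) * np.sin(lat2)
        + np.cos(lat1) * np.cos(lat2) * np.cos(lon1 - lon2), -1.0, 1.0))

def cov(site1, t1, site2, t2, a=1.0, b=1.0):
    """Separable space-time covariance: exp(-d_gc/a) * exp(-|t - t'|/b)."""
    d = great_circle(site1[0], site1[1], site2[0], site2[1])
    return np.exp(-d / a) * np.exp(-abs(t1 - t2) / b)

# Build a small covariance matrix over random sites and times, and verify it
# is positive semi-definite (all eigenvalues >= 0 up to rounding error).
rng = np.random.default_rng(3)
n = 20
lats = rng.uniform(-np.pi / 2, np.pi / 2, n)
lons = rng.uniform(-np.pi, np.pi, n)
times = rng.uniform(0.0, 10.0, n)
C = np.array([[cov((lats[i], lons[i]), times[i],
                   (lats[j], lons[j]), times[j])
               for j in range(n)] for i in range(n)])
print(np.linalg.eigvalsh(C).min())
```

Separable products like this are the baseline that richer nonseparable constructions on spheres cross time are designed to improve upon.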
We study the existence, strong consistency and asymptotic normality of estimators obtained from estimating functions, that are p-dimensional martingale transforms. The problem is motivated by the analysis of evolutionary clustered data, with distributions belonging to the exponential family, and which may also vary in terms of other component series. Within a quasi-likelihood approach, we construct estimating equations, which accommodate different forms of dependency among the components of the response vector and establish multivariate extensions of results on linear and generalized linear models, with stochastic covariates. Furthermore, we characterize estimating functions which are asymptotically optimal, in that they lead to confidence regions for the regression parameters which are of minimum size, asymptotically. Results from a simulation study and an application to a real dataset are included.
Satoshi Aoki, 2016
In this paper, we introduce the fundamental notion of a Markov basis, which is one of the first connections between commutative algebra and statistics. The notion of a Markov basis was first introduced by Diaconis and Sturmfels (1998) for conditional testing problems on contingency tables by Markov chain Monte Carlo methods. In this method, we make use of a connected Markov chain over the given conditional sample space to estimate the P-values numerically for various conditional tests. A Markov basis plays an important role in this argument, because it guarantees the connectivity of the chain, which is needed for unbiasedness of the estimate, for arbitrary conditional sample spaces. As another important point, a Markov basis is characterized as a generating set of a well-specified toric ideal of a polynomial ring. This connection between commutative algebra and statistics is the main result of Diaconis and Sturmfels (1998). After this first paper, Markov bases have been studied intensively by many researchers both in commutative algebra and statistics, which has yielded an attractive field called computational algebraic statistics. In this paper, we give a review of the Markov chain Monte Carlo methods for contingency tables and Markov bases, with some fundamental examples. We also give some computational examples using the algebraic software Macaulay2 and the statistical software R. Readers can also find theoretical details of the problems considered in this paper and various results on the structure and examples of Markov bases in Aoki, Hara and Takemura (2012).
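For two-way tables with fixed row and column sums, the Markov basis consists of the familiar +1/-1 "swap" moves, and the Diaconis–Sturmfels walk can be sketched in a few lines. The table below and the number of steps are illustrative assumptions; the moves shown are the standard basic moves for this fibre.

```python
import numpy as np

# Sketch of the Diaconis-Sturmfels walk on the fibre of an I x J contingency
# table with fixed margins, using the basic Markov basis moves
#   +1 -1
#   -1 +1
# applied at a random pair of rows and a random pair of columns.
rng = np.random.default_rng(4)

table = np.array([[4, 2],
                  [1, 3]])          # observed table; margins stay fixed

def random_move(tbl, rng):
    I, J = tbl.shape
    i1, i2 = rng.choice(I, 2, replace=False)
    j1, j2 = rng.choice(J, 2, replace=False)
    move = np.zeros_like(tbl)
    move[i1, j1] = move[i2, j2] = 1
    move[i1, j2] = move[i2, j1] = -1
    new = tbl + rng.choice([-1, 1]) * move
    # reject moves that leave the fibre (negative cell counts); staying put
    # keeps the chain's stationary distribution uniform on the fibre
    return new if (new >= 0).all() else tbl

t = table.copy()
visited = set()
for _ in range(2000):
    t = random_move(t, rng)
    visited.add(tuple(t.ravel()))

print(len(visited))   # distinct tables with the same margins reached
```

Because the basic moves form a Markov basis for two-way tables, the walk is connected: every table in the fibre is eventually visited, which is exactly the property needed for the conditional P-value estimates described above.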