In this paper, we introduce the fundamental notion of a Markov basis, which provides one of the first connections between commutative algebra and statistics. The notion of a Markov basis was first introduced by Diaconis and Sturmfels (1998) for conditional testing problems on contingency tables via Markov chain Monte Carlo methods. In this method, a connected Markov chain over the given conditional sample space is used to estimate P-values numerically for various conditional tests. A Markov basis plays an important role in this argument, because it guarantees the connectivity of the chain, which is needed for unbiasedness of the estimate, for an arbitrary conditional sample space. Equally importantly, a Markov basis is characterized as a generating set of a well-specified toric ideal of a polynomial ring. This connection between commutative algebra and statistics is the main result of Diaconis and Sturmfels (1998). Since that first paper, Markov bases have been studied intensively by many researchers in both commutative algebra and statistics, giving rise to an attractive field called computational algebraic statistics. In this paper, we give a review of the Markov chain Monte Carlo methods for contingency tables and Markov bases, with some fundamental examples. We also give computational examples using the algebraic software Macaulay2 and the statistical software R. Readers can also find theoretical details of the problems considered in this paper, as well as various results on the structure and examples of Markov bases, in Aoki, Hara and Takemura (2012).
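The paper's computations use Macaulay2 and R; as a minimal illustration in Python (function name and example table are our own, not from the paper), the following sketch shows the role of a Markov basis in the simplest case, 2x2 tables with fixed row and column sums, where the basis consists of the single move (+1,-1;-1,+1). Repeatedly applying this move, subject to nonnegativity, reaches every table in the fiber (the conditional sample space) — exactly the connectivity property the Monte Carlo method relies on.

```python
def fiber(table):
    """Enumerate every 2x2 contingency table with the same row and column
    sums as `table`, by repeatedly applying the single Markov basis move
    (+1, -1; -1, +1). The connectivity of this graph under nonnegativity
    constraints is precisely what a Markov basis guarantees."""
    start = tuple(x for row in table for x in row)  # flatten (a, b, c, d)
    seen, stack = {start}, [start]
    while stack:
        a, b, c, d = stack.pop()
        for s in (1, -1):
            cand = (a + s, b - s, c - s, d + s)  # margins are preserved
            if min(cand) >= 0 and cand not in seen:
                seen.add(cand)
                stack.append(cand)
    return sorted(seen)

# The fiber of [[3, 2], [1, 4]] (margins 5/5 and 4/6) has 5 tables,
# indexed by the top-left cell running over 0..4.
F = fiber([[3, 2], [1, 4]])
```

A Metropolis walk over this fiber under the hypergeometric distribution then yields conditional P-values; for larger tables the moves must come from a genuine Markov basis computed algebraically.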
We review a finite-sampling exponential bound due to Serfling and discuss related exponential bounds for the hypergeometric distribution. We then discuss how such bounds motivate some new results for two-sample empirical processes. Our development complements recent results by Wei and Dudley (2011) concerning exponential bounds for two-sided Kolmogorov-Smirnov statistics by giving corresponding results for one-sided statistics, with emphasis on adjusted inequalities of the type proved originally by Dvoretzky, Kiefer, and Wolfowitz (1956) and by Massart (1990) in the one-sample case.
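As an illustrative sketch (not from the paper; function names and parameter choices are our own), the one-sided statistic D_n^+ and the one-sided Dvoretzky-Kiefer-Wolfowitz bound with Massart's constant, P(D_n^+ > eps) <= exp(-2 n eps^2), can be compared numerically:

```python
import math
import random

def one_sided_ks(sample):
    """D_n^+ = sup_x (F_n(x) - F(x)) for a sample assumed ~ Uniform(0,1).
    The supremum is attained at the order statistics: max_i (i/n - u_(i))."""
    u = sorted(sample)
    n = len(u)
    return max((i + 1) / n - u[i] for i in range(n))

def massart_bound(n, eps):
    """One-sided DKW bound with Massart's constant: exp(-2 n eps^2)."""
    return math.exp(-2 * n * eps * eps)

# Monte Carlo comparison of the empirical tail frequency with the bound
rng = random.Random(1)
n, eps, reps = 50, 0.2, 2000
freq = sum(one_sided_ks([rng.random() for _ in range(n)]) > eps
           for _ in range(reps)) / reps
bound = massart_bound(n, eps)
```

By the uniform probability integral transform, simulating from Uniform(0,1) loses no generality for a continuous underlying distribution.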
We present an introduction to the theory of algebraic geometry codes. Starting from evaluation codes and codes from order and weight functions, special attention is given to one-point codes and, in particular, to the family of Castle codes.
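As a minimal, hedged illustration of the starting point (our own sketch, not code from the text): the prototypical evaluation code is the Reed-Solomon code, obtained by evaluating polynomials of degree < k at distinct points of a finite field — the genus-zero, one-point case of the algebraic geometry construction.

```python
def evaluate_code(msg, points, p):
    """Encode a message, viewed as the coefficient vector of a polynomial
    of degree < k, by evaluating that polynomial at distinct points of
    GF(p). This is the simplest evaluation code: a Reed-Solomon code,
    i.e. the one-point AG code on the projective line."""
    return [sum(c * pow(a, i, p) for i, c in enumerate(msg)) % p
            for a in points]

# Encoding 1 + x^2 over GF(7) at the points 1..6 gives a length-6 codeword;
# with k = 3 the code is MDS, so its minimum distance is n - k + 1 = 4.
codeword = evaluate_code([1, 0, 1], [1, 2, 3, 4, 5, 6], 7)
```

One-point codes generalize this picture by evaluating functions with poles only at a single rational point of a curve, with order and weight functions controlling the resulting parameters.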
In 1975 John Tukey proposed a multivariate median, which is the deepest point in a given data cloud in R^d. Later, in measuring the depth of an arbitrary point z with respect to the data, David Donoho and Miriam Gasko considered hyperplanes through z and determined its depth by the smallest portion of data separated by such a hyperplane. Since then, these ideas have proved extremely fruitful. A rich statistical methodology has developed that is based on data depth and, more generally, on nonparametric depth statistics. General notions of data depth have been introduced, as well as many special ones. These notions vary in their computability and robustness and in their sensitivity to asymmetric shapes of the data; their different properties make them suited to particular applications. The upper level sets of a depth statistic provide a family of set-valued statistics, named depth-trimmed or central regions. They describe the distribution regarding its location, scale and shape. The most central region serves as a median. The notion of depth has been extended from data clouds, that is, empirical distributions, to general probability distributions on R^d, thus allowing for laws of large numbers and consistency results. It has also been extended from d-variate data to data in functional spaces.
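To make the hyperplane idea concrete, here is a hedged Python sketch of halfspace (Tukey) depth in the plane: the depth of z is the smallest number of data points in a closed halfplane whose boundary passes through z. Since that count, as a function of the halfplane's direction, changes only when the boundary sweeps past a data point, checking finitely many critical directions gives the exact value. The implementation details (function name, tolerance) are our own.

```python
import math

def tukey_depth(z, data):
    """Exact halfspace (Tukey) depth of z in a 2-D data cloud: the minimum
    number of data points lying in a closed halfplane whose boundary
    passes through z."""
    angles, at_z = [], 0
    for x, y in data:
        dx, dy = x - z[0], y - z[1]
        if dx == 0 and dy == 0:
            at_z += 1            # a point at z lies in every such halfplane
        else:
            angles.append(math.atan2(dy, dx))
    if not angles:
        return at_z
    # D(theta) = #{j : cos(angle_j - theta) >= 0} is piecewise constant in
    # theta and changes only at theta = angle_j +/- pi/2, so evaluating it
    # at these critical directions and at midpoints between them is exact.
    crit = sorted({(a + s * math.pi / 2) % (2 * math.pi)
                   for a in angles for s in (1, -1)})
    mids = [(crit[i] + crit[i + 1]) / 2 for i in range(len(crit) - 1)]
    mids.append((crit[-1] + crit[0] + 2 * math.pi) / 2)   # wrap-around gap
    count = lambda t: sum(math.cos(a - t) >= -1e-9 for a in angles)
    return at_z + min(count(t) for t in crit + mids)
```

The deepest point of the data cloud, in Tukey's sense, is then any maximizer of this depth over candidate locations.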
The project of Greenlees et al. on understanding rational G-spectra in terms of algebraic categories has had many successes, classifying rational G-spectra for finite groups, SO(2), O(2), SO(3), free and cofree G-spectra, as well as rational toral G-spectra for arbitrary compact Lie groups. This paper provides an introduction to the subject in two parts. The first discusses rational G-Mackey functors, the action of the Burnside ring and change of group functors. It gives a complete proof of the well-known classification of rational Mackey functors for finite G. The second part discusses the methods and tools from equivariant stable homotopy theory needed to obtain algebraic models for rational G-spectra. It gives a summary of the key steps in the classification of rational G-spectra in terms of a symmetric monoidal algebraic category. Having these two parts in the same place allows one to clearly see the analogy between the algebraic and topological classifications.
In statistical inference for long range dependent time series, the shape of the limit distribution typically depends on unknown parameters. Therefore, we propose to use subsampling. We show the validity of subsampling for general statistics and long range dependent subordinated Gaussian processes which satisfy mild regularity conditions. We apply our method to a self-normalized change-point test statistic so that we can test for structural breaks in long range dependent time series without having to estimate any nuisance parameter. The finite sample properties are investigated in a simulation study. We analyze three data sets and compare our results to the conclusions of other authors.
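A hedged sketch of the general subsampling idea (in the style of Politis and Romano, not the paper's exact procedure; names and the toy statistic are our own): evaluate the statistic on all overlapping blocks of length b and use the empirical distribution of these subsample values as the reference distribution. Combined with a self-normalized CUSUM-type statistic, no nuisance parameter such as a long-run variance has to be estimated.

```python
import math

def sn_cusum(x):
    """A simple self-normalized CUSUM-type statistic (illustrative only):
    max_k |S_k - (k/n) S_n|, normalized by the root of the total sum of
    squared deviations, so no scale parameter needs to be estimated."""
    n = len(x)
    total = sum(x)
    denom = math.sqrt(sum((v - total / n) ** 2 for v in x)) or 1.0
    s, best = 0.0, 0.0
    for k, v in enumerate(x, start=1):
        s += v
        best = max(best, abs(s - k * total / n))
    return best / denom

def subsample_values(x, b, stat):
    """Evaluate `stat` on every overlapping block of length b; the
    empirical distribution of these values approximates the sampling
    distribution of the statistic under the null."""
    return [stat(x[i:i + b]) for i in range(len(x) - b + 1)]
```

A test would then reject a constant-mean null when the full-sample statistic exceeds an upper empirical quantile of the subsample values; the block length b must grow with n but with b/n tending to zero.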