We present a general law of the iterated logarithm for stochastic processes on the open unit interval having subexponential tails in a locally uniform fashion. It applies not only to the standard Brownian bridge but also to suitably standardized empirical distribution functions. This leads to new goodness-of-fit tests and confidence bands which refine the procedures of Berk and Jones (1979) and Owen (1995). Roughly speaking, the high power and accuracy of the latter procedures in the tail regions of distributions are essentially preserved while gaining considerably in the central region.
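For readers who want the baseline object these procedures refine: the classical statistic of Berk and Jones (1979) is the supremum over $t$ of the Bernoulli Kullback-Leibler divergence between the empirical distribution function $F_n(t)$ and the hypothesized $F(t)$. The Python sketch below computes this classical statistic only, not the refined version of the abstract; the function names and the use of a fully specified null CDF are illustrative assumptions.

import numpy as np

def bernoulli_kl(p, q):
    # K(p, q) = p*log(p/q) + (1-p)*log((1-p)/(1-q)), with 0*log(0) taken as 0.
    with np.errstate(divide="ignore", invalid="ignore"):
        t1 = np.where(p > 0, p * np.log(p / q), 0.0)
        t2 = np.where(p < 1, (1.0 - p) * np.log((1.0 - p) / (1.0 - q)), 0.0)
    return t1 + t2

def berk_jones_statistic(x, null_cdf):
    # R_n = sup_t K(F_n(t), F(t)); the supremum is attained at the order statistics,
    # evaluating F_n both at its jump value i/n and at its left limit (i-1)/n.
    u = np.sort(null_cdf(np.asarray(x, dtype=float)))   # assumes values strictly in (0, 1)
    n = u.size
    i = np.arange(1, n + 1)
    return max(bernoulli_kl(i / n, u).max(), bernoulli_kl((i - 1) / n, u).max())

# Example use (simulated data, standard normal null):
# from scipy.stats import norm
# x = np.random.default_rng(0).normal(size=200)
# r_n = berk_jones_statistic(x, norm.cdf)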
In this note we prove the following law of the iterated logarithm for the Grenander estimator of a monotone decreasing density: If $f(t_0) > 0$, $f'(t_0) < 0$, and $f'$ is continuous in a neighborhood of $t_0$, then \begin{eqnarray*} \limsup_{n\rightarrow
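As background on the object of study (not on the proof): the Grenander estimator is the left derivative of the least concave majorant of the empirical distribution function, and it can be computed with a simple stack-based majorant construction. The Python sketch below assumes positive, tie-free data on $[0,\infty)$; the function name and return format are illustrative choices.

import numpy as np

def grenander_density(x):
    # Grenander estimator of a nonincreasing density on [0, infinity):
    # the left derivative of the least concave majorant (LCM) of the empirical CDF.
    # Assumes positive data without ties.
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    pts = np.column_stack([np.concatenate([[0.0], x]), np.arange(n + 1) / n])
    hull = []                              # vertices of the LCM, built left to right
    for p in pts:
        while len(hull) >= 2:
            o, a = hull[-2], hull[-1]
            cross = (a[0] - o[0]) * (p[1] - o[1]) - (a[1] - o[1]) * (p[0] - o[0])
            if cross >= 0:                 # middle vertex lies on/below the chord: drop it
                hull.pop()
            else:
                break
        hull.append(p)
    hull = np.array(hull)
    knots = hull[:, 0]
    slopes = np.diff(hull[:, 1]) / np.diff(knots)   # estimator's value on each LCM segment
    return knots, slopes                   # f_hat(t) = slopes[k] for t in (knots[k], knots[k+1]]

# Example use: knots, dens = grenander_density(np.random.default_rng(1).exponential(size=500))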
Two-sample tests are among the most classical topics in statistics and remain widely used, even in cutting-edge applications. There are at least two modes of inference used to justify two-sample tests. One is the usual superpopulation inference
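As a generic illustration of one common way such tests are carried out and justified, and not of this paper's specific proposal, here is a Python sketch of a permutation test for a difference in means; the test statistic, permutation count, and seed are arbitrary choices.

import numpy as np

def permutation_two_sample_pvalue(x, y, n_perm=10_000, seed=0):
    # Two-sided permutation p-value for the difference in sample means.
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    pooled = np.concatenate([x, y])
    n_x = x.size
    observed = abs(x.mean() - y.mean())
    hits = 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)                 # re-randomize the group labels
        hits += abs(perm[:n_x].mean() - perm[n_x:].mean()) >= observed
    return (hits + 1) / (n_perm + 1)                   # add-one correction keeps the p-value valid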
We study the law of the iterated logarithm (LIL) for maximum likelihood estimation of the parameters (formulated as a convex optimization problem) in generalized linear models with independent or weakly dependent ($\rho$-mixing, $m$-dependent) responses
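To make the convex-optimization formulation concrete, here is a hedged Python sketch of the GLM maximum likelihood estimator in the special case of canonical-link logistic regression with independent responses, computed by Newton's method; it does not reflect the $\rho$-mixing or $m$-dependent settings of the paper.

import numpy as np

def logistic_mle(X, y, n_iter=25, tol=1e-10):
    # MLE for logistic regression (a canonical-link GLM) by Newton's method.
    # The negative log-likelihood is convex in beta; assumes the design is well
    # conditioned and the classes are not perfectly separated, so the Hessian is invertible.
    n, d = X.shape
    beta = np.zeros(d)
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))    # fitted success probabilities
        grad = X.T @ (y - p)                   # score vector
        w = p * (1.0 - p)                      # Fisher information weights
        hess = X.T @ (X * w[:, None])          # negative Hessian of the log-likelihood
        step = np.linalg.solve(hess, grad)
        beta = beta + step
        if np.max(np.abs(step)) < tol:
            break
    return beta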
In the nonparametric Gaussian sequence space model, an $\ell^2$-confidence ball $C_n$ is constructed that adapts to the unknown smoothness and Sobolev norm of the infinite-dimensional parameter to be estimated. The confidence ball has exact and honest asym
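For reference, a standard formulation of the sequence space model and of an $\ell^2$-confidence ball with honest coverage (the centering $\hat\theta_n$ and radius $\hat r_n(\alpha)$ below are placeholders, not the paper's construction):
\[
Y_i = \theta_i + \tfrac{1}{\sqrt{n}}\, g_i, \qquad g_i \overset{\text{iid}}{\sim} N(0,1), \quad i = 1, 2, \ldots,
\]
\[
C_n = \bigl\{ \theta \in \ell^2 : \|\theta - \hat\theta_n\|_{\ell^2} \le \hat r_n(\alpha) \bigr\},
\qquad
\liminf_{n\to\infty}\, \inf_{\theta \in \Theta} P_\theta\bigl(\theta \in C_n\bigr) \ge 1 - \alpha,
\]
where "honest" means the coverage bound holds uniformly over the parameter sets $\Theta$ under consideration.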
We study high-dimensional linear models with error-in-variables. Such models are motivated by various applications in econometrics, finance and genetics. These models are challenging because of the need to account for measurement errors to avoid non-