
A One-Sample Test for Normality with Kernel Methods

 Added by Jeremie Kellner
 Publication date 2015
Language: English





We propose a new one-sample test for normality in a Reproducing Kernel Hilbert Space (RKHS). Namely, we test the null hypothesis that the distribution belongs to a given family of Gaussian distributions. Hence our procedure may be applied either to test data for normality or to test parameters (mean and covariance) if the data are assumed Gaussian. Our test is based on the same principle as the Maximum Mean Discrepancy (MMD), which is usually used for two-sample tests such as homogeneity or independence testing. Our method makes use of a special kind of parametric bootstrap (typical of goodness-of-fit tests) which is computationally more efficient than the standard parametric bootstrap. Moreover, an upper bound for the Type-II error highlights the dependence on influential quantities. Experiments illustrate the practical improvement allowed by our test in high-dimensional settings, where common normality tests are known to fail. We also consider an application to covariance rank selection through a sequential procedure.
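The general idea can be sketched in code: compare the sample's kernel mean embedding with that of a Gaussian reference sample fitted by maximum likelihood, and calibrate the statistic with a parametric bootstrap. This is an illustrative sketch only; it uses a plain parametric bootstrap rather than the authors' more efficient variant, and the function names (`mmd2`, `normality_mmd_test`) are hypothetical.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between rows of X and Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def mmd2(X, Y, sigma=1.0):
    # Biased (V-statistic) estimate of the squared MMD; always >= 0.
    Kxx = gaussian_kernel(X, X, sigma)
    Kyy = gaussian_kernel(Y, Y, sigma)
    Kxy = gaussian_kernel(X, Y, sigma)
    return Kxx.mean() + Kyy.mean() - 2 * Kxy.mean()

def normality_mmd_test(X, n_boot=200, seed=0):
    # One-sample normality test: compare X against a Gaussian with
    # ML-fitted mean and covariance, with a plain parametric bootstrap
    # (NOT the paper's faster bootstrap scheme) to get a p-value.
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    mu, cov = X.mean(0), np.cov(X.T)
    stat = mmd2(X, rng.multivariate_normal(mu, cov, n))
    null = []
    for _ in range(n_boot):
        Z = rng.multivariate_normal(mu, cov, n)       # bootstrap sample
        W = rng.multivariate_normal(Z.mean(0), np.cov(Z.T), n)
        null.append(mmd2(Z, W))
    return stat, np.mean(np.array(null) >= stat)      # statistic, p-value
```

Under the null (Gaussian data), the p-value should be roughly uniform; small p-values indicate departure from the fitted Gaussian family.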



Related research

265 - Jeremie Kellner 2014
A new goodness-of-fit test for normality in high dimension (and in a Reproducing Kernel Hilbert Space) is proposed. It shares common ideas with the Maximum Mean Discrepancy (MMD), which it outperforms both in computation time and in applicability to a wider range of data. Theoretical results are derived for the Type-I and Type-II errors: they guarantee control of the Type-I error at a prescribed level and an exponentially fast decrease of the Type-II error. Synthetic and real data also illustrate the practical improvement allowed by our test compared with other leading approaches in high-dimensional settings.
In this paper, a novel Bayesian nonparametric test for assessing multivariate normal models is presented. While there are extensive frequentist and graphical methods for testing multivariate normality, Bayesian counterparts are hard to find. The proposed approach is based on the Dirichlet process and the Mahalanobis distance. More precisely, the Mahalanobis distance is employed to transform the $m$-variate problem into a univariate one. The Dirichlet process is then used as a prior on the distribution of the Mahalanobis distance. The concentration of the distribution of the distance between the posterior process and the chi-square distribution with $m$ degrees of freedom is compared, via a relative belief ratio, to the corresponding concentration for the prior process. The distance between the Dirichlet process and the chi-square distribution is measured by the Anderson-Darling distance. Key theoretical results of the approach are derived, and the procedure is illustrated through several examples, in which the proposed approach shows excellent performance.
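The dimension-reduction step at the heart of this approach is easy to illustrate: under multivariate normality, the squared Mahalanobis distances are approximately chi-square with $m$ degrees of freedom. The sketch below checks that transformation with a simple Kolmogorov-Smirnov comparison, which merely stands in for the paper's Bayesian relative-belief procedure; the function names are hypothetical.

```python
import numpy as np
from scipy import stats

def mahalanobis_sq(X):
    # Squared Mahalanobis distances of each row of X from the sample
    # mean, using the inverse sample covariance.
    mu = X.mean(0)
    Sinv = np.linalg.inv(np.cov(X.T))
    D = X - mu
    return np.einsum('ij,jk,ik->i', D, Sinv, D)

def chi2_check(X):
    # Illustrative frequentist stand-in: compare the univariate
    # distances with chi-square(m) via a Kolmogorov-Smirnov test
    # (the paper instead places a Dirichlet process prior on this
    # distribution and uses an Anderson-Darling-based distance).
    m = X.shape[1]
    return stats.kstest(mahalanobis_sq(X), 'chi2', args=(m,))
```

For genuinely Gaussian data the distances should track the chi-square($m$) law closely; heavy tails or skewness show up as a small p-value.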
79 - A.J. van Es , H.-W. Uh 2001
We derive asymptotic normality of kernel type deconvolution estimators of the density, the distribution function at a fixed point, and of the probability of an interval. We consider the so called super smooth case where the characteristic function of the known distribution decreases exponentially. It turns out that the limit behavior of the pointwise estimators of the density and distribution function is relatively straightforward while the asymptotics of the estimator of the probability of an interval depends in a complicated way on the sequence of bandwidths.
362 - Jiexiang Li 2014
The paper discusses the estimation of a continuous density function of a target random field $X_{\mathbf{i}}$, $\mathbf{i} \in \mathbb{Z}^N$, which is contaminated by measurement errors. In particular, the observed random field $Y_{\mathbf{i}}$, $\mathbf{i} \in \mathbb{Z}^N$, is such that $Y_{\mathbf{i}} = X_{\mathbf{i}} + \epsilon_{\mathbf{i}}$, where the random error $\epsilon_{\mathbf{i}}$ has a known distribution and is independent of the target random field. Compared to existing results, the paper improves in two directions. First, random vectors, in contrast to univariate random variables, are investigated. Second, a random field with certain spatial interactions, instead of i.i.d. random variables, is studied. Asymptotic normality of the proposed estimator is established under appropriate conditions.
97 - A.J. van Es , H.-W. Uh 2002
We derive asymptotic normality of kernel type deconvolution density estimators. In particular we consider deconvolution problems where the known component of the convolution has a symmetric $\lambda$-stable distribution, $0 < \lambda \leq 2$. It turns out that the limit behavior changes if the exponent parameter $\lambda$ passes the value one, the case of Cauchy deconvolution.
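A deconvolution kernel density estimator of the kind studied in these papers can be sketched by Fourier inversion: divide the empirical characteristic function of the noisy observations by the (known) error characteristic function, damp with a kernel, and invert. The sketch below assumes Gaussian (supersmooth) errors and the sinc kernel, whose Fourier transform is the indicator of $[-1, 1]$; it is a numerical illustration, not the estimators analyzed in the papers, and `deconv_kde` is a hypothetical name.

```python
import numpy as np

def deconv_kde(x, Y, h, sigma):
    # Deconvolution KDE at points x for data Y = X + eps,
    # eps ~ N(0, sigma^2), bandwidth h, sinc kernel.
    # f_hat(x) = (1/2pi) * int_{|th|<=1} e^{-itx} ecf_Y(t)/phi_eps(t) dt
    s = np.linspace(-1.0, 1.0, 801)              # s = t*h on the kernel support
    t = s / h                                    # frequency grid
    phi_eps = np.exp(-0.5 * (sigma * t) ** 2)    # Gaussian error char. function
    # Dividing by phi_eps blows up when sigma/h is large -- the
    # "supersmooth" difficulty discussed in the abstracts above.
    ecf = np.exp(1j * np.outer(t, np.asarray(Y))).mean(axis=1)
    integrand = np.exp(-1j * np.outer(np.asarray(x), t)) * (ecf / phi_eps)
    dt = t[1] - t[0]
    return np.real(integrand.sum(axis=1)) * dt / (2 * np.pi)
```

With moderate noise (small `sigma / h`) the estimate recovers the shape of the latent density; shrinking `h` too far makes the division by the rapidly decaying `phi_eps` unstable, which is exactly why the bandwidth sequence drives the asymptotics.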
