Non-Gaussian component analysis (NGCA) is a problem in multidimensional data analysis which, since its formulation in 2006, has attracted considerable attention in statistics and machine learning. In this problem, we have a random variable $X$ in $n$-dimensional Euclidean space. There is an unknown subspace $\Gamma$ of the $n$-dimensional Euclidean space such that the orthogonal projection of $X$ onto $\Gamma$ is standard multidimensional Gaussian and the orthogonal projection of $X$ onto $\Gamma^{\perp}$, the orthogonal complement of $\Gamma$, is non-Gaussian, in the sense that all its one-dimensional marginals differ from the Gaussian in a certain metric defined in terms of moments. The NGCA problem is to approximate the non-Gaussian subspace $\Gamma^{\perp}$ given samples of $X$. Vectors in $\Gamma^{\perp}$ correspond to ``interesting'' directions, whereas vectors in $\Gamma$ correspond to directions where the data is very noisy. The most interesting applications of the NGCA model arise when the magnitude of the noise is comparable to that of the true signal, a setting in which traditional noise-reduction techniques such as PCA do not apply directly. NGCA is also related to dimension reduction and to other data analysis problems such as ICA. NGCA-like problems have been studied in statistics for a long time using techniques such as projection pursuit. We give an algorithm that takes polynomial time in the dimension $n$ and has an inverse polynomial dependence on the error parameter measuring the angle distance between the non-Gaussian subspace and the subspace output by the algorithm. Our algorithm uses relative entropy as the contrast function and fits under the projection pursuit framework. The techniques we develop for analyzing our algorithm may be of use for other related problems.
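To make the model concrete, the following sketch draws samples from a toy NGCA instance and recovers the non-Gaussian direction. Everything here is illustrative: the dimensions, the uniform non-Gaussian marginal, and the fourth-moment contrast are assumptions for demonstration, not the relative-entropy algorithm the abstract describes.

```python
import numpy as np

rng = np.random.default_rng(0)
n, N = 4, 20000

# Toy NGCA instance: Gamma^perp = span(e_1) carries a non-Gaussian
# (uniform) marginal with unit variance; the remaining n-1 coordinates
# (Gamma) are standard Gaussian noise.
X = rng.standard_normal((N, n))
X[:, 0] = rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), size=N)  # var 1, kurtosis != 3

# Fourth-moment contrast: M = E[||x||^2 x x^T]. For a pure standard
# Gaussian, every eigenvalue of M equals n + 2; eigenvectors whose
# eigenvalues deviate from n + 2 span the non-Gaussian subspace.
M = (X * (X ** 2).sum(axis=1, keepdims=True)).T @ X / N
vals, vecs = np.linalg.eigh(M)
v = vecs[:, np.argmax(np.abs(vals - (n + 2)))]  # most non-Gaussian direction

print(abs(v[0]))  # alignment with the true non-Gaussian direction e_1
```

Note that PCA alone cannot separate the subspaces here: every coordinate has unit variance, so the covariance matrix is (approximately) the identity and carries no directional information.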
The entropy of a quantum system is a measure of its randomness and has applications in measuring quantum entanglement. We study the problem of measuring the von Neumann entropy, $S(\rho)$, and Rényi entropy, $S_\alpha(\rho)$, of an unknown mixed quantum
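For context, both quantities are simple functions of the spectrum of the density matrix $\rho$. The sketch below just evaluates them numerically for a known $\rho$; it illustrates the definitions only, not the measurement procedures for an unknown state that the abstract concerns.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # convention: 0 log 0 = 0
    return float(-np.sum(evals * np.log(evals)))

def renyi_entropy(rho, alpha):
    """S_alpha(rho) = log(Tr rho^alpha) / (1 - alpha), for alpha != 1."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(np.log(np.sum(evals ** alpha)) / (1.0 - alpha))

# Maximally mixed qubit: both entropies equal log 2.
rho = np.eye(2) / 2.0
print(von_neumann_entropy(rho), renyi_entropy(rho, 2.0))
```

A pure state (rank-one $\rho$) gives entropy 0 under both definitions, and $S_\alpha \to S$ as $\alpha \to 1$.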
Functional principal component analysis (FPCA) can become invalid when the data involve non-Gaussian features. Therefore, we aim to develop a general FPCA method that adapts to such non-Gaussian cases. A Kendall's $\tau$ function, which possesses identical e
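The appeal of Kendall's $\tau$ in non-Gaussian settings is that, for elliptical distributions, $\sin(\pi\tau/2)$ recovers the linear correlation even under heavy tails. The sketch below demonstrates that identity on bivariate $t$-distributed data; it is a generic illustration of the $\tau$-based idea, with made-up parameters, not the FPCA method of the abstract.

```python
import numpy as np

def kendall_tau(x, y):
    """Kendall's tau by direct O(n^2) pair counting (fine for small n)."""
    n = len(x)
    s = 0.0
    for i in range(n - 1):
        s += np.sum(np.sign(x[i] - x[i + 1:]) * np.sign(y[i] - y[i + 1:]))
    return 2.0 * s / (n * (n - 1))

# Bivariate t (df = 3): heavy-tailed elliptical data whose linear
# correlation is 1/sqrt(5) by construction of the scatter matrix.
rng = np.random.default_rng(0)
N, df = 400, 3.0
L = np.array([[2.0, 0.0], [1.0, 1.0]])    # scatter L^T L = [[5, 1], [1, 1]]
g = rng.standard_normal((N, 2)) @ L
X = g / np.sqrt(rng.chisquare(df, N) / df)[:, None]

# sin(pi * tau / 2) estimates the correlation robustly despite the
# heavy tails that would distort a moment-based sample correlation.
tau = kendall_tau(X[:, 0], X[:, 1])
print(np.sin(np.pi * tau / 2.0), 1.0 / np.sqrt(5.0))
```

Rank-based statistics like $\tau$ depend only on orderings, so they are unaffected by the heavy-tailed radial part of an elliptical law; this is what makes $\tau$-based scatter matrices a natural robust substitute for the covariance.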
Functional principal component analysis is essential in functional data analysis, but its inferences become unconvincing when non-Gaussian characteristics occur, such as heavy tails and skewness. The focus of this paper is to develop a robus
We revisit the problem of estimating the mean of a real-valued distribution, presenting a novel estimator with sub-Gaussian convergence: intuitively, our estimator, on any distribution, is as accurate as the sample mean is for the Gaussian distributi
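A classical estimator with sub-Gaussian-style guarantees on heavy-tailed data is median-of-means; the sketch below shows that baseline for context. It is not the novel estimator the abstract introduces, and the block count and sample sizes are arbitrary choices.

```python
import numpy as np

def median_of_means(x, k=10):
    """Median-of-means: split the sample into k blocks, average each
    block, and return the median of the block means. Outliers can ruin
    at most a few blocks, so the median stays accurate."""
    x = np.asarray(x, dtype=float)
    rng = np.random.default_rng(0)
    x = rng.permutation(x)               # randomize block assignment
    blocks = np.array_split(x, k)
    return float(np.median([b.mean() for b in blocks]))

# Heavy-tailed sample (Student t, df = 2.5): true mean is 0, but the
# infinite higher moments make the plain sample mean high-variance.
rng = np.random.default_rng(1)
sample = rng.standard_t(df=2.5, size=10000)
print(median_of_means(sample))
```

For a Gaussian sample the plain mean is already optimal; the point of sub-Gaussian estimators is to match that Gaussian-like accuracy on distributions where the empirical mean degrades.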
Performance of nuclear threat detection systems based on gamma-ray spectrometry often strongly depends on the ability to identify the part of measured signal that can be attributed to background radiation. We have successfully applied a method based