
High dimensional normality of noisy eigenvectors

Added by Jake Marcinek
Publication date: 2020
Language: English





We study joint eigenvector distributions for large symmetric matrices in the presence of weak noise. Our main result asserts that every submatrix of the orthogonal matrix of eigenvectors converges to a multidimensional Gaussian distribution. The proof analyzes the stochastic eigenstate equation (SEE), which describes the Lie-group-valued flow of eigenvectors induced by matrix-valued Brownian motion. We consider the associated colored eigenvector moment flow, an SDE on a particle configuration space. This flow extends the eigenvector moment flow first introduced by Bourgade and Yau (2017) to the multicolor setting; however, it is no longer driven by an underlying Markov process on configuration space, owing to the lack of positivity in the semigroup kernel. Nevertheless, we prove that the dynamics admit sufficient averaged decay and contractive properties. This allows us to establish the optimal relaxation time to equilibrium for the colored eigenvector moment flow and to prove joint asymptotic normality for eigenvectors. Applications in random matrix theory include explicit computations of joint eigenvector distributions for general Wigner-type matrices and sparse graph models when the corresponding eigenvalues lie in the bulk of the spectrum, as well as joint eigenvector distributions for Lévy matrices when the eigenvectors correspond to small energy levels.
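As a rough numerical illustration of the normality phenomenon (not from the paper): for a GOE matrix, an entry of a bulk eigenvector rescaled by $\sqrt{N}$ should be approximately standard Gaussian. A minimal NumPy sketch, with the matrix size and trial count chosen arbitrarily for speed:

```python
import numpy as np

# Monte Carlo check (illustrative only): a rescaled bulk-eigenvector entry
# of a GOE matrix is approximately N(0, 1) for large N.
rng = np.random.default_rng(0)
N, trials = 100, 300
samples = []
for _ in range(trials):
    A = rng.standard_normal((N, N))
    H = (A + A.T) / np.sqrt(2 * N)                # GOE normalization
    _, vecs = np.linalg.eigh(H)
    samples.append(np.sqrt(N) * vecs[0, N // 2])  # entry of a bulk eigenvector
samples = np.array(samples)
print(samples.mean(), samples.var())  # roughly 0 and 1
```

The rescaling by $\sqrt{N}$ reflects that a unit eigenvector's entries have variance $1/N$; the Gaussian shape of the empirical distribution is the content of the asymptotic normality statements above.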



Related research

We prove a Berry-Esseen theorem for the number counting function of the circular $\beta$-ensemble (C$\beta$E), which implies a central limit theorem for the number of points in arcs. We prove the main result by estimating the characteristic functions of the Prüfer phases and of the number counting function, which yields uniform upper and lower bounds on their variance. We also show that similar results hold for the $\mathrm{Sine}_\beta$ process. As a direct application of the uniform variance bound, we prove normality of the linear statistics when the test function $f(\theta)\in W^{1,p}(S^1)$ for some $p\in(1,+\infty)$.
We prove localization with high probability on sets of size of order $N/\log N$ for the eigenvectors of non-Hermitian finitely banded $N\times N$ Toeplitz matrices $P_N$ subject to small random perturbations, in a very general setting. As perturbation we consider $N\times N$ random matrices with independent entries of zero mean, finite moments, and which satisfy an appropriate anti-concentration bound. We show via a Grushin problem that an eigenvector for a given eigenvalue $z$ is well approximated by a random linear combination of the singular vectors of $P_N-z$ corresponding to its small singular values. We prove precise probabilistic bounds on the local distribution of the eigenvalues of the perturbed matrix and provide a detailed analysis of the singular vectors to conclude the localization result.
Multidimensional Scaling (MDS) is a classical technique for embedding data in low dimensions, still in widespread use today. Originally introduced in the 1950s, MDS was not designed with high-dimensional data in mind; while it remains popular with data analysis practitioners, it should be adapted to the high-dimensional data regime. In this paper we study MDS in a modern setting, specifically under high dimensions and ambient measurement noise. We show that, as the ambient noise level increases, MDS suffers a sharp breakdown that depends on the data dimension and noise level, and we derive an explicit formula for this breakdown point in the case of white noise. We then introduce MDS+, an extremely simple variant of MDS, which applies a carefully derived shrinkage nonlinearity to the eigenvalues of the MDS similarity matrix. Under a loss function measuring embedding quality, MDS+ is the unique asymptotically optimal shrinkage function. We prove that MDS+ offers improved embedding, sometimes significantly so, compared with classical MDS. Furthermore, MDS+ does not require an external estimate of the embedding dimension (a famous difficulty in classical MDS), as it calculates the optimal dimension into which the data should be embedded.
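For reference, classical MDS itself is a short computation: double-center the squared distance matrix and keep the top eigenpairs. The sketch below (plain NumPy; the MDS+ shrinkage nonlinearity derived in the paper is not reproduced) recovers a low-dimensional embedding whose pairwise distances match the noiseless data exactly:

```python
import numpy as np

def classical_mds(X, k):
    """Classical MDS: embed rows of X into k dimensions via the
    double-centered squared-distance matrix."""
    D2 = np.square(np.linalg.norm(X[:, None] - X[None, :], axis=-1))
    n = len(X)
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ D2 @ J               # centered Gram matrix
    w, V = np.linalg.eigh(B)
    idx = np.argsort(w)[::-1][:k]       # top-k eigenpairs
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0))

rng = np.random.default_rng(1)
X = rng.standard_normal((20, 2)) @ rng.standard_normal((2, 5))  # rank-2 data in 5D
Y = classical_mds(X, 2)
# the embedding preserves pairwise distances (up to rotation)
d_orig = np.linalg.norm(X[0] - X[1])
d_emb = np.linalg.norm(Y[0] - Y[1])
print(abs(d_orig - d_emb))  # numerically zero for noiseless rank-2 data
```

MDS+ would replace the eigenvalues `w[idx]` with a shrunken version before taking square roots; the precise shrinker is the paper's contribution and is not attempted here.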
We present a basis of eigenvectors for the graph-building operators acting along the mirror channel of planar fishnet Feynman integrals in $d$ dimensions. The eigenvectors of a fishnet lattice of length $L$ depend on a set of $L$ quantum numbers $(u_k,l_k)$, each associated with the rapidity and bound-state index of a lattice excitation. Each excitation is a particle in $(1+1)$ dimensions with $O(d)$ internal symmetry, and the wave functions are formally constructed with a set of creation/annihilation operators that satisfy the corresponding Zamolodchikov-Faddeev algebra. These properties are proved via the representation (new to our knowledge) of the matrix elements of the fused R-matrix with $O(d)$ symmetry as integral operators on functions of two spacetime points. The spectral decomposition of a fishnet integral achieved here can be applied to the computation of Basso-Dixon integrals in higher dimensions.
We study the number of facets of the convex hull of n independent standard Gaussian points in d-dimensional Euclidean space. In particular, we are interested in the expected number of facets when the dimension is allowed to grow with the sample size. We establish an explicit asymptotic formula that is valid whenever d/n tends to zero. We also obtain the asymptotic value when d is close to n.
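As a quick empirical companion (not from the paper): facet counts of Gaussian polytopes are easy to explore with Qhull. For Gaussian points in general position in $d=3$, every facet is a triangle, so Euler's relation forces $F = 2V - 4$, where $V$ is the number of hull vertices. An illustrative sketch with SciPy:

```python
import numpy as np
from scipy.spatial import ConvexHull

# Count facets of the convex hull of n standard Gaussian points in R^3.
rng = np.random.default_rng(2)
n, d = 50, 3
pts = rng.standard_normal((n, d))
hull = ConvexHull(pts)
F, V = len(hull.simplices), len(hull.vertices)
# For points in general position (almost surely for Gaussians),
# the hull is simplicial and F = 2V - 4 by Euler's relation.
print(F, V)
```

Averaging `F` over many draws, while letting $d$ grow with $n$, is the quantity whose asymptotics the paper determines in the regime $d/n \to 0$.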