
Sparse random tensors: Concentration, regularization and applications

Added by Yizhe Zhu
Publication date: 2019
Language: English





We prove a non-asymptotic concentration inequality for the spectral norm of sparse inhomogeneous random tensors with Bernoulli entries. For an order-$k$ inhomogeneous random tensor $T$ with sparsity $p_{\max}\geq \frac{c\log n}{n}$, we show that $\|T-\mathbb{E} T\|=O(\sqrt{n p_{\max}}\log^{k-2}(n))$ with high probability. The optimality of this bound up to polylog factors is provided by an information-theoretic lower bound. By tensor unfolding, we extend the range of sparsity to $p_{\max}\geq \frac{c\log n}{n^{m}}$ with $1\leq m\leq k-1$ and obtain concentration inequalities for different sparsity regimes. We also provide a simple way to regularize $T$ such that $O(\sqrt{n^{m}p_{\max}})$ concentration still holds down to sparsity $p_{\max}\geq \frac{c}{n^{m}}$ with $k/2\leq m\leq k-1$. We present our concentration and regularization results with two applications: (i) a randomized construction of hypergraphs of bounded degrees with good expander mixing properties, (ii) concentration of sparsified tensors under uniform sampling.
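As an illustration of the quantities above, here is a minimal numpy sketch (not taken from the paper) that draws a homogeneous order-$k$ Bernoulli tensor, centers it at its mean, and compares the operator norm of a matrix unfolding against the $\sqrt{n p_{\max}}\log^{k-2}(n)$ scale. The choices of $n$, $k$ and $p_{\max}$, and the use of the unfolding norm as a stand-in for the tensor spectral norm, are assumptions made for illustration.

```python
import numpy as np

def sparse_bernoulli_tensor(n, k, p_max, rng):
    """Homogeneous special case: order-k tensor with i.i.d. Bernoulli(p_max) entries."""
    return (rng.random(size=(n,) * k) < p_max).astype(float)

def unfolding_norm(T, m):
    """Operator norm of the n^m-by-n^(k-m) matrix unfolding of T."""
    n, k = T.shape[0], T.ndim
    return np.linalg.norm(T.reshape(n**m, n**(k - m)), ord=2)

rng = np.random.default_rng(0)
n, k = 30, 3                          # illustrative size and order
p_max = 2 * np.log(n) / n             # sparsity at the c*log(n)/n threshold
T = sparse_bernoulli_tensor(n, k, p_max, rng)
centered = T - p_max                  # E T = p_max entrywise in this homogeneous case

print("unfolding norm of T - E T :", unfolding_norm(centered, 1))
print("sqrt(n p_max) log^(k-2)(n):", np.sqrt(n * p_max) * np.log(n) ** (k - 2))
```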



Related research

Xinjia Chen (2013)
We derive simple concentration inequalities for bounded random vectors, which generalize Hoeffding's inequalities for bounded scalar random variables. As applications, we apply the general results to multinomial and Dirichlet distributions to obtain multivariate concentration inequalities.
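
A quick way to see this kind of concentration empirically is to simulate multinomial frequency vectors and track how their Euclidean distance from the true probability vector shrinks with the sample size. The sketch below is only an illustration of that behaviour, not the paper's inequality; the probability vector, sample sizes and number of repetitions are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)
p = np.array([0.5, 0.3, 0.2])                       # illustrative category probabilities

for N in (100, 1_000, 10_000):
    freqs = rng.multinomial(N, p, size=2000) / N    # 2000 empirical frequency vectors
    dev = np.linalg.norm(freqs - p, axis=1)         # Euclidean deviation of each vector
    # Hoeffding-type behaviour: deviations shrink on the 1/sqrt(N) scale
    print(f"N={N:>6}  mean ||p_hat - p|| = {dev.mean():.4f}   1/sqrt(N) = {1/np.sqrt(N):.4f}")
```
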
We study the spectrum of a random multigraph with a degree sequence $\mathbf{D}_n=(D_i)_{i=1}^n$ and average degree $1 \ll \omega_n \ll n$, generated by the configuration model, and also the spectrum of the analogous random simple graph. We show that, when the empirical spectral distribution (ESD) of $\omega_n^{-1} \mathbf{D}_n$ converges weakly to a limit $\nu$, under mild moment assumptions (e.g., $D_i/\omega_n$ are i.i.d. with a finite second moment), the ESD of the normalized adjacency matrix converges in probability to $\nu\boxtimes \sigma_{\rm sc}$, the free multiplicative convolution of $\nu$ with the semicircle law. Relating this limit with a variant of the Marchenko-Pastur law yields the continuity of its density (away from zero), and an effective procedure for determining its support. Our proof of convergence is based on a coupling between the random simple graph and multigraph with the same degrees, which might be of independent interest. We further construct and rely on a coupling of the multigraph to an inhomogeneous Erdős-Rényi graph with the target ESD, using three intermediate random graphs, with a negligible fraction of edges modified in each step.
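
To see the limiting ESD emerge numerically, one can sample a configuration-model multigraph with networkx, normalize its adjacency spectrum on the $\sqrt{\omega_n}$ scale, and histogram the eigenvalues. The Poisson degree choice, graph size and normalization in the sketch below are illustrative assumptions; for nearly regular degrees the histogram should resemble the semicircle law, consistent with the $\nu\boxtimes\sigma_{\rm sc}$ limit.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(2)
n, omega = 2000, 40
degrees = rng.poisson(omega, size=n)            # illustrative degree sequence of order omega
if degrees.sum() % 2:                           # configuration model needs an even degree sum
    degrees[0] += 1

G = nx.configuration_model(degrees.tolist(), seed=3)   # random multigraph with these degrees
A = nx.to_numpy_array(G)                                # parallel edges add up in the entries
eigs = np.linalg.eigvalsh(A) / np.sqrt(omega)           # normalize on the sqrt(average degree) scale

hist, edges = np.histogram(eigs, bins=60, density=True) # empirical spectral distribution
print("fraction of normalized eigenvalues in [-2, 2]:", np.mean(np.abs(eigs) <= 2))
```
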
Vivek Dewan (2020)
Let $T$ be a random ergodic pseudometric over $\mathbb{R}^d$. This setting generalizes the classical first passage percolation (FPP) over $\mathbb{Z}^d$. We provide simple conditions on $T$, the decay of instant one-arms and exponential quasi-independence, that ensure the positivity of its time constants, that is, almost surely, the pseudo-distance given by $T$ from the origin is asymptotically a norm. Combining this general result with previously known ones, we prove that: the known phase transition for Gaussian percolation, in the case of fields with positive correlations decaying exponentially fast, holds for Gaussian FPP, including the natural Bargmann-Fock model; the known phase transition for Voronoi percolation also extends to the associated FPP; the same happens for Boolean percolation for radii with exponential tails, a result which was known without this condition. We prove the positivity of the constant for random continuous Riemannian metrics, including cases with infinite correlations in dimension $d=2$. Finally, we show that the critical exponent for the one-arm, if it exists, is bounded above by $d-1$. This holds for bond Bernoulli percolation, planar Gaussian fields, planar Voronoi percolation, and Boolean percolation with exponentially small tails.
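
The time constant in question can be visualized in the classical $\mathbb{Z}^d$ setting that this framework generalizes. The sketch below (a finite-box approximation with i.i.d. Exp(1) edge times, both illustrative choices) runs Dijkstra from the origin on a box of $\mathbb{Z}^2$ and checks that $T(0, n e_1)/n$ settles near a strictly positive value.

```python
import heapq
import numpy as np

def fpp_passage_times(L, rng):
    """Passage times from the origin on an L x L box of Z^2 with i.i.d. Exp(1) edge times,
    computed by Dijkstra's algorithm (all edge weights are positive)."""
    w_right = rng.exponential(1.0, size=(L, L))   # weight of edge (i, j)-(i+1, j)
    w_up = rng.exponential(1.0, size=(L, L))      # weight of edge (i, j)-(i, j+1)
    dist = np.full((L, L), np.inf)
    dist[0, 0] = 0.0
    heap = [(0.0, 0, 0)]
    while heap:
        d, i, j = heapq.heappop(heap)
        if d > dist[i, j]:
            continue
        steps = []                                # neighbours with their connecting edge weights
        if i + 1 < L:  steps.append((i + 1, j, w_right[i, j]))
        if i - 1 >= 0: steps.append((i - 1, j, w_right[i - 1, j]))
        if j + 1 < L:  steps.append((i, j + 1, w_up[i, j]))
        if j - 1 >= 0: steps.append((i, j - 1, w_up[i, j - 1]))
        for ni, nj, w in steps:
            if d + w < dist[ni, nj]:
                dist[ni, nj] = d + w
                heapq.heappush(heap, (d + w, ni, nj))
    return dist

rng = np.random.default_rng(6)
L = 200
dist = fpp_passage_times(L, rng)
# Positivity of the time constant: T(0, n e_1)/n stabilizes at a strictly positive value.
for n in (50, 100, 199):
    print(f"T(0, {n} e1)/{n} = {dist[n, 0] / n:.3f}")
```
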
In this article, we analyze Hamiltonian Monte Carlo (HMC) by placing it in the setting of Riemannian geometry using the Jacobi metric, so that each step corresponds to a geodesic on a suitable Riemannian manifold. We then combine the notion of curvature of a Markov chain due to Joulin and Ollivier with the classical sectional curvature from Riemannian geometry to derive error bounds for HMC in important cases, where we have positive curvature. These cases include several classical distributions such as multivariate Gaussians, and also distributions arising in the study of Bayesian image registration. The theoretical development suggests the sectional curvature as a new diagnostic tool for convergence for certain Markov chains.
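
For concreteness, here is a generic Euclidean-metric HMC sampler with leapfrog integration applied to a standard bivariate Gaussian, one of the positive-curvature targets mentioned above. This is a textbook sketch under arbitrary tuning choices, not the Jacobi-metric geodesic formulation analyzed in the article.

```python
import numpy as np

def hmc_step(x, log_prob, grad_log_prob, eps, n_leap, rng):
    """One HMC step: momentum resampling, leapfrog integration, Metropolis correction."""
    p = rng.standard_normal(x.shape)
    x_new, p_new = x.copy(), p.copy()
    p_new += 0.5 * eps * grad_log_prob(x_new)        # initial half step for momentum
    for _ in range(n_leap):
        x_new += eps * p_new                         # full step for position
        p_new += eps * grad_log_prob(x_new)          # full step for momentum
    p_new -= 0.5 * eps * grad_log_prob(x_new)        # trim the last update back to a half step
    h_old = -log_prob(x) + 0.5 * p @ p               # Hamiltonian before the move
    h_new = -log_prob(x_new) + 0.5 * p_new @ p_new   # Hamiltonian after the move
    return x_new if np.log(rng.random()) < h_old - h_new else x

log_prob = lambda x: -0.5 * x @ x                    # standard bivariate Gaussian target
grad_log_prob = lambda x: -x

rng = np.random.default_rng(4)
x, samples = np.zeros(2), []
for _ in range(5000):
    x = hmc_step(x, log_prob, grad_log_prob, eps=0.2, n_leap=10, rng=rng)
    samples.append(x)
print("sample covariance (should be close to the identity):\n", np.cov(np.array(samples).T))
```
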
Boris Landa (2020)
It is well known that any positive matrix can be scaled to have prescribed row and column sums by multiplying its rows and columns by certain positive scaling factors (which are unique up to a positive scalar). This procedure is known as matrix scaling, and has found numerous applications in operations research, economics, image processing, and machine learning. In this work, we investigate the behavior of the scaling factors and the resulting scaled matrix when the matrix to be scaled is random. Specifically, letting $\widetilde{A}\in\mathbb{R}^{M\times N}$ be a positive and bounded random matrix whose entries assume a certain type of independence, we provide a concentration inequality for the scaling factors of $\widetilde{A}$ around those of $A = \mathbb{E}[\widetilde{A}]$. This result is employed to bound the convergence rate of the scaling factors of $\widetilde{A}$ to those of $A$, as well as the concentration of the scaled version of $\widetilde{A}$ around the scaled version of $A$ in operator norm, as $M,N\rightarrow\infty$. When the entries of $\widetilde{A}$ are independent, $M=N$, and all prescribed row and column sums are $1$ (i.e., doubly-stochastic matrix scaling), both of the previously-mentioned bounds are $\mathcal{O}(\sqrt{\log N / N})$ with high probability. We demonstrate our results in several simulations.
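
The doubly-stochastic case can be explored with a short Sinkhorn-style alternating-normalization sketch. This is a standard matrix-scaling illustration under assumed choices (entry distribution, size N, iteration cap and tolerance), not the construction used in the paper; it scales a positive random matrix and reports how far the result is from unit row and column sums.

```python
import numpy as np

def sinkhorn_scale(A, n_iter=1000, tol=1e-12):
    """Alternate row/column normalization so that diag(r) @ A @ diag(c) is
    approximately doubly stochastic (Sinkhorn iteration for a positive matrix)."""
    c = np.ones(A.shape[1])
    for _ in range(n_iter):
        r = 1.0 / (A @ c)                      # row scaling factors
        c = 1.0 / (A.T @ r)                    # column scaling factors
        S = (r[:, None] * A) * c[None, :]      # current scaled matrix
        if np.abs(S.sum(axis=1) - 1.0).max() < tol:
            break
    return r, c, S

rng = np.random.default_rng(5)
N = 500
A_tilde = 0.5 + rng.random((N, N))             # positive, bounded random matrix
r, c, S = sinkhorn_scale(A_tilde)
print("max row-sum error:", np.abs(S.sum(axis=1) - 1.0).max())
print("max col-sum error:", np.abs(S.sum(axis=0) - 1.0).max())
```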