
Spectral linear matrix inequalities

Published by Mario Kummer
Publication date: 2020
Research field:
Language: English
Author: Mario Kummer





We prove, under a certain representation theoretic assumption, that the set of real symmetric matrices, whose eigenvalues satisfy a linear matrix inequality, is itself a spectrahedron. The main application is that derivative relaxations of the positive semidefinite cone are spectrahedra. From this we further deduce statements on their Wronskians. These imply that Newton's inequalities, as well as a strengthening of the correlation inequalities for hyperbolic polynomials, can be expressed as sums of squares.
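For orientation, the objects in this abstract can be spelled out in standard notation (these are textbook definitions, not notation taken from the paper itself):

\[
  S \;=\; \bigl\{\, x \in \mathbb{R}^n : A_0 + x_1 A_1 + \dots + x_n A_n \succeq 0 \,\bigr\}
\]

is the spectrahedron cut out by the linear matrix inequality with fixed real symmetric matrices $A_0, \dots, A_n$. The theorem then concerns sets of the form

\[
  \bigl\{\, X \in \mathrm{Sym}_d(\mathbb{R}) : \lambda(X) \in S \,\bigr\},
\]

where $\lambda(X)$ is the eigenvalue vector of $X$ (assuming $S$ is invariant under permuting coordinates, so the ordering of the eigenvalues is immaterial); the result states that such a set is again a spectrahedron.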




Read also

Given two symmetric and positive semidefinite square matrices $A, B$, is it true that any matrix given as the product of $m$ copies of $A$ and $n$ copies of $B$ in a particular sequence must be dominated in the spectral norm by the ordered matrix product $A^m B^n$? For example, is $$ \|AABAABABB\| \leq \|AAAAABBBB\| ? $$ Drury has characterized precisely which disordered words have the property that an inequality of this type holds for all matrices $A, B$. However, the $1$-parameter family of counterexamples Drury constructs for these characterizations is comprised of $3 \times 3$ matrices, and thus as stated the characterization applies only for $N \times N$ matrices with $N \geq 3$. In contrast, we prove that for $2 \times 2$ matrices, the general rearrangement inequality holds for all disordered words. We also show that for larger $N \times N$ matrices, the general rearrangement inequality holds for all disordered words, for most $A, B$ (in a sense of full measure) that are sufficiently small perturbations of the identity.
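The $2 \times 2$ case claimed above can be probed numerically (an illustrative sanity check, not the paper's proof; `random_psd`, `word_product`, and the sampling loop are our own scaffolding):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_psd(n):
    # Random symmetric positive semidefinite matrix via M^T M.
    m = rng.standard_normal((n, n))
    return m.T @ m

def word_product(word, A, B):
    # Multiply the matrices named by the word, left to right.
    P = np.eye(A.shape[0])
    for letter in word:
        P = P @ (A if letter == "A" else B)
    return P

# Compare a disordered word against its ordered rearrangement A^5 B^4.
for _ in range(200):
    A, B = random_psd(2), random_psd(2)
    disordered = np.linalg.norm(word_product("AABAABABB", A, B), 2)
    ordered = np.linalg.norm(word_product("AAAAABBBB", A, B), 2)
    assert disordered <= ordered * (1 + 1e-9)
```

Consistent with the stated theorem, no violation turns up for random $2 \times 2$ samples; the result guarantees this for all such pairs, which random sampling alone of course cannot.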
This paper addresses the development of a covariance matrix self-adaptation evolution strategy (CMSA-ES) for solving optimization problems with linear constraints. The proposed algorithm is referred to as Linear Constraint CMSA-ES (lcCMSA-ES). It uses a specially built mutation operator together with repair by projection to satisfy the constraints. The lcCMSA-ES evolves itself on a linear manifold defined by the constraints. The objective function is only evaluated at feasible search points (interior point method). This is a property often required in application domains such as simulation optimization and finite element methods. The algorithm is tested on a variety of different test problems, yielding promising results.
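Repair by projection, as mentioned above, can be sketched minimally: project an infeasible point onto the affine manifold $\{y : Cy = d\}$ defined by the linear constraints (the names `C`, `d`, and `repair` are illustrative, not the paper's API):

```python
import numpy as np

def repair(x, C, d):
    """Project x onto the affine manifold {y : C y = d} (minimal Euclidean move)."""
    # The correction lies in the row space of C: x - C^+ (C x - d).
    return x - np.linalg.pinv(C) @ (C @ x - d)

# Example: one equality constraint x0 + x1 + x2 = 1 in R^3.
C = np.array([[1.0, 1.0, 1.0]])
d = np.array([1.0])
y = repair(np.array([2.0, 0.0, 0.0]), C, d)
assert np.allclose(C @ y, d)  # y is feasible: y = [5/3, -1/3, -1/3]
```

The pseudoinverse gives the closest feasible point in Euclidean norm, which is the natural choice when the mutation distribution should be perturbed as little as possible.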
In this paper we show how to recover spectral approximations to broad classes of structured matrices using only a polylogarithmic number of adaptive linear measurements of either the matrix or its inverse. Leveraging this result we obtain faster algorithms for a variety of linear algebraic problems. Key results include:
- A nearly linear time algorithm for solving the inverse of symmetric $M$-matrices, a strict superset of Laplacians and SDD matrices.
- An $\tilde{O}(n^2)$ time algorithm for solving $n \times n$ linear systems that are constant spectral approximations of Laplacians or, more generally, SDD matrices.
- An $\tilde{O}(n^2)$ algorithm to recover a spectral approximation of an $n$-vertex graph using only $\tilde{O}(1)$ matrix-vector multiplies with its Laplacian matrix.
The previous best results for each problem either used a trivial number of queries to exactly recover the matrix or a trivial $O(n^\omega)$ running time, where $\omega$ is the matrix multiplication constant. We achieve these results by generalizing recent semidefinite programming based linear sized sparsifier results of Lee and Sun (2017) and providing iterative methods inspired by the semistreaming sparsification results of Kapralov, Lee, Musco, Musco and Sidford (2014) and the input sparsity time linear system solving results of Li, Miller, and Peng (2013). We hope that by initiating the study of these natural problems, expanding the robustness and scope of recent nearly linear time linear system solving research, and providing general matrix recovery machinery, this work may serve as a stepping stone for faster algorithms.
Pham H. Tiep, Van H. Vu (2015)
In 1943, Littlewood and Offord proved the first anti-concentration result for sums of independent random variables. Their result has since been strengthened and generalized by generations of researchers, with applications in several areas of mathematics. In this paper, we present the first non-abelian analogue of the Littlewood-Offord result, a sharp anti-concentration inequality for products of independent random variables.
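The classical abelian Littlewood-Offord bound that this work generalizes can be verified exactly for small $n$ by enumeration (a toy check of the background result only, not of the paper's non-abelian theorem):

```python
from itertools import product
from math import comb
from collections import Counter

n = 10
a = [1] * n  # the extremal case: all coefficients equal
# Distribution of S = sum_i eps_i * a_i over all 2^n sign choices.
counts = Counter(sum(s * ai for s, ai in zip(signs, a))
                 for signs in product([-1, 1], repeat=n))
max_atom = max(counts.values()) / 2 ** n
bound = comb(n, n // 2) / 2 ** n  # Littlewood-Offord bound on any single atom
assert max_atom <= bound + 1e-12  # equality is attained for equal coefficients
```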
A central tool in the study of nonhomogeneous random matrices, the noncommutative Khintchine inequality of Lust-Piquard and Pisier, yields a nonasymptotic bound on the spectral norm of general Gaussian random matrices $X = \sum_i g_i A_i$ where $g_i$ are independent standard Gaussian variables and $A_i$ are matrix coefficients. This bound exhibits a logarithmic dependence on dimension that is sharp when the matrices $A_i$ commute, but often proves to be suboptimal in the presence of noncommutativity. In this paper, we develop nonasymptotic bounds on the spectrum of arbitrary Gaussian random matrices that can capture noncommutativity. These bounds quantify the degree to which the deterministic matrices $A_i$ behave as though they are freely independent. This intrinsic freeness phenomenon provides a powerful tool for the study of various questions that are outside the reach of classical methods of random matrix theory. Our nonasymptotic bounds are easily applicable in concrete situations, and yield sharp results in examples where the noncommutative Khintchine inequality is suboptimal. When combined with a linearization argument, our bounds imply strong asymptotic freeness (in the sense of Haagerup-Thorbjørnsen) for a remarkably general class of Gaussian random matrix models, including matrices that may be very sparse and that lack any special symmetries. Beyond the Gaussian setting, we develop matrix concentration inequalities that capture noncommutativity for general sums of independent random matrices, which arise in many problems of pure and applied mathematics.
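The commuting case where the logarithmic factor is sharp can be illustrated numerically (our own toy example, not taken from the paper): with diagonal coefficients $A_i = e_i e_i^T$, the series $X = \sum_i g_i A_i$ is simply $\mathrm{diag}(g_1, \dots, g_d)$, so $\|X\| = \max_i |g_i| \approx \sqrt{2 \log d}$, well above the variance proxy $\sigma = \|\sum_i A_i^2\|^{1/2} = 1$.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 100
# Commuting coefficients A_i = e_i e_i^T give X = diag(g_1, ..., g_d),
# so the spectral norm of X is just the largest |g_i|.
sigma = 1.0  # sigma = || sum_i A_i^2 ||^{1/2} = ||I||^{1/2} = 1
norms = [np.max(np.abs(rng.standard_normal(d))) for _ in range(200)]
mean_norm = float(np.mean(norms))
# The average norm sits between sigma and a sqrt(log d)-sized multiple of it,
# reflecting the dimensional factor in the noncommutative Khintchine bound.
assert sigma < mean_norm < 3 * sigma * np.sqrt(2 * np.log(d))
```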