
New properties for certain positive semidefinite matrices

Added by Minghua Lin
Publication date: 2016
Language: English
Authors: Minghua Lin





We introduce some new notions associated with $2\times 2$ block positive semidefinite matrices. These notions concern inequalities between the singular values of the off-diagonal block and the eigenvalues of the arithmetic mean or geometric mean of the diagonal blocks. We investigate the relations between these notions, and include many examples to illustrate them.
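As a numerical illustration of the kind of inequality the abstract describes, the following NumPy sketch (not taken from the paper) compares the singular values of the off-diagonal block of a random block positive semidefinite matrix with the eigenvalues of the arithmetic mean of its diagonal blocks; the arithmetic-mean bound $s_j(B) \le \lambda_j((A+C)/2)$ is a known singular value inequality for such matrices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Build a random 2x2-block positive semidefinite matrix M = X X^*.
X = rng.standard_normal((2 * n, 2 * n)) + 1j * rng.standard_normal((2 * n, 2 * n))
M = X @ X.conj().T
A, B, C = M[:n, :n], M[:n, n:], M[n:, n:]

# Singular values of the off-diagonal block, in descending order.
sv = np.sort(np.linalg.svd(B, compute_uv=False))[::-1]
# Eigenvalues of the arithmetic mean of the diagonal blocks, descending.
ev = np.sort(np.linalg.eigvalsh((A + C) / 2))[::-1]

print(np.all(sv <= ev + 1e-9))  # True: the arithmetic-mean bound holds
```

The analogous comparison against the geometric mean of the diagonal blocks is the subtler question the paper examines through its examples.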

Related research

The robust PCA of covariance matrices plays an essential role when isolating key explanatory features. The currently available methods for performing such a low-rank plus sparse decomposition are matrix specific, meaning, those algorithms must re-run for every new matrix. Since these algorithms are computationally expensive, it is preferable to learn and store a function that instantaneously performs this decomposition when evaluated. Therefore, we introduce Denise, a deep learning-based algorithm for robust PCA of covariance matrices, or more generally of symmetric positive semidefinite matrices, which learns precisely such a function. Theoretical guarantees for Denise are provided. These include a novel universal approximation theorem adapted to our geometric deep learning problem, convergence to an optimal solution of the learning problem and convergence of the training scheme. Our experiments show that Denise matches state-of-the-art performance in terms of decomposition quality, while being approximately 2000x faster than the state-of-the-art, PCP, and 200x faster than the current speed optimized method, fast PCP.
Not every positive functional defined on bi-variate polynomials of a prescribed degree bound is represented by integration against a positive measure. We isolate a couple of conditions filling this gap, either by restricting the class of polynomials to harmonic ones, or by imposing the vanishing of a defect indicator. Both criteria offer constructive cubature formulas, and they are obtained via well-known matrix analysis techniques involving either the dilation of a contractive matrix to a unitary one or the specific structure of the Hessenberg matrix associated with multiplication by the underlying complex variable.
Palle Jorgensen, Feng Tian (2019)
With a view to applications in stochastic analysis and geometry, we introduce a new correspondence for positive definite (p.d.) kernels $K$ and their associated reproducing kernel Hilbert spaces. With this we establish two kinds of factorizations: (i) Probabilistic: Starting with a positive definite kernel $K$, we analyze associated Gaussian processes $V$. Properties of the Gaussian processes will be derived from certain factorizations of $K$, arising as a covariance kernel of $V$. (ii) Geometric analysis: We discuss families of measure spaces arising as boundaries for $K$. Our results entail an analysis of a partial order on families of p.d. kernels, a duality for operators and frames, optimization, Karhunen–Loève expansions, and factorizations. Applications include a new boundary analysis for the Drury–Arveson kernel, and for certain fractals arising as iterated function systems; and an identification of optimal feature spaces in machine learning models.
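The probabilistic direction (i) can be sketched numerically: factor a p.d. kernel $K = F F^{T}$ and use the factor to drive a Gaussian process whose covariance is $K$. The squared-exponential kernel and the grid below are illustrative assumptions, not choices from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# A positive definite kernel on a small grid (squared-exponential, an assumption).
t = np.linspace(0.0, 1.0, 5)
K = np.exp(-0.5 * (t[:, None] - t[None, :]) ** 2)

# Factorization K = F F^T via Cholesky; F drives the Gaussian process V.
F = np.linalg.cholesky(K + 1e-12 * np.eye(len(t)))
V = F @ rng.standard_normal((len(t), 200_000))  # samples of the process

# The empirical covariance of V recovers K: K is the covariance kernel of V.
K_emp = V @ V.T / V.shape[1]
print(np.max(np.abs(K_emp - K)) < 0.05)  # True
```

Different factorizations of the same $K$ (Cholesky, eigendecomposition, feature maps) yield the same covariance structure, which is the starting point for the paper's analysis.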
We construct a quantum-inspired classical algorithm for computing the permanent of Hermitian positive semidefinite matrices, by exploiting a connection between these mathematical structures and the boson sampling model. Specifically, the permanent of a Hermitian positive semidefinite matrix can be expressed in terms of the expected value of a random variable, which stands for a specific photon-counting probability when measuring a linear-optically evolved random multimode coherent state. Our algorithm then approximates the matrix permanent from the corresponding sample mean and is shown to run in polynomial time for various sets of Hermitian positive semidefinite matrices, achieving a precision that improves over known techniques. This work illustrates how quantum optics may benefit algorithm development.
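The expected-value representation can be illustrated classically via the Gaussian moment (Wick) identity: for $y = Lx$ with $x \sim \mathcal{CN}(0, I)$ and $A = LL^{*}$, one has $\mathbb{E}[\prod_i |y_i|^2] = \mathrm{perm}(A)$. The sketch below (a generic Monte Carlo estimator, not the paper's exact algorithm) compares the sample mean with an exact brute-force permanent for a small, well-conditioned matrix.

```python
import numpy as np
from itertools import permutations

def permanent(A):
    """Exact permanent by summing over permutations (fine for small n)."""
    n = A.shape[0]
    return sum(np.prod([A[i, p[i]] for i in range(n)])
               for p in permutations(range(n)))

rng = np.random.default_rng(3)
n = 3

# A well-conditioned Hermitian positive semidefinite matrix A = L L^*.
X = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = X @ X.conj().T + n * np.eye(n)
L = np.linalg.cholesky(A)

# Monte Carlo: y = L x with x standard complex Gaussian, CN(0, I).
N = 100_000
x = (rng.standard_normal((n, N)) + 1j * rng.standard_normal((n, N))) / np.sqrt(2)
y = L @ x
est = np.mean(np.prod(np.abs(y) ** 2, axis=0))

exact = permanent(A).real  # the permanent of an HPSD matrix is real
print(abs(est - exact) / exact < 0.2)  # True for this sample size
```

The paper's contribution is a sampling scheme of this flavor, phrased through photon-counting probabilities of random coherent states, with analyzed precision and runtime.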
We design a deterministic polynomial time $c^n$ approximation algorithm for the permanent of positive semidefinite matrices, where $c=e^{\gamma+1}\simeq 4.84$. We write a natural convex relaxation and show that its optimum solution gives a $c^n$ approximation of the permanent. We further show that this factor is asymptotically tight by constructing a family of positive semidefinite matrices.