
Metrics Induced by Jensen-Shannon and Related Divergences on Positive Definite Matrices

Authors: Suvrit Sra
Publication date: 2019
Language: English





We study metric properties of symmetric divergences on Hermitian positive definite matrices. In particular, we prove that the square root of these divergences is a distance metric. As a corollary we obtain a proof of the metric property for Quantum Jensen-Shannon-(Tsallis) divergences (parameterized by $\alpha \in [0,2]$), which in turn (for $\alpha=1$) yields a proof of the metric property of the Quantum Jensen-Shannon divergence that was conjectured by Lamberti \emph{et al.} a decade ago (\emph{Metric character of the quantum Jensen-Shannon divergence}, Phys. Rev. A, \textbf{79}, (2008)). A somewhat more intricate argument also establishes metric properties of Jensen-Rényi divergences (for $\alpha \in (0,1)$), and outlines a technique that may be of independent interest.
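Since the abstract comes with no code, a minimal numerical sketch may help make the $\alpha=1$ statement concrete: compute the quantum Jensen-Shannon divergence $D_{JS}(A,B) = S((A+B)/2) - (S(A)+S(B))/2$, with $S$ the von Neumann entropy, and spot-check the triangle inequality for its square root. All names here are our illustration (assuming numpy); a numerical check is of course not a substitute for the paper's proof.

```python
import numpy as np

def random_density_matrix(n, rng):
    """Random Hermitian positive definite matrix with unit trace."""
    X = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    A = X @ X.conj().T + n * np.eye(n)      # Hermitian positive definite
    return A / np.trace(A).real

def von_neumann_entropy(A):
    """S(A) = -Tr(A log A), computed from the eigenvalues of A."""
    w = np.linalg.eigvalsh(A)
    w = w[w > 1e-12]                        # drop numerically zero eigenvalues
    return float(-(w * np.log(w)).sum())

def qjsd(A, B):
    """Quantum Jensen-Shannon divergence S((A+B)/2) - (S(A)+S(B))/2."""
    return von_neumann_entropy((A + B) / 2) - 0.5 * (
        von_neumann_entropy(A) + von_neumann_entropy(B))

rng = np.random.default_rng(0)
A, B, C = (random_density_matrix(4, rng) for _ in range(3))
d = lambda X, Y: np.sqrt(qjsd(X, Y))
# Triangle inequality spot-check for the square root, which is the
# metric property established in the paper.
assert d(A, C) <= d(A, B) + d(B, C) + 1e-12
```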



Related Research

F. Hiai, D. Petz (2008)
The Riemannian metric on the manifold of positive definite matrices is defined by a kernel function $\phi$ in the form $K_D^\phi(H,K)=\sum_{i,j}\phi(\lambda_i,\lambda_j)^{-1}\,\mathrm{Tr}\, P_iHP_jK$, where $\sum_i\lambda_iP_i$ is the spectral decomposition of the foot point $D$ and the Hermitian matrices $H,K$ are tangent vectors. For such kernel metrics the tangent space has an orthogonal decomposition. The pull-back of a kernel metric under a mapping $D\mapsto G(D)$ is a kernel metric as well. Several Riemannian geometries in the literature are particular cases, for example, the Fisher-Rao metric for multivariate Gaussian distributions and the quantum Fisher information. In the paper the case $\phi(x,y)=M(x,y)^\theta$ is mostly studied, where $M(x,y)$ is a mean of the positive numbers $x$ and $y$. There are results about the geodesic curves and geodesic distances. The geometric mean, the logarithmic mean and the root mean are important cases.
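The kernel-metric formula above can be evaluated directly from a spectral decomposition. The following is a minimal sketch under our own naming (assuming numpy); the rank-one projector form used here agrees with the spectral-projection form whenever the kernel $\phi$ is symmetric.

```python
import numpy as np

def kernel_metric(D, H, K, phi):
    """K_D^phi(H, K) = sum_{i,j} phi(l_i, l_j)^{-1} Tr(P_i H P_j K),
    computed from the eigendecomposition D = sum_i l_i P_i."""
    lam, U = np.linalg.eigh(D)
    n = len(lam)
    P = [np.outer(U[:, i], U[:, i].conj()) for i in range(n)]
    s = sum(np.trace(P[i] @ H @ P[j] @ K) / phi(lam[i], lam[j])
            for i in range(n) for j in range(n))
    return s.real  # real for Hermitian H, K and a symmetric real kernel phi

rng = np.random.default_rng(1)
X = rng.standard_normal((3, 3)); D = X @ X.T + 3 * np.eye(3)
H = rng.standard_normal((3, 3)); H = (H + H.T) / 2
arith = lambda x, y: (x + y) / 2   # arithmetic-mean kernel
print(kernel_metric(D, H, H, arith))  # squared length of tangent vector H at D
```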
Representations in the form of Symmetric Positive Definite (SPD) matrices have been popularized in a variety of visual learning applications due to their demonstrated ability to capture rich second-order statistics of visual data. There exist several similarity measures for comparing SPD matrices with documented benefits. However, selecting an appropriate measure for a given problem remains a challenge and in most cases, is the result of a trial-and-error process. In this paper, we propose to learn similarity measures in a data-driven manner. To this end, we capitalize on the $\alpha\beta$-log-det divergence, which is a meta-divergence parametrized by scalars $\alpha$ and $\beta$, subsuming a wide family of popular information divergences on SPD matrices for distinct and discrete values of these parameters. Our key idea is to cast these parameters in a continuum and learn them from data. We systematically extend this idea to learn vector-valued parameters, thereby increasing the expressiveness of the underlying non-linear measure. We conjoin the divergence learning problem with several standard tasks in machine learning, including supervised discriminative dictionary learning and unsupervised SPD matrix clustering. We present Riemannian gradient descent schemes for optimizing our formulations efficiently, and show the usefulness of our method on eight standard computer vision tasks.
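As a hedged sketch of the meta-divergence this abstract builds on: one common parameterization of the $\alpha\beta$-log-det divergence (following Cichocki et al.'s log-det divergence family; the exact normalization here is our assumption) reduces to a sum over the generalized eigenvalues $\lambda_i$ of the SPD pair $(P, Q)$, valid for $\alpha, \beta > 0$. Assumes numpy and scipy.

```python
import numpy as np
from scipy.linalg import eigvalsh

def ab_logdet_div(P, Q, alpha, beta):
    """D(P||Q) = (1/(alpha*beta)) * sum_i log((alpha*l_i^beta
    + beta*l_i^(-alpha)) / (alpha+beta)), where l_i are the (positive)
    generalized eigenvalues of (P, Q); assumes alpha, beta > 0."""
    lam = eigvalsh(P, Q)   # eigenvalues of Q^{-1/2} P Q^{-1/2}
    return float(np.sum(np.log((alpha * lam**beta + beta * lam**(-alpha))
                               / (alpha + beta))) / (alpha * beta))
```

Particular parameter choices recover familiar SPD divergences; for instance $\alpha=\beta=1/2$ gives, up to a constant scaling, the symmetric Stein (S-)divergence.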
In this paper, we introduce properly-invariant diagonality measures of Hermitian positive-definite matrices. These diagonality measures are defined as distances or divergences between a given positive-definite matrix and its diagonal part. We then give closed-form expressions of these diagonality measures and discuss their invariance properties. The diagonality measure based on the log-determinant $\alpha$-divergence is general enough as it includes a diagonality criterion used by the signal processing community as a special case. These diagonality measures are then used to formulate minimization problems for finding the approximate joint diagonalizer of a given set of Hermitian positive-definite matrices. Numerical computations based on a modified Newton method are presented and discussed.
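A minimal sketch of one diagonality measure of the kind discussed above: the classical log-det criterion $\log\det(\mathrm{Diag}(C)) - \log\det(C)$, which is nonnegative by Hadamard's inequality and zero iff the Hermitian positive definite $C$ is diagonal. The paper's $\alpha$-divergence family generalizes this special case; the function below is our illustration, assuming numpy.

```python
import numpy as np

def logdet_diagonality(C):
    """log det(Diag(C)) - log det(C); >= 0, and == 0 iff C is diagonal."""
    sign, logdet = np.linalg.slogdet(C)
    return float(np.sum(np.log(np.real(np.diag(C)))) - logdet)
```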
Meng Cao (2020)
Matrix-product codes over finite fields are an important class of long linear codes, obtained by combining several commensurate shorter linear codes via a defining matrix over finite fields. The construction of matrix-product codes with certain self-orthogonality over finite fields is an effective way to obtain good $q$-ary quantum codes of large length. Specifically, it follows from the CSS construction (resp. Hermitian construction) that a matrix-product code over $\mathbb{F}_{q}$ (resp. $\mathbb{F}_{q^{2}}$) which is Euclidean dual-containing (resp. Hermitian dual-containing) can produce a $q$-ary quantum code. In order to obtain such matrix-product codes, a common way is to construct quasi-orthogonal matrices (resp. quasi-unitary matrices) as the defining matrices of matrix-product codes over $\mathbb{F}_{q}$ (resp. $\mathbb{F}_{q^{2}}$). The usage of NSC (non-singular by columns) quasi-orthogonal matrices or NSC quasi-unitary matrices in this process enables the minimum distance lower bound of the corresponding quantum codes to reach its optimum. This article has two purposes: the first is to summarize some results on this topic obtained by the author of this article and his collaborators in \cite{Cao2020Constructioncaowang, Cao2020New, Cao2020Constructionof}; the second is to add some new results on quasi-orthogonal matrices (resp. quasi-unitary matrices), Euclidean dual-containing (resp. Hermitian dual-containing) matrix-product codes, and $q$-ary quantum codes derived from these newly constructed matrix-product codes.
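To make the matrix-product construction $[C_1,\dots,C_k]\cdot A$ concrete: codewords are the blocks $\left(\sum_i a_{i1}c_i \mid \dots \mid \sum_i a_{il}c_i\right)$ with $c_i \in C_i$. With $A = \begin{pmatrix}1&1\\0&1\end{pmatrix}$ this is the classical $(u \mid u+v)$ Plotkin construction. The brute-force enumerator below is our illustration over $\mathbb{F}_2$ (assuming numpy), suitable only for tiny codes.

```python
import numpy as np
from itertools import product

def matrix_product_codewords(gens, A):
    """Enumerate the codewords of [C_1,...,C_k].A over F_2, where each
    C_i is given by a generator matrix in gens (all of the same length n)."""
    msgs = [list(product([0, 1], repeat=G.shape[0])) for G in gens]
    words = set()
    for choice in product(*msgs):
        cs = np.array([np.mod(np.array(m) @ G, 2)       # k codewords, length n
                       for m, G in zip(choice, gens)])
        words.add(tuple(np.mod(A.T @ cs, 2).flatten()))  # blocks sum_i a_ij c_i
    return words

G_par = np.array([[1, 0, 1], [0, 1, 1]])          # [3,2] parity-check code
G_rep = np.array([[1, 1, 1]])                     # [3,1] repetition code
A = np.array([[1, 1], [0, 1]])                    # NSC: Plotkin (u | u+v)
C = matrix_product_codewords([G_par, G_rep], A)   # a [6,3] binary code
print(len(C))                                     # 8 codewords
```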
Tomohiro Nishiyama (2020)
The divergence minimization problem plays an important role in various fields. In this note, we focus on differentiable and strictly convex divergences. For some minimization problems, we give conditions characterizing the minimizer and prove its uniqueness without assuming a specific form of the divergence. Furthermore, we show geometric properties related to these minimization problems.
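A toy instance of this setting (our illustration, assuming numpy): for the squared Euclidean divergence $D(x,y)=\|x-y\|^2$, which is differentiable and strictly convex, the first-order condition $\sum_i 2(x - x_i) = 0$ characterizes the minimizer of the average divergence to a sample, and strict convexity makes it unique: the arithmetic mean.

```python
import numpy as np

def avg_div_minimizer(points):
    """argmin_x sum_i ||x - x_i||^2; unique by strict convexity,
    and equal to the arithmetic mean by the first-order condition."""
    return np.mean(points, axis=0)
```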
