
Towards Almost Global Synchronization on the Stiefel Manifold

 Added by Johan Markdahl
 Publication date 2018
Language: English





A graph $\mathcal{G}$ is referred to as $\mathsf{S}^1$-synchronizing if, roughly speaking, the Kuramoto-like model whose interaction topology is given by $\mathcal{G}$ synchronizes almost globally. The Kuramoto model evolves on the unit circle, i.e., the $1$-sphere $\mathsf{S}^1$. This paper concerns generalizations of the Kuramoto-like model and of the concept of synchronizing graphs to the Stiefel manifold $\mathsf{St}(p,n)$. Previous work on state-space oscillators has largely been influenced by results and techniques that pertain to the $\mathsf{S}^1$-case. It has recently been shown that all connected graphs are $\mathsf{S}^n$-synchronizing for all $n\geq2$. The previous point of departure may thus have been overly conservative. The $n$-sphere is a special case of the Stiefel manifold, namely $\mathsf{St}(1,n+1)$. As such, it is natural to ask to what extent the results on $\mathsf{S}^{n}$ can be extended to the Stiefel manifold. This paper shows that all connected graphs are $\mathsf{St}(p,n)$-synchronizing provided the pair $(p,n)$ satisfies $p\leq \tfrac{2n}{3}-1$.
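As a concrete illustration of the $\mathsf{S}^n$ case discussed in the abstract, the following is a minimal sketch (not the paper's code; graph, step size, and agent count are illustrative assumptions) of a Kuramoto-like consensus flow on the $2$-sphere $\mathsf{St}(1,3)$ over a cycle graph. The dynamics $\dot{x}_i = \sum_j a_{ij}\,(x_j - \langle x_i, x_j\rangle x_i)$ move each state only tangentially, so the sphere constraint is preserved, and for a connected graph and generic initial conditions all agents converge to a common point.

```python
import numpy as np

def simulate(num_agents=6, dim=3, steps=4000, dt=0.01, seed=0):
    """Explicit-Euler integration of the sphere consensus flow
    x_i' = sum_j a_ij (x_j - <x_i, x_j> x_i) on a cycle graph."""
    rng = np.random.default_rng(seed)
    X = rng.standard_normal((num_agents, dim))
    X /= np.linalg.norm(X, axis=1, keepdims=True)   # points on S^{dim-1}
    # adjacency matrix of a cycle graph (connected)
    A = np.zeros((num_agents, num_agents))
    for i in range(num_agents):
        A[i, (i + 1) % num_agents] = A[(i + 1) % num_agents, i] = 1.0
    for _ in range(steps):
        G = A @ X                                    # sum_j a_ij x_j
        rad = np.sum(G * X, axis=1, keepdims=True)   # radial component
        X = X + dt * (G - rad * X)                   # keep tangential part
        X /= np.linalg.norm(X, axis=1, keepdims=True)
    return X

X = simulate()
# synchronization: all pairwise inner products approach 1
print(np.min(X @ X.T))
```

For generic random initial states the minimum pairwise inner product ends up close to $1$, i.e., the agents cluster at a single point of the sphere.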



Related research

The symplectic Stiefel manifold, denoted by $\mathrm{Sp}(2p,2n)$, is the set of linear symplectic maps between the standard symplectic spaces $\mathbb{R}^{2p}$ and $\mathbb{R}^{2n}$. When $p=n$, it reduces to the well-known set of $2n\times 2n$ symplectic matrices. Optimization problems on $\mathrm{Sp}(2p,2n)$ find applications in various areas, such as optics, quantum physics, numerical linear algebra and model order reduction of dynamical systems. The purpose of this paper is to propose and analyze gradient-descent methods on $\mathrm{Sp}(2p,2n)$, where the notion of gradient stems from a Riemannian metric. We consider a novel Riemannian metric on $\mathrm{Sp}(2p,2n)$ akin to the canonical metric of the (standard) Stiefel manifold. In order to perform a feasible step along the antigradient, we develop two types of search strategies: one is based on quasi-geodesic curves, and the other on the symplectic Cayley transform. The resulting optimization algorithms are proved to converge globally to critical points of the objective function. Numerical experiments illustrate the efficiency of the proposed methods.
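One ingredient of the abstract above that admits a compact numerical check is the symplectic Cayley transform: the Cayley transform of a Hamiltonian matrix $H$ (one satisfying $H^\top J + JH = 0$) is symplectic, which is what makes it usable as a feasibility-preserving update. The sketch below is generic, not the paper's algorithm; the matrix size and names are illustrative.

```python
import numpy as np

def J(n):
    """Standard symplectic form on R^{2n}."""
    return np.block([[np.zeros((n, n)), np.eye(n)],
                     [-np.eye(n), np.zeros((n, n))]])

rng = np.random.default_rng(0)
n = 3
S = rng.standard_normal((2 * n, 2 * n))
S = (S + S.T) / 2                    # symmetric
H = J(n) @ S                         # Hamiltonian: H^T J + J H = 0
I = np.eye(2 * n)
# Cayley transform C = (I - H/2)^{-1} (I + H/2)
C = np.linalg.solve(I - 0.5 * H, I + 0.5 * H)

# symplecticity: C^T J C = J, up to floating-point error
print(np.linalg.norm(C.T @ J(n) @ C - J(n)))
```

The printed residual is at round-off level, confirming that the Cayley transform maps the (Hamiltonian) tangent direction back onto the symplectic group.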
Riemannian optimization has drawn a lot of attention due to its wide applications in practice. Riemannian stochastic first-order algorithms have been studied in the literature to solve large-scale machine learning problems over Riemannian manifolds. However, most of the existing Riemannian stochastic algorithms require the objective function to be differentiable, and they do not apply to the case where the objective function is nonsmooth. In this paper, we present two Riemannian stochastic proximal gradient methods for minimizing a nonsmooth function over the Stiefel manifold. The two methods, named R-ProxSGD and R-ProxSPB, are generalizations of proximal SGD and proximal SpiderBoost in the Euclidean setting to the Riemannian setting. An analysis of the incremental first-order oracle (IFO) complexity of the proposed algorithms is provided. Specifically, the R-ProxSPB algorithm finds an $\epsilon$-stationary point with $\mathcal{O}(\epsilon^{-3})$ IFOs in the online case, and $\mathcal{O}(n+\sqrt{n}\epsilon^{-3})$ IFOs in the finite-sum case, with $n$ being the number of summands in the objective. Experimental results on online sparse PCA and robust low-rank matrix completion show that our proposed methods significantly outperform existing methods that use Riemannian subgradient information.
Strictly enforcing orthonormality constraints on parameter matrices has been shown to be advantageous in deep learning. This amounts to Riemannian optimization on the Stiefel manifold, which, however, is computationally expensive. To address this challenge, we present two main contributions: (1) a new efficient retraction map based on an iterative Cayley transform for optimization updates, and (2) an implicit vector transport mechanism based on combining a projection of the momentum with the Cayley transform on the Stiefel manifold. We specify two new optimization algorithms: Cayley SGD with momentum and Cayley ADAM on the Stiefel manifold. Convergence of Cayley SGD is theoretically analyzed. Our experiments on CNN training demonstrate that both algorithms (a) use less running time per iteration relative to existing approaches that enforce orthonormality of CNN parameters, and (b) achieve faster convergence rates than the baseline SGD and ADAM algorithms without compromising the performance of the CNN. Cayley SGD and Cayley ADAM are also shown to reduce the training time for optimizing the unitary transition matrices in RNNs.
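To make the Cayley-based update concrete, here is a minimal sketch of plain Riemannian gradient descent on $\mathsf{St}(p,n)$ with a directly inverted (non-iterative) Cayley retraction in the style of Wen and Yin; the iterative Cayley transform, momentum, and ADAM variants described in the abstract are omitted. The toy objective, step size, and problem data are illustrative assumptions: minimizing $f(X) = -\mathrm{trace}(X^\top M X)$ recovers an orthonormal basis of the top-$p$ eigenspace of $M$.

```python
import numpy as np

def cayley_step(X, G, tau):
    """One Cayley-retraction descent step on St(p, n).
    Y(tau) = (I + tau/2 A)^{-1} (I - tau/2 A) X with the
    skew-symmetric generator A = G X^T - X G^T stays feasible."""
    n = X.shape[0]
    A = G @ X.T - X @ G.T
    I = np.eye(n)
    return np.linalg.solve(I + (tau / 2) * A, (I - (tau / 2) * A) @ X)

n, p, tau = 8, 2, 0.02
M = np.diag(np.arange(1.0, n + 1.0))        # eigenvalues 1, ..., 8
rng = np.random.default_rng(0)
X = np.linalg.qr(rng.standard_normal((n, p)))[0]   # start on St(p, n)

for _ in range(500):
    G = -2.0 * M @ X                        # Euclidean gradient of f
    X = cayley_step(X, G, tau)

print(np.trace(X.T @ M @ X))                # -> close to 7 + 8 = 15
print(np.linalg.norm(X.T @ X - np.eye(p)))  # feasibility residual ~ 0
```

Because the Cayley transform of a skew-symmetric matrix is orthogonal, the iterate stays on the manifold to round-off accuracy without any re-orthonormalization.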
In this paper, we consider the geometric landscape connection of the widely studied manifold and factorization formulations in low-rank positive semidefinite (PSD) and general matrix optimization. We establish an equivalence on the set of first-order stationary points (FOSPs) and second-order stationary points (SOSPs) between the manifold and the factorization formulations. We further give a sandwich inequality on the spectra of the Riemannian and Euclidean Hessians at FOSPs, which can be used to transfer more geometric properties from one formulation to the other. Similarities and differences in the landscape connection between the PSD case and the general case are discussed. To the best of our knowledge, this is the first geometric landscape connection between the manifold and the factorization formulations for handling rank constraints. In general low-rank matrix optimization, the landscape connection of two factorization formulations (unregularized and regularized) is also provided. By applying these geometric landscape connections, we are able to solve unanswered questions in the literature and establish stronger results in applications to the geometric analysis of phase retrieval, well-conditioned low-rank matrix optimization, and the role of regularization in factorization formulations arising from machine learning and signal processing.
This paper introduces a framework for solving time-autonomous nonlinear infinite-horizon optimal control problems, under the assumption that all minimizers satisfy Pontryagin's necessary optimality conditions. In detail, we use methods from the field of symplectic geometry to analyze the eigenvalues of a Koopman operator that lifts Pontryagin's differential equation into a suitably defined infinite-dimensional symplectic space. This has the advantage that methods from the field of spectral analysis can be used to characterize globally optimal control laws. A numerical method for constructing optimal feedback laws for nonlinear systems is then obtained by computing the eigenvalues and eigenvectors of a matrix that is obtained by projecting the Pontryagin-Koopman operator onto a finite-dimensional space. We illustrate the effectiveness of this approach by computing accurate approximations of the optimal nonlinear feedback law for a Van der Pol control system, which cannot be stabilized by a linear control law.