
Optimization on flag manifolds

Posted by Lek-Heng Lim
Publication date: 2019
Language: English





A flag is a sequence of nested subspaces. Flags are ubiquitous in numerical analysis, arising in finite element, multigrid, spectral, and pseudospectral methods for numerical PDE; they arise in the form of Krylov subspaces in matrix computations, and as multiresolution analyses in wavelet constructions. They are common in statistics too: principal component, canonical correlation, and correspondence analyses may all be viewed as methods for extracting flags from a data set. The main goal of this article is to develop the tools needed for optimizing over a set of flags, which is a smooth manifold called the flag manifold and contains the Grassmannian as its simplest special case. We derive closed-form analytic expressions for the various differential geometric objects required by Riemannian optimization algorithms on the flag manifold; introduce systems of extrinsic coordinates that allow us to parameterize points, metrics, tangent spaces, geodesics, distances, parallel transports, gradients, and Hessians in terms of matrices and matrix operations; and thereby formulate steepest descent, conjugate gradient, and Newton algorithms on the flag manifold using only standard numerical linear algebra.
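The program sketched in the abstract (extrinsic matrix coordinates, tangent spaces and gradients expressed through matrix operations, first-order algorithms built from standard linear algebra) can be illustrated with a small example. The sketch below is an assumption-laden illustration, not the paper's closed-form formulas: it represents a flag of signature (n_1, ..., n_k) in R^n by an n x p orthonormal matrix Y whose column blocks have sizes n_1, ..., n_k, projects a Euclidean gradient onto a horizontal tangent representative by removing block-diagonal rotations, and uses a QR retraction in place of the exact geodesics derived in the paper. The weighted-trace cost, step size, and iteration count are illustrative choices only.

```python
# Minimal sketch of Riemannian steepest descent on a flag manifold,
# with the flag represented by an orthonormal matrix Y (assumptions noted above).
import numpy as np

def tangent_project(Y, G, blocks):
    """Project a Euclidean gradient G onto a horizontal tangent representative at Y.
    First remove the normal component as on the Stiefel manifold, then zero the
    block-diagonal part of Y^T Delta, which corresponds to rotations inside each
    subspace of the flag and hence does not move the flag."""
    sym = (Y.T @ G + G.T @ Y) / 2
    Delta = G - Y @ sym                      # Stiefel tangent projection
    A = Y.T @ Delta                          # skew part seen in the frame Y
    start = 0
    for b in blocks:                         # kill intra-block rotations
        A[start:start + b, start:start + b] = 0.0
        start += b
    return Y @ A + Delta - Y @ (Y.T @ Delta)

def retract(Y, Delta):
    """QR-based retraction back to an orthonormal matrix (an assumption;
    the paper derives exact geodesics instead)."""
    Q, R = np.linalg.qr(Y + Delta)
    return Q * np.sign(np.diag(R))           # fix column signs

def steepest_descent(grad, Y0, blocks, step=1e-3, iters=2000):
    """Riemannian steepest descent with a fixed step size (an assumption)."""
    Y = Y0
    for _ in range(iters):
        Y = retract(Y, -step * tangent_project(Y, grad(Y), blocks))
    return Y

if __name__ == "__main__":
    # Toy problem: maximize the weighted trace tr(Y^T S Y D) over the flag of
    # signature (2, 3) in R^10, i.e. nested subspaces of dimensions 2 and 5.
    rng = np.random.default_rng(0)
    M = rng.standard_normal((10, 10))
    S = M @ M.T
    blocks = [2, 3]
    D = np.diag([2.0, 2.0, 1.0, 1.0, 1.0])    # distinct weights per block
    Y0, _ = np.linalg.qr(rng.standard_normal((10, sum(blocks))))
    grad = lambda Y: -2 * S @ Y @ D           # Euclidean gradient of -tr(Y^T S Y D)
    Y = steepest_descent(grad, Y0, blocks)
    print("final weighted trace:", np.trace(Y.T @ S @ Y @ D))
```

The weights in D are what make the toy objective depend on the flag rather than only on the span of Y; with equal weights the problem would collapse to a Grassmannian one.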




Read also

Andrea Altomani, 2009
We investigate the $CR$ geometry of the orbits $M$ of a real form $G_0$ of a complex simple group $G$ in a complex flag manifold $X=G/Q$. We are mainly concerned with finite type, Levi non-degeneracy conditions, canonical $G_0$-equivariant and Mostow fibrations, and topological properties of the orbits.
We study, from the point of view of CR geometry, the orbits M of a real form G_0 of a complex semisimple Lie group G in a complex flag manifold G/Q. In particular we characterize those that are of finite type and satisfy some Levi nondegeneracy conditions. These properties are also graphically described by attaching to them some cross-marked diagrams that generalize those for minimal orbits that we introduced in a previous paper. By constructing canonical fibrations over real flag manifolds, with simply connected complex fibers, we are also able to compute their fundamental group.
We develop a new Riemannian descent algorithm that relies on momentum to improve over existing first-order methods for geodesically convex optimization. In contrast, accelerated convergence rates proved in prior work have only been shown to hold for geodesically strongly-convex objective functions. We further extend our algorithm to geodesically weakly-quasi-convex objectives. Our proofs of convergence rely on a novel estimate sequence that illustrates the dependency of the convergence rate on the curvature of the manifold. We validate our theoretical results empirically on several optimization problems defined on the sphere and on the manifold of positive definite matrices.
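To illustrate how a momentum term is carried along a manifold, here is a minimal sketch of Riemannian gradient descent with momentum on the unit sphere. It is not the accelerated method of the paper above: it uses a fixed step size and momentum coefficient, a projection-based vector transport rather than exact parallel transport, and a leading-eigenvector toy problem chosen purely for demonstration.

```python
# Minimal sketch: Riemannian gradient descent with momentum on the unit sphere
# (assumptions noted above).
import numpy as np

def proj(x, v):
    """Project v onto the tangent space of the unit sphere at x."""
    return v - np.dot(x, v) * x

def exp_map(x, v):
    """Exponential map on the unit sphere: follow the great circle from x along v."""
    nv = np.linalg.norm(v)
    if nv < 1e-16:
        return x
    return np.cos(nv) * x + np.sin(nv) * v / nv

def momentum_descent(egrad, x0, step=0.02, beta=0.9, iters=2000):
    """Gradient descent with a momentum term accumulated in the tangent space
    and moved to each new point by projection (a vector transport)."""
    x = x0 / np.linalg.norm(x0)
    m = np.zeros_like(x)
    for _ in range(iters):
        m = beta * m + proj(x, egrad(x))   # accumulate momentum
        x_new = exp_map(x, -step * m)
        m = proj(x_new, m)                 # transport momentum to x_new
        x = x_new
    return x

if __name__ == "__main__":
    # Toy problem: leading eigenvector of a symmetric matrix, obtained by
    # minimizing -x^T A x over the unit sphere.
    rng = np.random.default_rng(1)
    B = rng.standard_normal((50, 50))
    A = (B + B.T) / np.sqrt(50)
    x = momentum_descent(lambda x: -2.0 * A @ x, rng.standard_normal(50))
    print("Rayleigh quotient:", x @ A @ x)
    print("largest eigenvalue:", np.linalg.eigvalsh(A)[-1])
```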
We consider optimization problems on Riemannian manifolds with equality and inequality constraints, which we call Riemannian nonlinear optimization (RNLO) problems. Although they have numerous applications, the existing studies on them are limited, especially in terms of algorithms. In this paper, we propose Riemannian sequential quadratic optimization (RSQO) that uses a line-search technique with an ell_1 penalty function as an extension of the standard SQO algorithm for constrained nonlinear optimization problems in Euclidean spaces to Riemannian manifolds. We prove its global convergence to a Karush-Kuhn-Tucker point of the RNLO problem by means of parallel transport and the exponential mapping. Furthermore, we establish its local quadratic convergence by analyzing the relationship between sequences generated by RSQO and the Riemannian Newton method. Ours is the first algorithm that has both global and local convergence properties for constrained nonlinear optimization on Riemannian manifolds. Empirical results show that RSQO finds solutions more stably and with higher accuracy compared with the existing Riemannian penalty and augmented Lagrangian methods.
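A small sketch of one ingredient named in the abstract above, the line search governed by an ell_1 penalty (merit) function, may help fix ideas. This is not RSQO itself: the penalty weight, backtracking rule, sphere retraction, and the naive search direction in the toy call are all assumptions standing in for the SQO machinery.

```python
# Minimal sketch: ell_1 merit function and backtracking line search along a
# retraction (assumptions noted above).
import numpy as np

def l1_merit(f, eqs, ineqs, x, rho):
    """ell_1 penalty (merit) function: f(x) + rho * total constraint violation,
    with equalities h_i(x) = 0 and inequalities g_j(x) <= 0."""
    viol = sum(abs(h(x)) for h in eqs) + sum(max(0.0, g(x)) for g in ineqs)
    return f(x) + rho * viol

def backtracking(f, eqs, ineqs, x, direction, retract, rho=10.0,
                 alpha=1.0, shrink=0.5, max_tries=30):
    """Shrink the step along the retraction until the merit function decreases."""
    phi0 = l1_merit(f, eqs, ineqs, x, rho)
    for _ in range(max_tries):
        x_new = retract(x, alpha * direction)
        if l1_merit(f, eqs, ineqs, x_new, rho) < phi0:
            return x_new, alpha
        alpha *= shrink
    return x, 0.0

if __name__ == "__main__":
    # Toy call on the unit sphere: objective x[0], one equality constraint
    # x[2] = 0, and the projected negative objective gradient as a (naive)
    # search direction standing in for the SQO step.
    retract = lambda x, v: (x + v) / np.linalg.norm(x + v)
    x = np.array([0.6, 0.8, 0.0])
    g = np.array([1.0, 0.0, 0.0])
    direction = -(g - np.dot(x, g) * x)        # Riemannian gradient of x[0]
    x_new, alpha = backtracking(lambda y: y[0], [lambda y: y[2]], [],
                                x, direction, retract)
    print("accepted step length:", alpha, "new point:", x_new)
```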
Melanie Weber, Suvrit Sra, 2019
We study stochastic projection-free methods for constrained optimization of smooth functions on Riemannian manifolds, i.e., with additional constraints beyond the parameter domain being a manifold. Specifically, we introduce stochastic Riemannian Frank-Wolfe methods for nonconvex and geodesically convex problems. We present algorithms for both purely stochastic optimization and finite-sum problems. For the latter, we develop variance-reduced methods, including a Riemannian adaptation of the recently proposed Spider technique. For all settings, we recover convergence rates that are comparable to the best-known rates for their Euclidean counterparts. Finally, we discuss applications to two classic tasks: the computation of the Karcher mean of positive definite matrices and Wasserstein barycenters for multivariate normal distributions. For both tasks, stochastic FW methods yield state-of-the-art empirical performance.
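One of the two applications named above, the Karcher mean of positive definite matrices, can be sketched with plain (non-stochastic) Riemannian gradient descent under the affine-invariant metric. This is not the stochastic Frank-Wolfe method of the paper; the step size, iteration count, and toy data are assumptions.

```python
# Minimal sketch: Karcher mean of SPD matrices by Riemannian gradient descent
# under the affine-invariant metric (assumptions noted above).
import numpy as np

def sym_fun(X, f):
    """Apply a scalar function to a symmetric matrix via its eigendecomposition."""
    w, V = np.linalg.eigh(X)
    return (V * f(w)) @ V.T

def karcher_mean(mats, step=1.0, iters=50):
    """Gradient descent for the Karcher mean, started at the arithmetic mean."""
    X = sum(mats) / len(mats)
    for _ in range(iters):
        Xh = sym_fun(X, np.sqrt)                      # X^{1/2}
        Xih = sym_fun(X, lambda w: 1.0 / np.sqrt(w))  # X^{-1/2}
        # Mean of the logarithmic maps Log_X(A_i) in whitened coordinates
        # X^{-1/2} A X^{-1/2}; it vanishes at the Karcher mean.
        T = sum(sym_fun(Xih @ A @ Xih, np.log) for A in mats) / len(mats)
        X = Xh @ sym_fun(T, lambda w: np.exp(step * w)) @ Xh  # exponential map
    return X

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    mats = []
    for _ in range(5):
        B = rng.standard_normal((4, 4))
        mats.append(B @ B.T + 4.0 * np.eye(4))
    G = karcher_mean(mats)
    # Check: the mean of the matrix logs should (nearly) vanish at the mean.
    Gih = sym_fun(G, lambda w: 1.0 / np.sqrt(w))
    resid = sum(sym_fun(Gih @ A @ Gih, np.log) for A in mats) / len(mats)
    print("Riemannian gradient norm at the computed mean:", np.linalg.norm(resid))
```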