We study stochastic projection-free methods for constrained optimization of smooth functions on Riemannian manifolds, i.e., with additional constraints beyond the requirement that the parameters lie on a manifold. Specifically, we introduce stochastic Riemannian Frank-Wolfe methods for nonconvex and geodesically convex problems. We present algorithms for both purely stochastic optimization and finite-sum problems. For the latter, we develop variance-reduced methods, including a Riemannian adaptation of the recently proposed SPIDER technique. In all settings, we recover convergence rates that are comparable to the best-known rates for their Euclidean counterparts. Finally, we discuss applications to two classic tasks: the computation of the Karcher mean of positive definite matrices and of Wasserstein barycenters for multivariate normal distributions. For both tasks, stochastic Frank-Wolfe methods yield state-of-the-art empirical performance.
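As a rough illustration of the kind of update involved, the sketch below performs a single stochastic Riemannian Frank-Wolfe step for the Karcher-mean objective on the SPD manifold with the affine-invariant metric. It is a schematic, not the paper's exact algorithm: the linear minimization oracle `lmo` is left to the caller (the paper derives a closed-form oracle for interval constraints), and the `dummy_lmo` in the usage snippet is a placeholder purely so the example runs end to end.

```python
import numpy as np
from scipy.linalg import sqrtm, logm, fractional_matrix_power

def spd_log(X, A):
    """Riemannian logarithm Log_X(A) on the SPD manifold (affine-invariant metric)."""
    Xh = np.real(sqrtm(X))
    Xih = np.linalg.inv(Xh)
    return Xh @ np.real(logm(Xih @ A @ Xih)) @ Xh

def spd_geodesic(X, Z, t):
    """Point at time t on the geodesic from X to Z under the affine-invariant metric."""
    Xh = np.real(sqrtm(X))
    Xih = np.linalg.inv(Xh)
    return Xh @ np.real(fractional_matrix_power(Xih @ Z @ Xih, t)) @ Xh

def stochastic_rfw_step(X, batch, lmo, step):
    """One stochastic Riemannian Frank-Wolfe step for f(X) = (1/n) sum_i d^2(X, A_i).

    `lmo(X, grad)` should return the minimizer of <grad, .>_X over the constraint set.
    """
    # Minibatch estimate of the Riemannian gradient: -(1/|B|) sum_{A in B} Log_X(A).
    grad = -sum(spd_log(X, A) for A in batch) / len(batch)
    Z = lmo(X, grad)                  # projection-free direction from the oracle
    return spd_geodesic(X, Z, step)   # walk a fraction of the geodesic towards Z

# Illustrative usage with a placeholder oracle (NOT the paper's closed-form LMO).
rng = np.random.default_rng(0)
mats = []
for _ in range(20):
    B = rng.standard_normal((3, 3))
    mats.append(np.eye(3) + 0.1 * B @ B.T)
dummy_lmo = lambda X, G: sum(mats) / len(mats)   # placeholder: arithmetic mean
X = np.eye(3)
for k in range(50):
    batch = [mats[i] for i in rng.choice(len(mats), size=5, replace=False)]
    X = stochastic_rfw_step(X, batch, dummy_lmo, step=2.0 / (k + 2))
```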
In this paper we develop the theory of parametric polynomial regression on Riemannian manifolds and Lie groups. We demonstrate the application of Riemannian polynomial regression to shape analysis in Kendall shape space. Results are presented showing the power of polynomial regression on the classic rat skull growth data of Bookstein, as well as an analysis of the shape changes associated with aging of the corpus callosum from the OASIS Alzheimer's study.
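For orientation, the model can be stated schematically as follows (notation ours, intended only as a sketch of the setup): a curve $\gamma$ on a manifold $M$ is a Riemannian polynomial of order $k$ if the $k$-th covariant derivative of its velocity vanishes, with geodesics recovered for $k = 1$, and regression fits such a curve to observations $(t_i, y_i)$ by least squares in the geodesic distance:
\[
  \underbrace{\nabla_{\dot\gamma} \cdots \nabla_{\dot\gamma}}_{k}\, \dot\gamma(t) = 0,
  \qquad
  E(\gamma) = \sum_{i=1}^{N} d_M\bigl(\gamma(t_i), y_i\bigr)^{2} \;\to\; \min .
\]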
We study the propagator of the wave equation on a closed Riemannian manifold $M$. We propose a geometric approach to the construction of the propagator as a single oscillatory integral global both in space and in time with a distinguished complex-valued phase function. This enables us to provide a global invariant definition of the full symbol of the propagator, a scalar function on the cotangent bundle, and an algorithm for the explicit calculation of its homogeneous components. The central part of the paper is devoted to the detailed analysis of the subprincipal symbol; in particular, we derive its explicit small time asymptotic expansion. We present a general geometric construction that allows one to visualise topological obstructions and describe their circumvention with the use of a complex-valued phase function. We illustrate the general framework with explicit examples in dimension two.
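Schematically, and in the standard notation for such constructions rather than necessarily the paper's, the propagator is sought as a single global oscillatory integral
\[
  U(t)\,u(x) \;\sim\; \frac{1}{(2\pi)^n} \int_{T^*M} e^{\,i\varphi(t,x;y,\eta)}\, a(t,x;y,\eta)\, u(y)\, \mathrm{d}y\, \mathrm{d}\eta ,
\]
with a complex-valued phase function $\varphi$ satisfying $\operatorname{Im}\varphi \ge 0$ and an amplitude $a$ from which the full symbol, and in particular the subprincipal symbol, is read off.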
We develop a new Riemannian descent algorithm that relies on momentum to improve over existing first-order methods for geodesically convex optimization. In contrast, accelerated convergence rates proved in prior work have only been shown to hold for geodesically strongly convex objective functions. We further extend our algorithm to geodesically weakly quasi-convex objectives. Our proofs of convergence rely on a novel estimate sequence that illustrates the dependence of the convergence rate on the curvature of the manifold. We validate our theoretical results empirically on several optimization problems defined on the sphere and on the manifold of positive definite matrices.
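To make the flavour of such methods concrete, the sketch below runs a generic Riemannian heavy-ball iteration on the unit sphere (exponential map, tangent projection, and parallel transport), applied to the leading-eigenvector problem $f(x) = -\tfrac{1}{2} x^\top A x$. This is a simple momentum scheme for illustration only, not the estimate-sequence-based algorithm analysed in the paper; step sizes and the test problem are our own choices.

```python
import numpy as np

def sphere_exp(x, v):
    """Exponential map on the unit sphere: follow the geodesic from x in direction v."""
    nrm = np.linalg.norm(v)
    if nrm < 1e-12:
        return x
    return np.cos(nrm) * x + np.sin(nrm) * (v / nrm)

def sphere_transport(x, y, v):
    """Parallel transport of a tangent vector v from x to y along the minimizing geodesic."""
    return v - (np.dot(y, v) / (1.0 + np.dot(x, y))) * (x + y)

def riemannian_grad(A, x):
    """Riemannian gradient of f(x) = -0.5 x^T A x: Euclidean gradient projected to T_x S."""
    g = -A @ x
    return g - np.dot(g, x) * x

def heavy_ball_sphere(A, x0, steps=300, eta=0.2, beta=0.8):
    """Schematic Riemannian gradient descent with momentum (heavy ball) on the sphere."""
    x, v = x0, np.zeros_like(x0)
    for _ in range(steps):
        v = beta * v - eta * riemannian_grad(A, x)
        x_new = sphere_exp(x, v)
        v = sphere_transport(x, x_new, v)   # carry the momentum to the new tangent space
        x = x_new
    return x

rng = np.random.default_rng(1)
M = rng.standard_normal((10, 10))
A = M @ M.T
A = A / np.linalg.norm(A, 2)               # normalize so the gradient is well scaled
x0 = rng.standard_normal(10)
x0 /= np.linalg.norm(x0)
x = heavy_ball_sphere(A, x0)
# x now approximates the leading eigenvector of A (up to sign)
```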
We study numerical optimisation algorithms that use zeroth-order information to minimise time-varying geodesically convex cost functions on Riemannian manifolds. In the Euclidean setting, zeroth-order algorithms have received a lot of attention in both the time-varying and time-invariant case. However, the extension to Riemannian manifolds is much less developed. We focus on Hadamard manifolds, a special class of Riemannian manifolds with global nonpositive curvature that offer convenient grounds for the generalisation of convexity notions. Specifically, we derive bounds on the expected instantaneous tracking error, and we provide algorithm parameter values that minimise this bound. Our results illustrate how the manifold geometry, in particular the sectional curvature, affects these bounds. Additionally, we provide dynamic regret bounds for this online optimisation setting. To the best of our knowledge, these are the first bounds on tracking error and dynamic regret for online zeroth-order Riemannian optimisation. Lastly, via numerical simulations, we demonstrate the applicability of our algorithm to an online Karcher mean problem.
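As a rough sketch of the kind of update involved (our simplification, not the exact estimator or parameter choices analysed in the paper), the snippet below forms a two-point zeroth-order estimate of the Riemannian gradient on the SPD manifold, a Hadamard manifold, using only cost evaluations along a random geodesic direction, and then takes an exponential-map step; a Karcher-mean cost serves as the running example, and the dimension scaling and direction sampling are our own assumptions.

```python
import numpy as np
from scipy.linalg import sqrtm, logm, expm

def spd_exp(X, V):
    """Exponential map on the SPD manifold (affine-invariant metric)."""
    Xh = np.real(sqrtm(X))
    Xih = np.linalg.inv(Xh)
    return Xh @ np.real(expm(Xih @ V @ Xih)) @ Xh

def spd_dist2(X, A):
    """Squared affine-invariant distance d^2(X, A)."""
    Xih = np.linalg.inv(np.real(sqrtm(X)))
    L = np.real(logm(Xih @ A @ Xih))
    return np.sum(L * L)

def random_unit_tangent(X, rng):
    """Random symmetric tangent direction with unit affine-invariant norm at X."""
    d = X.shape[0]
    S = rng.standard_normal((d, d))
    S = (S + S.T) / 2.0
    Xih = np.linalg.inv(np.real(sqrtm(X)))
    return S / np.linalg.norm(Xih @ S @ Xih)

def zo_riemannian_step(X, cost, mu, eta, rng):
    """One zeroth-order Riemannian step: estimate the gradient from two cost
    evaluations along a random geodesic direction, then move via the exp map."""
    D = X.shape[0] * (X.shape[0] + 1) // 2          # dimension of SPD(d)
    U = random_unit_tangent(X, rng)
    g = D * (cost(spd_exp(X, mu * U)) - cost(X)) / mu
    return spd_exp(X, -eta * g * U)

# Illustrative usage: estimate the Karcher mean of a small batch
# (in the time-varying case the matrices As would be updated between steps).
rng = np.random.default_rng(2)
As = []
for _ in range(5):
    B = rng.standard_normal((3, 3))
    As.append(np.eye(3) + 0.2 * B @ B.T)
cost = lambda X: sum(spd_dist2(X, A) for A in As) / len(As)
X = np.eye(3)
for _ in range(200):
    X = zo_riemannian_step(X, cost, mu=1e-3, eta=0.05, rng=rng)
```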