
On the regularity of abnormal minimizers for rank $2$ sub-Riemannian structures

Added by Davide Barilari
Publication date: 2018
Language: English





We prove the $C^{1}$ regularity for a class of abnormal length-minimizers in rank $2$ sub-Riemannian structures. As a consequence of our result, all length-minimizers for rank $2$ sub-Riemannian structures of step up to $4$ are of class $C^{1}$.
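For readers outside the area, the objects in this statement can be recalled in standard notation (this is textbook background, not notation taken from the paper): a rank $2$ sub-Riemannian structure is given by a distribution $\Delta = \mathrm{span}\{X_1, X_2\} \subset TM$ spanned by two vector fields, curves are required to be horizontal, and length is minimized among such curves:

\[
  \dot\gamma(t) = u_1(t)\,X_1(\gamma(t)) + u_2(t)\,X_2(\gamma(t)),
  \qquad
  \ell(\gamma) = \int_0^T \sqrt{u_1(t)^2 + u_2(t)^2}\,\mathrm{d}t .
\]

The step is the smallest integer $s$ such that iterated Lie brackets of $X_1, X_2$ of length at most $s$ span the tangent space at every point, and abnormal minimizers are the length-minimizers arising as critical points of the endpoint map, for which the usual Hamiltonian regularity theory does not apply.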



Related research

A helical CR structure is a decomposition of a real Euclidean space into an even-dimensional horizontal subspace and its orthogonal vertical complement, together with an almost complex structure on the horizontal space and a marked vector in the vertical space. We prove an equivalence between such structures and step two Carnot groups equipped with a distinguished normal geodesic, and also between such structures and smooth real curves whose derivatives have constant Euclidean norm. As a consequence, we relate step two Carnot groups equipped with sub-Riemannian geodesics with this family of curves. The restriction to the unit circle of certain planar homogeneous polynomial mappings gives an instructive class of examples. We describe these examples in detail.
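Written out, the decomposition in the first sentence takes the following form (the symbols $H$, $V$, $J$, $v_0$ are our own shorthand, not the paper's notation):

\[
  \mathbb{R}^n = H \oplus V, \qquad H \perp V, \qquad \dim H = 2k, \qquad
  J : H \to H, \quad J^2 = -\mathrm{id}_H, \qquad v_0 \in V,
\]

and, as we read the abstract, the curves appearing in the equivalence are the smooth curves $\gamma$ whose derivatives $\gamma^{(j)}$ each have Euclidean norm independent of the parameter.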
Bin Gao, P.-A. Absil (2021)
The low-rank matrix completion problem can be solved by Riemannian optimization on a fixed-rank manifold. However, a drawback of the known approaches is that the rank parameter has to be fixed a priori. In this paper, we consider the optimization problem on the set of bounded-rank matrices. We propose a Riemannian rank-adaptive method, which consists of a fixed-rank optimization step, a rank-increase step, and a rank-reduction step. We explore its performance on the low-rank matrix completion problem. Numerical experiments on synthetic and real-world datasets illustrate that the proposed rank-adaptive method compares favorably with state-of-the-art algorithms. In addition, they show that each aspect of this rank-adaptive framework can be incorporated separately into existing algorithms to improve performance.
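To make the three-step structure concrete, here is a minimal Python sketch of a rank-adaptive completion loop. It illustrates the general pattern only and is not the authors' algorithm: the gradient step, the retraction by truncated SVD, and the two rank-change heuristics (with their thresholds) are simplified placeholders.

import numpy as np

def svd_truncate(X, r):
    # Retraction onto the set of rank-<=r matrices via truncated SVD.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r], s[:r]

def rank_adaptive_completion(M_obs, mask, r=1, r_max=10, steps=200,
                             stall=1e-3, drop=1e-8):
    # M_obs: matrix with observed entries (zeros elsewhere);
    # mask: boolean array of observed positions; r: current working rank.
    X = np.zeros_like(M_obs)
    prev = np.inf
    for _ in range(steps):
        # Fixed-rank step: gradient of 0.5*||mask*(X - M)||^2, then retract.
        G = (X - M_obs) * mask
        X, s = svd_truncate(X - G, r)
        res = np.linalg.norm(G)
        # Rank increase (heuristic): progress has stalled, budget remains.
        if prev - res < stall * prev and r < r_max:
            r += 1
        # Rank reduction (heuristic): trailing singular value is negligible.
        elif len(s) > 1 and s[-1] < drop * s[0]:
            r -= 1
        prev = res
    return X

# Example usage: synthetic rank-5 matrix, half of the entries observed.
rng = np.random.default_rng(0)
M = rng.standard_normal((60, 5)) @ rng.standard_normal((5, 60))
mask = rng.random(M.shape) < 0.5
X = rank_adaptive_completion(M * mask, mask, r=1, r_max=8)
print("relative error:", np.linalg.norm(X - M) / np.linalg.norm(M))

Starting from r=1 and letting the heuristics grow the rank toward the true value is the point of the exercise: no rank needs to be fixed a priori.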
In this article we study the validity of the Whitney $C^1$ extension property for horizontal curves in sub-Riemannian manifolds endowed with 1-jets that satisfy a first-order Taylor expansion compatibility condition. We first consider the equiregular case, where we show that the extension property holds true whenever a suitable non-singularity property holds for the input-output maps on the Carnot groups obtained by nilpotent approximation. We then discuss the case of sub-Riemannian manifolds with singular points and we show that all step-2 manifolds satisfy the $C^1$ extension property. We conclude by showing that the $C^1$ extension property implies a Lusin-like approximation theorem for horizontal curves on sub-Riemannian manifolds.
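For comparison, the classical Whitney $C^1$ condition that this compatibility condition adapts reads as follows for curves in $\mathbb{R}^n$ (this is the Euclidean model, not the sub-Riemannian condition of the paper): given a compact set $K \subset \mathbb{R}$ and a 1-jet $(f, v)$ on $K$, a $C^1$ extension exists if and only if

\[
  f(t') = f(t) + (t' - t)\,v(t) + o(|t' - t|)
  \qquad \text{uniformly for } t, t' \in K .
\]

The horizontal version additionally asks the extension to be a horizontal curve, i.e. to have derivative in the distribution, which is what makes the problem nontrivial near singular points.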
We prove that sub-Riemannian manifolds are infinitesimally Hilbertian (i.e., the associated Sobolev space is Hilbert) when equipped with an arbitrary Radon measure. The result follows from an embedding of metric derivations into the space of square-integrable sections of the horizontal bundle, which we obtain on all weighted sub-Finsler manifolds. As an intermediate tool, of independent interest, we show that any sub-Finsler distance can be monotonically approximated from below by Finsler ones. All the results are obtained in the general setting of possibly rank-varying structures.
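To unpack the parenthetical: a metric measure space is infinitesimally Hilbertian when the Sobolev space $W^{1,2}$ is a Hilbert space, which is equivalent to the Cheeger energy $\mathrm{Ch}$ satisfying the parallelogram identity (standard terminology, not notation from the paper):

\[
  \mathrm{Ch}(f+g) + \mathrm{Ch}(f-g) = 2\,\mathrm{Ch}(f) + 2\,\mathrm{Ch}(g)
  \qquad \text{for all } f, g \in W^{1,2}(M, \mathsf{d}, \mu).
\]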
In scientific computing and machine learning applications, matrices and more general multidimensional arrays (tensors) can often be approximated with the help of low-rank decompositions. Since matrices and tensors of fixed rank form smooth Riemannian manifolds, one popular tool for finding low-rank approximations is Riemannian optimization. Nevertheless, the efficient implementation of Riemannian gradients and Hessians, required by Riemannian optimization algorithms, can be a nontrivial task in practice. Moreover, in some cases, analytic formulas are not even available. In this paper, we build upon automatic differentiation and propose a method that, given an implementation of the function to be minimized, efficiently computes Riemannian gradients and matrix-vector products between an approximate Riemannian Hessian and a given vector.
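For embedded manifolds such as the fixed-rank matrices, one standard way to realize this idea is to project autodiff outputs onto tangent spaces. The sketch below does this in Python with JAX; it illustrates the general technique, not the paper's method or API, and all function names are our own.

import jax
import jax.numpy as jnp

def tangent_project(X, G, r):
    # Orthogonal projection of an ambient matrix G onto the tangent space
    # of the rank-r manifold at X, using orthonormal factors of X.
    U, _, Vt = jnp.linalg.svd(X, full_matrices=False)
    U, V = U[:, :r], Vt[:r].T
    return U @ (U.T @ G) + (G @ V) @ V.T - U @ (U.T @ G @ V) @ V.T

def riemannian_grad(f, X, r):
    # Riemannian gradient = tangent projection of the Euclidean autodiff
    # gradient (valid for the metric induced by the Frobenius embedding).
    return tangent_project(X, jax.grad(f)(X), r)

def riemannian_hvp(f, X, xi, r):
    # Hessian-vector product: differentiate the projected-gradient field
    # along a tangent direction xi with jax.jvp, then project back.
    # (Requires distinct singular values for the SVD to be differentiable.)
    _, dG = jax.jvp(lambda Y: riemannian_grad(f, Y, r), (X,), (xi,))
    return tangent_project(X, dG, r)

# Example: least-squares loss toward a fixed target, evaluated at a
# rank-3 point obtained by truncating the target itself.
key = jax.random.PRNGKey(0)
M = jax.random.normal(key, (20, 20))
f = lambda X: 0.5 * jnp.sum((X - M) ** 2)
r = 3
U, s, Vt = jnp.linalg.svd(M, full_matrices=False)
X0 = (U[:, :r] * s[:r]) @ Vt[:r]
g = riemannian_grad(f, X0, r)
h = riemannian_hvp(f, X0, g, r)  # g is tangent at X0, so this is valid

Nothing here depends on analytic formulas for the gradient or Hessian: only an implementation of f is needed, which is the point the abstract makes.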