On the regularity of abnormal minimizers for rank $2$ sub-Riemannian structures

Published by: Davide Barilari
Publication date: 2018
Language: English

We prove the $C^{1}$ regularity for a class of abnormal length-minimizers in rank $2$ sub-Riemannian structures. As a consequence of our result, all length-minimizers for rank $2$ sub-Riemannian structures of step up to $4$ are of class $C^{1}$.
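
For orientation, the standard definitions behind the statement (textbook material, not specific to this paper) can be recalled as follows. A rank $2$ sub-Riemannian structure on a manifold $M$ is given by a bracket-generating distribution spanned by two vector fields $X_1, X_2$; a horizontal curve $\gamma$ is an abnormal extremal if it admits a nonvanishing lift $\lambda(t) \in T^{*}_{\gamma(t)}M$ annihilating the distribution along the curve,

$$\langle \lambda(t), X_1(\gamma(t))\rangle = \langle \lambda(t), X_2(\gamma(t))\rangle = 0 \quad \text{for all } t.$$

The step of the structure is the smallest $k$ such that iterated Lie brackets of $X_1, X_2$ of length at most $k$ span every tangent space, so "step up to $4$" means this condition holds with $k \le 4$.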



Read also

A helical CR structure is a decomposition of a real Euclidean space into an even-dimensional horizontal subspace and its orthogonal vertical complement, together with an almost complex structure on the horizontal space and a marked vector in the vertical space. We prove an equivalence between such structures and step two Carnot groups equipped with a distinguished normal geodesic, and also between such structures and smooth real curves whose derivatives have constant Euclidean norm. As a consequence, we relate step two Carnot groups equipped with sub-Riemannian geodesics with this family of curves. The restriction to the unit circle of certain planar homogeneous polynomial mappings gives an instructive class of examples. We describe these examples in detail.
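
To illustrate the curve condition, here is a simple instance of a smooth curve all of whose derivatives have constant Euclidean norm (an example chosen for illustration, not necessarily one treated in the article): take

$$\gamma(t) = (\cos t, \sin t, \cos 2t, \sin 2t) \in \mathbb{R}^{4},$$

for which $|\gamma'(t)|^{2} = 1 + 4 = 5$ and $|\gamma''(t)|^{2} = 1 + 16 = 17$ for every $t$, and likewise every higher derivative has constant norm. Identifying $\mathbb{R}^{2} \cong \mathbb{C}$, this $\gamma$ is the restriction to the unit circle of the homogeneous polynomial maps $z \mapsto z$ and $z \mapsto z^{2}$, in the spirit of the examples mentioned above.
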
Bin Gao, P.-A. Absil (2021)
The low-rank matrix completion problem can be solved by Riemannian optimization on a fixed-rank manifold. However, a drawback of the known approaches is that the rank parameter has to be fixed a priori. In this paper, we consider the optimization problem on the set of bounded-rank matrices. We propose a Riemannian rank-adaptive method, which consists of fixed-rank optimization, a rank-increase step, and a rank-reduction step. We explore its performance on the low-rank matrix completion problem. Numerical experiments on synthetic and real-world datasets illustrate that the proposed rank-adaptive method compares favorably with state-of-the-art algorithms. In addition, they show that each aspect of this rank-adaptive framework can be incorporated separately into existing algorithms to improve performance.
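
The overall loop is easy to sketch. The following is a schematic stand-in for the idea, not the authors' algorithm: it replaces Riemannian retractions and tangent-space steps with plain SVD truncation, and all names and parameters (truncate, rank_adaptive_complete, r_max, inner_iters, tol) are invented here for illustration.

    import jax.numpy as jnp

    def truncate(X, r):
        # Rank-r truncation via SVD, standing in for a fixed-rank retraction.
        U, s, Vt = jnp.linalg.svd(X, full_matrices=False)
        return (U[:, :r] * s[:r]) @ Vt[:r]

    def rank_adaptive_complete(M_obs, mask, r_max=10, inner_iters=100, tol=1e-8):
        X = jnp.zeros_like(M_obs)
        for r in range(1, r_max + 1):          # implicit rank-increase step
            for _ in range(inner_iters):       # fixed-rank phase at rank r
                X = jnp.where(mask, M_obs, X)  # enforce the observed entries
                X = truncate(X, r)
            if jnp.linalg.norm(jnp.where(mask, X - M_obs, 0.0)) < tol:
                break                          # converged; stop increasing the rank
        # Rank-reduction step: drop numerically negligible singular values.
        U, s, Vt = jnp.linalg.svd(X, full_matrices=False)
        r_eff = int(jnp.sum(s > tol * s[0]))
        return (U[:, :r_eff] * s[:r_eff]) @ Vt[:r_eff]

On synthetic data one would sample a random low-rank matrix, observe a Boolean mask of its entries, and compare the output against the ground truth; the cost per iteration is dominated by the SVD.
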
In this article we study the validity of the Whitney $C^1$ extension property for horizontal curves in sub-Riemannian manifolds endowed with 1-jets that satisfy a first-order Taylor expansion compatibility condition. We first consider the equiregular case, where we show that the extension property holds true whenever a suitable non-singularity property holds for the input-output maps on the Carnot groups obtained by nilpotent approximation. We then discuss the case of sub-Riemannian manifolds with singular points and we show that all step-2 manifolds satisfy the $C^1$ extension property. We conclude by showing that the $C^1$ extension property implies a Lusin-like approximation theorem for horizontal curves on sub-Riemannian manifolds.
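
For context, the classical Euclidean form of the compatibility condition reads as follows (the sub-Riemannian statement adapts it to horizontal curves). A $1$-jet on a compact set $K \subset \mathbb{R}$ is a pair $(f, df)$, and Whitney's $C^{1}$ condition requires

$$f(y) - f(x) - df(x)\,(y - x) = o(|y - x|) \quad \text{uniformly for } x, y \in K,$$

which is necessary and sufficient for extending $(f, df)$ to a $C^{1}$ function on all of $\mathbb{R}$.
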
We prove that sub-Riemannian manifolds are infinitesimally Hilbertian (i.e., the associated Sobolev space is Hilbert) when equipped with an arbitrary Radon measure. The result follows from an embedding of metric derivations into the space of square-integrable sections of the horizontal bundle, which we obtain on all weighted sub-Finsler manifolds. As an intermediate tool, of independent interest, we show that any sub-Finsler distance can be monotonically approximated from below by Finsler ones. All the results are obtained in the general setting of possibly rank-varying structures.
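
For reference, infinitesimal Hilbertianity has a standard equivalent formulation (recalled here, not spelled out in the abstract): the Cheeger energy $\mathsf{Ch}(f) = \frac{1}{2}\int |Df|^{2}\, d\mu$, with $|Df|$ the minimal weak upper gradient, is a quadratic form, i.e. it satisfies the parallelogram identity

$$\mathsf{Ch}(f + g) + \mathsf{Ch}(f - g) = 2\,\mathsf{Ch}(f) + 2\,\mathsf{Ch}(g) \quad \text{for all } f, g \in W^{1,2}.$$
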
In scientific computing and machine learning applications, matrices and more general multidimensional arrays (tensors) can often be approximated with the help of low-rank decompositions. Since matrices and tensors of fixed rank form smooth Riemannian manifolds, one popular tool for finding low-rank approximations is Riemannian optimization. Nevertheless, the efficient implementation of Riemannian gradients and Hessians, required by Riemannian optimization algorithms, can be a nontrivial task in practice; moreover, in some cases analytic formulas are not even available. In this paper, we build upon automatic differentiation and propose a method that, given an implementation of the function to be minimized, efficiently computes Riemannian gradients and matrix-vector products between the approximate Riemannian Hessian and a given vector.
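
To make the mechanism concrete on a manifold simpler than the fixed-rank ones discussed above, here is a sketch (an illustration written for this page, not the paper's code) that uses JAX automatic differentiation to compute the Riemannian gradient and Hessian-vector products on the unit sphere $S^{n-1}$, where the tangent projection and the Hessian correction term have well-known closed forms; the names proj, rgrad and rhess_vp are invented here.

    import jax
    import jax.numpy as jnp

    def proj(x, u):
        # Orthogonal projection of u onto the tangent space T_x S^{n-1}.
        return u - jnp.dot(x, u) * x

    def rgrad(f, x):
        # Riemannian gradient: project the Euclidean gradient onto T_x S^{n-1}.
        return proj(x, jax.grad(f)(x))

    def rhess_vp(f, x, v):
        # Riemannian Hessian-vector product on the sphere: project the Euclidean
        # Hessian-vector product (forward-over-reverse AD) and subtract the
        # curvature correction term (x . grad f(x)) v.
        g = jax.grad(f)
        _, hv = jax.jvp(g, (x,), (v,))   # Euclidean Hessian of f at x, times v
        return proj(x, hv) - jnp.dot(x, g(x)) * v

    # Example: Rayleigh quotient f(x) = x^T A x restricted to the sphere.
    A = jnp.diag(jnp.arange(1.0, 5.0))
    f = lambda x: x @ A @ x
    x = jnp.ones(4) / 2.0                        # a point of unit norm on S^3
    v = proj(x, jnp.array([1.0, 0.0, 0.0, 0.0])) # a tangent vector at x
    print(rgrad(f, x), rhess_vp(f, x, v))

No analytic Hessian formula is needed: automatic differentiation supplies the Euclidean Hessian-vector product, and only the projection and correction term are manifold-specific.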