Over a finite-dimensional complete Riemannian manifold, Greene and Wu introduced a convolution, now known as the Greene-Wu (GW) convolution. In this paper, we study properties of the GW convolution and apply it to non-Euclidean machine learning problems. In particular, we derive a new formula describing how the curvature of the space affects the curvature of a function under the GW convolution. Building on this study of the GW convolution, we also introduce a new method for gradient estimation over Riemannian manifolds.
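To illustrate the kind of estimator such a smoothing suggests, here is a minimal sketch of zeroth-order Riemannian gradient estimation via the exponential map, instantiated on the unit sphere. The estimator form, the choice of manifold, the test function, and all parameter values are illustrative assumptions, not the estimator derived in the paper.

```python
import numpy as np

def sphere_exp(p, v):
    """Exponential map on the unit sphere: geodesic from p with initial velocity v."""
    t = np.linalg.norm(v)
    if t < 1e-12:
        return p
    return np.cos(t) * p + np.sin(t) * (v / t)

def tangent_unit_sample(p, rng):
    """Uniform random unit vector in the tangent space of the sphere at p."""
    g = rng.standard_normal(p.shape)
    g -= (g @ p) * p                 # remove the component normal to the sphere
    return g / np.linalg.norm(g)

def smoothed_gradient(f, p, mu=1e-3, n_samples=2000, rng=None):
    """Zeroth-order estimate of the Riemannian gradient of f at p.

    Uses the smoothing identity
        grad f(p) ~ (m / mu) * E_u[(f(exp_p(mu * u)) - f(p)) * u],
    with u uniform on the unit sphere of the m-dimensional tangent space.
    """
    rng = rng or np.random.default_rng(0)
    m = p.size - 1                   # intrinsic dimension of the sphere
    f_p = f(p)
    acc = np.zeros_like(p)
    for _ in range(n_samples):
        u = tangent_unit_sample(p, rng)
        acc += (f(sphere_exp(p, mu * u)) - f_p) * u
    return (m / mu) * acc / n_samples

# Sanity check: f(p) = <a, p> has Riemannian gradient a - <a, p> p.
a = np.array([1.0, 2.0, -1.0])
p = np.array([0.0, 0.0, 1.0])
print(smoothed_gradient(lambda x: a @ x, p))   # noisy estimate of [1, 2, 0]
print(a - (a @ p) * p)                         # exact value
```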
In this paper, we discuss the heat flow of a pseudo-harmonic map from a closed pseudo-Hermitian manifold to a Riemannian manifold with non-positive sectional curvature, and prove the existence of a pseudo-harmonic map, generalizing the Eells-Sampson existence theorem.
We study gradient-based regularization methods for neural networks. We mainly focus on two regularization methods: total variation regularization and Tikhonov regularization. Applying these methods is equivalent to using neural networks to solve some partial differential equations.
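As a concrete illustration of gradient-based regularization, the sketch below adds a Tikhonov-type penalty on the input gradient of a small network, with a comment on the total-variation variant. The architecture, data, and penalty weight are placeholder assumptions, not the setting studied in the paper.

```python
import torch
import torch.nn as nn

# Illustrative network, data, and penalty weight (not the paper's setup).
model = nn.Sequential(nn.Linear(2, 32), nn.Tanh(), nn.Linear(32, 1))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
lam = 1e-2                               # weight of the gradient penalty

def loss_with_gradient_penalty(x, y):
    x = x.clone().requires_grad_(True)
    pred = model(x)
    data_loss = ((pred - y) ** 2).mean()
    # Input gradient of the network, kept in the graph for double backprop.
    grad_x = torch.autograd.grad(pred.sum(), x, create_graph=True)[0]
    # Tikhonov regularization: mean squared L2 norm of the input gradient.
    # A total-variation penalty would use grad_x.norm(dim=1).mean() instead.
    penalty = (grad_x ** 2).sum(dim=1).mean()
    return data_loss + lam * penalty

x = torch.randn(256, 2)
y = torch.sin(x[:, :1]) + 0.1 * torch.randn(256, 1)

for step in range(200):
    opt.zero_grad()
    loss_with_gradient_penalty(x, y).backward()
    opt.step()
```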
We study the convergence of the gradient algorithm (with general step sizes) for optimization problems on general Riemannian manifolds without curvature constraints. Under the assumption of local convexity/quasi-convexity (resp. weak sharp minima), convergence results for the algorithm are established.
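For reference, the following sketch implements the basic gradient algorithm on a Riemannian manifold: the Euclidean gradient is projected onto the tangent space and a step is taken along the geodesic, here on the unit sphere for a simple quadratic objective. The manifold, objective, constant step size, and iteration count are illustrative choices, not the general setting analyzed in the paper.

```python
import numpy as np

def sphere_exp(p, v):
    """Exponential map on the unit sphere."""
    t = np.linalg.norm(v)
    return p if t < 1e-12 else np.cos(t) * p + np.sin(t) * (v / t)

def riemannian_gradient_descent(A, p0, step=0.01, iters=5000):
    """Minimize f(p) = p^T A p over the unit sphere (smallest-eigenvector problem)."""
    p = p0 / np.linalg.norm(p0)
    for _ in range(iters):
        egrad = 2.0 * A @ p                  # Euclidean gradient of p^T A p
        rgrad = egrad - (egrad @ p) * p      # project onto the tangent space at p
        p = sphere_exp(p, -step * rgrad)     # step along the geodesic
    return p

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T                                  # symmetric test matrix
p = riemannian_gradient_descent(A, rng.standard_normal(5))
# The iterate approaches an eigenvector of the smallest eigenvalue of A.
print(p @ A @ p, np.linalg.eigvalsh(A)[0])
```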
Thanks to the combination of state-of-the-art accelerators and highly optimized open software frameworks, there has been tremendous progress in the performance of deep neural networks. While these developments have been responsible for many breakthroughs, …
We introduce and implement a method to compute stationary states of nonlinear Schrödinger equations on metric graphs. Stationary states are obtained as local minimizers of the nonlinear Schrödinger energy at fixed mass. Our method is based on a normalized gradient flow.
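As a toy illustration of the fixed-mass minimization idea, here is a minimal normalized-gradient-flow sketch: a gradient step on the energy followed by projection back onto the mass sphere, for a focusing cubic Schrödinger energy discretized on a single interval with Dirichlet ends. A single edge stands in for a metric graph, and the discretization, nonlinearity, and step sizes are assumptions, not the scheme of the paper.

```python
import numpy as np

# Single interval [0, L] with Dirichlet ends, used as the simplest stand-in
# for a metric graph; all numerical parameters below are illustrative.
L, n = 10.0, 200
h = L / (n + 1)
x = np.linspace(h, L - h, n)

def energy(u):
    """Focusing cubic NLS energy E(u) = 1/2 int |u'|^2 - 1/4 int |u|^4."""
    du = np.diff(np.concatenate(([0.0], u, [0.0]))) / h
    return 0.5 * np.sum(du**2) * h - 0.25 * np.sum(u**4) * h

def energy_grad(u):
    """L2 gradient of the energy: -u'' - u^3 (second-order finite differences)."""
    up = np.concatenate(([0.0], u, [0.0]))
    lap = (up[2:] - 2.0 * up[1:-1] + up[:-2]) / h**2
    return -lap - u**3

def normalized_gradient_flow(mass=1.0, dt=5e-4, steps=100_000):
    """Gradient step on E followed by projection back onto the fixed-mass sphere."""
    u = np.exp(-((x - L / 2) ** 2))                  # initial bump
    u *= np.sqrt(mass / (np.sum(u**2) * h))          # rescale to the mass constraint
    for _ in range(steps):
        u = u - dt * energy_grad(u)                  # explicit gradient step
        u *= np.sqrt(mass / (np.sum(u**2) * h))      # renormalize the mass
    return u

u = normalized_gradient_flow()
print("energy of the computed state:", energy(u))
```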