The inductive biases of graph representation learning algorithms are often encoded in the background geometry of their embedding space. In this paper, we show that general directed graphs can be effectively represented by an embedding model that combines three components: a pseudo-Riemannian metric structure, a non-trivial global topology, and a unique likelihood function that explicitly incorporates a preferred direction in embedding space. We demonstrate the representational capabilities of this method by applying it to the task of link prediction on a series of synthetic and real directed graphs from natural language applications and biology. In particular, we show that low-dimensional cylindrical Minkowski and anti-de Sitter spacetimes can produce equal or better graph representations than curved Riemannian manifolds of higher dimensions.
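To make these ingredients concrete, the following is a minimal sketch, not the authors' implementation, of a directed-edge likelihood built on a Minkowski-signature metric: a Fermi-Dirac-style factor in the squared interval between two embeddings is combined with a factor that prefers edges pointing in the positive time direction. The function names and the parameters r, alpha and beta are illustrative assumptions.

    import numpy as np

    def minkowski_interval_sq(u, v):
        # Squared Minkowski interval between embeddings u = (t, x_1, ..., x_d),
        # with signature (-, +, ..., +); negative values are timelike separations.
        d = u - v
        return -d[0] ** 2 + np.sum(d[1:] ** 2)

    def directed_edge_prob(u, v, r=1.0, alpha=1.0, beta=1.0):
        # Illustrative likelihood of a directed edge u -> v: a Fermi-Dirac factor in the
        # squared interval times a sigmoid preferring edges that point forward in time.
        # The functional forms and constants are assumptions, not the paper's exact likelihood.
        s2 = minkowski_interval_sq(u, v)
        proximity = 1.0 / (1.0 + np.exp((s2 - r) / alpha))
        direction = 1.0 / (1.0 + np.exp(-beta * (v[0] - u[0])))
        return proximity * direction

    # Toy check: v lies to the "future" of u, so u -> v scores higher than v -> u.
    u = np.array([0.0, 0.2, -0.1])
    v = np.array([0.8, 0.3, 0.0])
    print(directed_edge_prob(u, v), directed_edge_prob(v, u))

Note how the asymmetry between u -> v and v -> u comes entirely from the preferred time direction, which is what allows a metric-based model to represent directed edges.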
Knowledge graph embeddings rank among the most successful methods for link prediction in knowledge graphs, i.e., the task of completing an incomplete collection of relational facts. A downside of these models is their strong sensitivity to model hyperparameters, in particular regularizers, which have to be extensively tuned to reach good performance [Kadlec et al., 2017]. We propose an efficient method for large-scale hyperparameter tuning by interpreting these models in a probabilistic framework. After a model augmentation that introduces per-entity hyperparameters, we use a variational expectation-maximization approach to tune thousands of such hyperparameters with minimal additional cost. Our approach is agnostic to details of the model and results in a new state of the art in link prediction on standard benchmark data.
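As a rough illustration of how per-entity hyperparameters can be tuned alongside the embeddings, here is a schematic EM-style loop with a DistMult-style scorer and Gaussian priors whose per-entity variances act as regularization strengths. The names, update rules and constants are assumptions made for illustration; the paper's variational expectation-maximization procedure is more careful than this sketch.

    import numpy as np

    rng = np.random.default_rng(0)
    n_entities, n_relations, dim = 100, 10, 16
    E = rng.normal(scale=0.1, size=(n_entities, dim))   # entity embeddings
    R = rng.normal(scale=0.1, size=(n_relations, dim))  # relation embeddings
    sigma2 = np.ones(n_entities)                        # per-entity prior variances (the "hyperparameters")

    def score(h, r, t):
        # DistMult-style trilinear score; purely illustrative.
        return float(np.sum(E[h] * R[r] * E[t]))

    def em_step(triples, lr=0.01):
        # "E-step": gradient ascent on the log-likelihood of observed triples
        # under the current per-entity Gaussian priors, which act as regularizers.
        for h, r, t in triples:
            g = 1.0 - 1.0 / (1.0 + np.exp(-score(h, r, t)))
            E[h] += lr * (g * R[r] * E[t] - E[h] / sigma2[h])
            E[t] += lr * (g * R[r] * E[h] - E[t] / sigma2[t])
            R[r] += lr * g * E[h] * E[t]
        # "M-step": closed-form update of every entity's prior variance, i.e.
        # retuning thousands of regularization strengths at negligible extra cost.
        sigma2[:] = np.mean(E ** 2, axis=1) + 1e-6

    # Toy usage on random (head, relation, tail) triples.
    triples = list(zip(rng.integers(0, n_entities, 50),
                       rng.integers(0, n_relations, 50),
                       rng.integers(0, n_entities, 50)))
    for _ in range(5):
        em_step(triples)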
We exhibit several families of Jacobi-Videv pseudo-Riemannian manifolds which are not Einstein. We also exhibit Jacobi-Videv algebraic curvature tensors where the Ricci operator defines an almost complex structure.
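For readers less familiar with the terminology, the following recalls the definitions this abstract relies on, under one common sign convention for the curvature tensor (conventions vary by author):
\[
\mathcal{J}(x)\,y := R(y,x)\,x, \qquad g(\operatorname{Ric}\,x, y) = \rho(x,y),
\]
where $\mathcal{J}(x)$ is the Jacobi operator and $\operatorname{Ric}$ the Ricci operator, and a pseudo-Riemannian manifold is said to be Jacobi-Videv when the Ricci operator commutes with every Jacobi operator,
\[
\mathcal{J}(x)\circ\operatorname{Ric} \;=\; \operatorname{Ric}\circ\mathcal{J}(x) \quad \text{for all tangent vectors } x.
\]
Einstein metrics satisfy this trivially, since their Ricci operator is a multiple of the identity; the families exhibited above show that the converse fails.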
In this paper, we discuss the heat flow of a pseudo-harmonic map from a closed pseudo-Hermitian manifold to a Riemannian manifold with non-positive sectional curvature, and prove the existence of a pseudo-harmonic map, which generalizes the Eells-Sampson existence theorem. We also discuss the uniqueness of the pseudo-harmonic representative of its homotopy class, provided that the target manifold has negative sectional curvature, which generalizes Hartman's theorem.
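Schematically, the flow in question evolves a map by its pseudo-Hermitian tension field, written here as $\tau_b$ and built from the sub-Laplacian of the pseudo-Hermitian structure (the precise formulation in the paper may differ):
\[
\frac{\partial u}{\partial t} = \tau_b(u), \qquad u(\cdot,0) = u_0,
\]
where a map $u$ is pseudo-harmonic exactly when $\tau_b(u) = 0$. As in the Riemannian Eells-Sampson argument, the non-positive sectional curvature of the target is what makes long-time existence and convergence of the flow to such a map possible.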
The Euclidean scattering transform was introduced nearly a decade ago to improve the mathematical understanding of convolutional neural networks. Inspired by recent interest in geometric deep learning, which aims to generalize convolutional neural networks to manifold and graph-structured domains, we define a geometric scattering transform on manifolds. Similar to the Euclidean scattering transform, the geometric scattering transform is based on a cascade of wavelet filters and pointwise nonlinearities. It is invariant to local isometries and stable to certain types of diffeomorphisms. Empirical results demonstrate its utility on several geometric learning tasks. Our results generalize the deformation stability and local translation invariance of Euclidean scattering, and demonstrate the importance of linking the filter structures to the underlying geometry of the data.
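To make the cascade structure concrete, here is a minimal sketch, not the paper's construction, of scattering-style coefficients built from spectral wavelets of a graph Laplacian, used as a discrete stand-in for the manifold; the wavelet profile, the choice of scales, and the toy graph are illustrative assumptions.

    import numpy as np

    def laplacian_wavelets(L, scales):
        # Spectral wavelet operators psi_j = g(2**(-j) * L) for a simple band-pass profile g.
        lam, U = np.linalg.eigh(L)
        ops = []
        for j in scales:
            g = (2.0 ** (-j) * lam) * np.exp(1.0 - 2.0 ** (-j) * lam)  # illustrative band-pass profile
            ops.append(U @ np.diag(g) @ U.T)
        return ops

    def scattering_coefficients(x, wavelets):
        # First- and second-order scattering: averages of |W_j x| and |W_k |W_j x||,
        # i.e. a cascade of wavelet filters and pointwise nonlinearities (the modulus).
        first, second = [], []
        for j, Wj in enumerate(wavelets):
            u1 = np.abs(Wj @ x)
            first.append(u1.mean())
            for Wk in wavelets[j + 1:]:   # only coarser scales, as in scattering cascades
                second.append(np.abs(Wk @ u1).mean())
        return np.array(first), np.array(second)

    # Toy usage: a path-graph Laplacian and a random signal.
    n = 32
    A = np.diag(np.ones(n - 1), 1); A = A + A.T
    L = np.diag(A.sum(1)) - A
    x = np.random.default_rng(0).normal(size=n)
    S1, S2 = scattering_coefficients(x, laplacian_wavelets(L, scales=[0, 1, 2, 3]))
    print(S1.shape, S2.shape)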
In this paper, we derive the biharmonic equations for pseudo-Riemannian submanifolds of pseudo-Riemannian manifolds, which include the biharmonic equations for submanifolds of Riemannian manifolds as a special case. As applications, we prove that a pseudo-umbilical biharmonic pseudo-Riemannian submanifold of a pseudo-Riemannian manifold has constant mean curvature, and we complete the classification of biharmonic pseudo-Riemannian hypersurfaces with at most two distinct principal curvatures, which we use to give four construction methods that produce proper biharmonic pseudo-Riemannian submanifolds from minimal submanifolds. We also give a comparative study of biharmonic hypersurfaces of Riemannian space forms and space-like biharmonic hypersurfaces of pseudo-Riemannian space forms.
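For context, the biharmonic submanifold equations discussed here come from the bitension field of a map; under one common sign convention (the paper's conventions may differ),
\[
\tau(\varphi) = \operatorname{trace}\,\nabla d\varphi, \qquad
\tau_2(\varphi) = -\Delta^{\varphi}\tau(\varphi) - \operatorname{trace}\,R^{N}\!\bigl(d\varphi,\tau(\varphi)\bigr)d\varphi = 0.
\]
For an isometric immersion $\varphi$ of an $m$-dimensional submanifold, $\tau(\varphi) = m\,H$ with $H$ the mean curvature vector, so the biharmonic condition splits into a normal and a tangential equation on $H$; a proper biharmonic submanifold is one that is biharmonic but not minimal, i.e. $H \neq 0$.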