Manifold hypotheses are typically used for tasks such as dimensionality reduction, interpolation, or improving classification performance. In the less common problem of manifold estimation, the task is to characterize the geometric structure of the manifold in the original ambient space from a sample. We focus on the role that tangent bundle learners (TBL) can play in estimating the underlying manifold from which the data are assumed to be sampled. Since unbounded tangent spaces are natively poor manifold estimates, the problem reduces to estimating the regions of each tangent space in which it acts as a relatively faithful linear approximation to the surface of the manifold. Local PCA methods, such as the Mixtures of Probabilistic Principal Component Analyzers method of Tipping and Bishop, produce a subset of the tangent bundle of the manifold along with an assignment function that maps points in the training data used by the TBL to elements of the estimated tangent bundle. We formulate three methods that use the data assigned to each tangent space to estimate the underlying bounded subspace on which that tangent space is a faithful estimate of the manifold, and offer thoughts on how this perspective is theoretically grounded in the manifold assumption. We seek to explore the conceptual and technical challenges that arise in trying to use simple TBL methods to arrive at reliable estimates of the underlying manifold.
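As a concrete illustration of this kind of pipeline, the following is a minimal sketch, assuming k-means clustering plus per-cluster PCA as a stand-in for an MPPCA-style tangent bundle learner, and using the radius of the assigned points' tangent coordinates as a crude bound on where each tangent space is trusted. The function name and parameters are hypothetical and are not the three methods studied in the paper.

    # Minimal sketch of a local-PCA tangent bundle learner (TBL).
    # Assumptions (not from the paper): k-means + per-cluster PCA stands in
    # for MPPCA, and each tangent space is bounded by the radius of the
    # tangent coordinates of the points assigned to it.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.decomposition import PCA

    def fit_tangent_bundle(X, n_charts=10, intrinsic_dim=2):
        """Fit a crude tangent bundle: one bounded tangent plane per cluster."""
        labels = KMeans(n_clusters=n_charts, n_init=10).fit_predict(X)
        charts = []
        for k in range(n_charts):
            Xk = X[labels == k]
            pca = PCA(n_components=intrinsic_dim).fit(Xk)
            origin, basis = pca.mean_, pca.components_      # point of tangency and tangent basis
            coords = (Xk - origin) @ basis.T                # tangent-plane coordinates of assigned points
            radius = np.linalg.norm(coords, axis=1).max()   # bound the region where the plane is trusted
            charts.append((origin, basis, radius))
        return charts, labels

Each returned chart consists of a point of tangency, an orthonormal tangent basis, and a radius outside of which the linear approximation is not trusted; the assignment function is given by the cluster labels.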
A common problem in Bayesian inference is the sampling of target probability distributions at sufficient resolution and accuracy to estimate the probability density, and to compute credible regions. Often by construction, many target distributions can be expressed as some higher-dimensional closed-form distribution with parametrically constrained variables, i.e., one that is restricted to a smooth submanifold of Euclidean space. I propose a derivative-based importance sampling framework for such distributions. A base set of $n$ samples from the target distribution is used to map out the tangent bundle of the manifold, and to seed $nm$ additional points that are projected onto the tangent bundle and weighted appropriately. The method essentially acts as an upsampling complement to any standard algorithm. It is designed for the efficient production of approximate high-resolution histograms from manifold-restricted Gaussian distributions, and can provide large computational savings when sampling directly from the target distribution is expensive.
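The following is a minimal sketch of the upsampling step under assumed interfaces, not the paper's exact weighting scheme: the manifold is taken to be the zero set of a constraint $g(x)=0$, each tangent plane comes from the null space of the constraint Jacobian at a base sample, and the seeded points are weighted by the unnormalized target density. All names and signatures here are hypothetical.

    # Minimal sketch of tangent-plane upsampling around base samples.
    # Assumptions (hypothetical): tangent spaces from the null space of the
    # constraint Jacobian; importance weights from the unnormalized target
    # log-density at the projected points.
    import numpy as np

    def upsample_on_tangent_bundle(base_samples, jac_g, log_target, m, scale=0.1, rng=None):
        rng = np.random.default_rng(rng)
        points, log_weights = [], []
        for x in base_samples:
            J = np.atleast_2d(jac_g(x))              # constraint Jacobian at the base sample
            _, _, Vt = np.linalg.svd(J)
            T = Vt[J.shape[0]:]                      # orthonormal basis of the tangent plane (null space)
            for _ in range(m):
                z = rng.normal(scale=scale, size=T.shape[0])
                y = x + z @ T                        # seed point placed on the tangent plane
                points.append(y)
                log_weights.append(log_target(y))    # unnormalized importance weight
        return np.array(points), np.array(log_weights)

Normalizing the weights and histogramming the $nm$ projected points then yields the approximate high-resolution histogram that the base samples alone could not resolve.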
Recall that two geodesics in a negatively curved surface $S$ are of the same type if their free homotopy classes differ by a homeomorphism of the surface. In this note we study the distribution in the unit tangent bundle of the geodesics of fixed type, proving that they are asymptotically equidistributed with respect to a certain measure $\mathfrak{m}^S$ on $T^1S$. We study a few properties of this measure, showing for example that it distinguishes between hyperbolic surfaces.
We find a new class of invariant metrics on the tangent bundle of any given almost-Hermitian manifold. We focus here on the case of Riemannian surfaces, which yields new examples of Kählerian Ricci-flat manifolds in four real dimensions.
We prove that the horizontal and vertical distributions of the tangent bundle with the Sasaki metric are isocline, that the distributions given by the kernels of the horizontal and vertical lifts of the contact form $\omega$ from the Heisenberg manifold $(H_3,g)$ to $(TH_3,g^S)$ are not totally geodesic, and that the distributions $F^H=L(E_1^H,E_2^H)$ and $F^V=L(E_1^V,E_2^V)$ are totally geodesic but not isocline. We obtain that the horizontal and natural lifts of curves from the Heisenberg manifold $(H_3,g)$ are geodesics in the tangent bundle endowed with the Sasaki metric $(TH_3,g^S)$ if and only if the curves considered on the base manifold are geodesics. We then give two particular examples of geodesics in $(TH_3,g^S)$ that are not horizontal or natural lifts of geodesics of the base manifold $(H_3,g)$.
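For reference, the Sasaki metric appearing above is the standard lift of $g$ to the tangent bundle, determined on horizontal and vertical lifts of vector fields $X,Y$ on the base by the usual formulas (a standard definition, not a formula specific to this paper):
$$ g^S(X^H,Y^H)=g(X,Y)\circ\pi, \qquad g^S(X^H,Y^V)=0, \qquad g^S(X^V,Y^V)=g(X,Y)\circ\pi, $$
where $\pi: TH_3 \to H_3$ is the bundle projection.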