
Polynomial Regression on Riemannian Manifolds

Published by Jacob Hinkle
Publication date: 2012
Research field: Informatics Engineering
Paper language: English





In this paper we develop the theory of parametric polynomial regression in Riemannian manifolds and Lie groups. We show application of Riemannian polynomial regression to shape analysis in Kendall shape space. Results are presented, showing the power of polynomial regression on the classic rat skull growth data of Bookstein as well as the analysis of the shape changes associated with aging of the corpus callosum from the OASIS Alzheimer's study.
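As a toy illustration of the machinery the abstract relies on (not the authors' implementation), a degree-1 Riemannian polynomial is simply a geodesic, and on the unit sphere the exponential and logarithm maps have closed forms. The sketch below, in plain NumPy with hypothetical helper names, evaluates a geodesic from a base point and initial velocity, then recovers the velocity with the log map, which is how residuals to manifold-valued data are measured:

```python
import numpy as np

def sphere_exp(p, v):
    """Riemannian exponential map on the unit sphere S^2: follow the
    geodesic from p with initial velocity v for unit time."""
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return p.copy()
    return np.cos(nv) * p + np.sin(nv) * v / nv

def sphere_log(p, q):
    """Riemannian logarithm map on S^2: the tangent vector at p whose
    exponential reaches q."""
    c = np.clip(np.dot(p, q), -1.0, 1.0)
    theta = np.arccos(c)
    if theta < 1e-12:
        return np.zeros_like(p)
    w = q - c * p          # component of q orthogonal to p
    return theta * w / np.linalg.norm(w)

# A degree-1 "Riemannian polynomial" is a geodesic gamma(t) = Exp_p(t v).
p = np.array([0.0, 0.0, 1.0])          # base point (north pole)
v = np.array([np.pi / 4, 0.0, 0.0])    # initial velocity in the tangent plane at p
q = sphere_exp(p, v)                   # point reached along the geodesic at t = 1

# The log map inverts the exponential, so regression residuals can be
# expressed as tangent vectors Log_{gamma(t)}(data point).
v_rec = sphere_log(p, q)
```

Higher-degree Riemannian polynomials generalize this by parallel-transporting higher-order tangent coefficients along the curve; the exp/log pair above is the basic building block.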




Read also

We study the propagator of the wave equation on a closed Riemannian manifold $M$. We propose a geometric approach to the construction of the propagator as a single oscillatory integral global both in space and in time with a distinguished complex-valued phase function. This enables us to provide a global invariant definition of the full symbol of the propagator - a scalar function on the cotangent bundle - and an algorithm for the explicit calculation of its homogeneous components. The central part of the paper is devoted to the detailed analysis of the subprincipal symbol; in particular, we derive its explicit small time asymptotic expansion. We present a general geometric construction that allows one to visualise topological obstructions and describe their circumvention with the use of a complex-valued phase function. We illustrate the general framework with explicit examples in dimension two.
Functional data analysis on nonlinear manifolds has drawn recent interest. Sphere-valued functional data, which are encountered for example as movement trajectories on the surface of the earth, are an important special case. We consider an intrinsic principal component analysis for smooth Riemannian manifold-valued functional data and study its asymptotic properties. Riemannian functional principal component analysis (RFPCA) is carried out by first mapping the manifold-valued data through Riemannian logarithm maps to tangent spaces around the time-varying Frechet mean function, and then performing a classical multivariate functional principal component analysis on the linear tangent spaces. Representations of the Riemannian manifold-valued functions and the eigenfunctions on the original manifold are then obtained with exponential maps. The tangent-space approximation through functional principal component analysis is shown to be well-behaved in terms of controlling the residual variation if the Riemannian manifold has nonnegative curvature. Specifically, we derive a central limit theorem for the mean function, as well as root-$n$ uniform convergence rates for other model components, including the covariance function, eigenfunctions, and functional principal component scores. Our applications include a novel framework for the analysis of longitudinal compositional data, achieved by mapping longitudinal compositional data to trajectories on the sphere, illustrated with longitudinal fruit fly behavior patterns. RFPCA is shown to be superior in terms of trajectory recovery in comparison to an unrestricted functional principal component analysis in applications and simulations and is also found to produce principal component scores that are better predictors for classification compared to traditional functional principal component scores.
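The three-step recipe described above (lift the data with the log map to the tangent space at the mean, run classical PCA there, map truncated reconstructions back with the exponential map) can be sketched for pointwise sphere-valued data. This is a minimal NumPy illustration with synthetic data, not the paper's implementation; the Frechet mean is crudely approximated by the normalized Euclidean mean:

```python
import numpy as np

rng = np.random.default_rng(0)

def exp_map(p, v):
    """Exponential map on the unit sphere."""
    nv = np.linalg.norm(v)
    return p if nv < 1e-12 else np.cos(nv) * p + np.sin(nv) * v / nv

def log_map(p, q):
    """Logarithm map on the unit sphere: tangent vector at p toward q."""
    c = np.clip(p @ q, -1.0, 1.0)
    theta = np.arccos(c)
    if theta < 1e-12:
        return np.zeros_like(p)
    w = q - c * p
    return theta * w / np.linalg.norm(w)

# Synthetic sphere-valued data clustered near the north pole.
mu = np.array([0.0, 0.0, 1.0])
pts = np.array([exp_map(mu, 0.3 * rng.standard_normal(3) * np.array([1.0, 1.0, 0.0]))
                for _ in range(50)])

# Step 1: lift the data to the tangent space at (an estimate of) the mean.
mean = pts.mean(axis=0)
mean /= np.linalg.norm(mean)
tangent = np.array([log_map(mean, x) for x in pts])

# Step 2: classical PCA on the linearized data.
tbar = tangent.mean(axis=0)
U, s, Vt = np.linalg.svd(tangent - tbar, full_matrices=False)
scores = (tangent - tbar) @ Vt[:2].T      # first two principal component scores

# Step 3: map the truncated tangent reconstructions back to the sphere.
recon = np.array([exp_map(mean, tbar + t) for t in scores @ Vt[:2]])
```

Since the tangent space of the 2-sphere is two-dimensional, two components already reconstruct this toy data exactly; the functional version in the paper applies the same idea along an entire time-varying mean curve.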
The Euclidean scattering transform was introduced nearly a decade ago to improve the mathematical understanding of convolutional neural networks. Inspired by recent interest in geometric deep learning, which aims to generalize convolutional neural networks to manifold and graph-structured domains, we define a geometric scattering transform on manifolds. Similar to the Euclidean scattering transform, the geometric scattering transform is based on a cascade of wavelet filters and pointwise nonlinearities. It is invariant to local isometries and stable to certain types of diffeomorphisms. Empirical results demonstrate its utility on several geometric learning tasks. Our results generalize the deformation stability and local translation invariance of Euclidean scattering, and demonstrate the importance of linking the used filter structures to the underlying geometry of the data.
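A minimal Euclidean analogue of the cascade (filter, pointwise modulus, then averaging) can be sketched in a few lines. The band-pass filter below is a crude hypothetical stand-in for a proper wavelet frame, and the global average stands in for the low-pass smoothing; it is meant only to show the structure of the cascade, not the geometric construction of the paper:

```python
import numpy as np

def bandpass(n, scale):
    """Crude Gabor-like band-pass filter (hypothetical stand-in for a wavelet)."""
    t = np.arange(n) - n // 2
    return np.exp(-t**2 / (2 * scale**2)) * np.cos(2 * np.pi * t / scale)

def scatter(x, scales, depth=2):
    """One scattering cascade: at each layer, filter every signal at every
    scale, take the pointwise modulus, and record a low-pass summary."""
    coeffs = []
    layer = [x]
    for _ in range(depth):
        nxt = []
        for sig in layer:
            for s in scales:
                u = np.abs(np.convolve(sig, bandpass(len(x), s), mode='same'))
                coeffs.append(u.mean())   # low-pass summary (global average here)
                nxt.append(u)             # propagate to the next layer
        layer = nxt
    return np.array(coeffs)

x = np.sin(np.linspace(0, 8 * np.pi, 256))
S = scatter(x, scales=[4, 8])   # 2 first-order + 4 second-order coefficients
```

The pointwise modulus is what makes the representation nonlinear and (after averaging) approximately translation invariant; the geometric version replaces the convolutions with manifold wavelet operators built from the heat kernel or similar spectral constructions.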
Genqian Liu, Xiaoming Tan (2021)
This paper is devoted to investigating the heat trace asymptotic expansion corresponding to the magnetic Steklov eigenvalue problem on Riemannian manifolds with boundary. We establish an effective procedure by which we can calculate all the coefficients $a_0$, $a_1$, $\dots$, $a_{n-1}$ of the heat trace asymptotic expansion. In particular, we explicitly give the expressions for the first four coefficients. These coefficients are spectral invariants which provide precise information, from the magnetic Steklov eigenvalues, about the volume and curvatures of the boundary of the manifold and some physical quantities.
