
Gromov-Hausdorff Approximation of Metric Spaces with Linear Structure

Added by: Frederic Chazal
Publication date: 2013
Research language: English





In many real-world applications, data come as discrete metric spaces sampled around 1-dimensional filamentary structures that can be seen as metric graphs. In this paper we address the metric reconstruction problem for such filamentary structures from data sampled around them. We prove that they can be approximated, with respect to the Gromov-Hausdorff distance, by well-chosen Reeb graphs (and some of their variants), and we provide an efficient and easy-to-implement algorithm to compute such approximations in almost linear time. We illustrate the performance of our algorithm on a few synthetic and real data sets.
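
The paper's own almost-linear-time algorithm is not reproduced here; the sketch below is only a rough, hypothetical illustration of the general idea: build an eps-neighborhood graph on the sample points, use the graph distance to an arbitrary base point as a filter function, and merge the connected components of consecutive level sets into the nodes of a Reeb-style graph. The radius eps, the interval width, and the function names are assumptions made for illustration, not the authors' construction.

# Hypothetical sketch (not the paper's algorithm): approximate a filamentary
# structure by a Reeb-style graph built from a point sample.
import itertools
import math
import networkx as nx

def reeb_style_graph(points, eps, interval):
    # Build an eps-neighborhood graph on the sample, weighted by Euclidean length.
    G = nx.Graph()
    G.add_nodes_from(range(len(points)))
    for i, j in itertools.combinations(range(len(points)), 2):
        d = math.dist(points[i], points[j])
        if d <= eps:
            G.add_edge(i, j, weight=d)
    # Filter function: graph distance to an arbitrary base point (vertex 0).
    f = nx.single_source_dijkstra_path_length(G, 0, weight="weight")
    reachable = list(f)  # vertices in the same component as the base point
    # Merge the connected components of each level set f^{-1}([k*interval, (k+1)*interval))
    # into single nodes, and connect nodes joined by an edge of G.
    R = nx.Graph()
    label = {}
    for k in sorted({int(f[v] // interval) for v in reachable}):
        level = G.subgraph([v for v in reachable if int(f[v] // interval) == k])
        for comp in nx.connected_components(level):
            node = (k, min(comp))
            R.add_node(node)
            for v in comp:
                label[v] = node
    for u, v in G.edges():
        if u in label and v in label and label[u] != label[v]:
            R.add_edge(label[u], label[v])
    return R

For a dense enough sample and suitably chosen eps and interval, the output should roughly follow the branching structure of the underlying metric graph; the Gromov-Hausdorff guarantees proved in the paper are, of course, specific to the authors' own construction.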



Related research

The Vapnik-Chervonenkis dimension provides a notion of complexity for systems of sets. If the VC dimension is small, then knowing this can drastically simplify fundamental computational tasks such as classification, range counting, and density estimation through the use of sampling bounds. We analyze set systems where the ground set $X$ is a set of polygonal curves in $\mathbb{R}^d$ and the sets $\mathcal{R}$ are metric balls defined by curve similarity metrics, such as the Fréchet distance and the Hausdorff distance, as well as their discrete counterparts. We derive upper and lower bounds on the VC dimension that imply useful sampling bounds in the setting where the number of curves is large but the complexity of the individual curves is small. Our upper bounds are either near-quadratic or near-linear in the complexity of the curves that define the ranges, and they are logarithmic in the complexity of the curves that define the ground set.
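
As background on one of the curve similarity metrics mentioned above, the following sketch computes the discrete Fréchet distance between two polygonal curves with the standard dynamic-programming recurrence; it is a textbook illustration only, unrelated to the paper's VC-dimension arguments, and the function name is assumed for the example.

# Illustrative sketch: discrete Frechet distance between two polygonal curves,
# each given as a list of points, via the standard recurrence.
import math
from functools import lru_cache

def discrete_frechet(P, Q):
    @lru_cache(maxsize=None)
    def c(i, j):
        d = math.dist(P[i], Q[j])
        if i == 0 and j == 0:
            return d
        if i == 0:
            return max(c(0, j - 1), d)
        if j == 0:
            return max(c(i - 1, 0), d)
        # One curve (or both) advances by one vertex; keep the cheapest history.
        return max(min(c(i - 1, j), c(i - 1, j - 1), c(i, j - 1)), d)
    return c(len(P) - 1, len(Q) - 1)

# Two parallel segments at distance 1:
print(discrete_frechet([(0, 0), (1, 0), (2, 0)], [(0, 1), (1, 1), (2, 1)]))  # 1.0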
One of the central notions to emerge from the study of persistent homology is that of interleaving distance. It has found recent applications in symplectic and contact geometry, sheaf theory, computational geometry, and phylogenetics. Here we present a general study of this topic. We define interleaving of functors with common codomain as solutions to an extension problem. In order to define interleaving distance in this setting we are led to categorical generalizations of Hausdorff distance, Gromov-Hausdorff distance, and the space of metric spaces. We obtain comparisons with previous notions of interleaving via the study of future equivalences. As an application we recover a definition of shift equivalences of discrete dynamical systems.
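
As background, stated for classical persistence modules rather than in the paper's more general categorical language: for functors $F, G \colon (\mathbb{R}, \le) \to \mathcal{C}$ and $\varepsilon \ge 0$, let $T_\varepsilon$ denote the shift $t \mapsto t + \varepsilon$. An $\varepsilon$-interleaving is a pair of natural transformations $\varphi \colon F \Rightarrow G \circ T_\varepsilon$ and $\psi \colon G \Rightarrow F \circ T_\varepsilon$ whose composites agree with the structure maps $F \Rightarrow F \circ T_{2\varepsilon}$ and $G \Rightarrow G \circ T_{2\varepsilon}$, and the interleaving distance is
$$ d_I(F, G) \;=\; \inf \{\, \varepsilon \ge 0 : F \text{ and } G \text{ are } \varepsilon\text{-interleaved} \,\}. $$
This is the classical special case that the categorical treatment generalizes.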
We study quasi-isometry invariants of Gromov hyperbolic spaces, focusing on the $\ell_p$-cohomology and closely related invariants such as the conformal dimension, combinatorial modulus, and the Combinatorial Loewner Property. We give new constructions of continuous $\ell_p$-cohomology, thereby obtaining information about the $\ell_p$-equivalence relation, as well as critical exponents associated with $\ell_p$-cohomology. As an application, we provide a flexible construction of hyperbolic groups which do not have the Combinatorial Loewner Property, extending and complementing earlier examples. Another consequence is the existence of hyperbolic groups with Sierpiński carpet boundary which have conformal dimension arbitrarily close to 1. In particular, we answer questions of Mario Bonk and John Mackay.
In this note we prove in the nonlinear setting of $CD(K,\infty)$ spaces the stability of the Krasnoselskii spectrum of the Laplace operator $-\Delta$ under measured Gromov-Hausdorff convergence, under an additional compactness assumption satisfied, for instance, by sequences of $CD^*(K,N)$ metric measure spaces with uniformly bounded diameter. Additionally, we show that every element $\lambda$ in the Krasnoselskii spectrum is indeed an eigenvalue, namely there exists a nontrivial $u$ satisfying the eigenvalue equation $-\Delta u = \lambda u$.
Magnitude is a numerical invariant of enriched categories, including in particular metric spaces as $[0,\infty)$-enriched categories. We show that in many cases magnitude can be categorified to a homology theory for enriched categories, which we call magnitude homology (in fact, it is a special sort of Hochschild homology), whose graded Euler characteristic is the magnitude. Magnitude homology of metric spaces generalizes the Hepworth--Willerton magnitude homology of graphs, and detects geometric information such as convexity.
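
To make "graded Euler characteristic is the magnitude" concrete, the standard statement from the magnitude homology literature (paraphrased here, not quoted from the paper) is: for a finite metric space $X$ with magnitude homology groups $\mathrm{MH}_{k,\ell}(X)$, graded by homological degree $k$ and length $\ell$,
$$ |tX| \;=\; \sum_{\ell} \Big( \sum_{k \ge 0} (-1)^k \,\mathrm{rank}\, \mathrm{MH}_{k,\ell}(X) \Big)\, e^{-\ell t}, \qquad t > 0, $$
where $\ell$ ranges over the lengths occurring in $X$, whenever the ranks are finite and the sum converges; for graphs this recovers the Hepworth--Willerton formula with $q = e^{-t}$.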
