We review the theory of, and develop algorithms for, transforming a finite point set in ${\bf R}^d$ into a set in \emph{radial isotropic position} by a nonsingular linear transformation followed by rescaling each image point to the unit sphere. This problem arises in a wide spectrum of applications in computer science and mathematics. Our algorithms use gradient-descent methods for a particular convex function $f$ whose minimum defines the transformation, and our main focus is on analyzing their performance. Although the minimum can be computed exactly by expensive symbolic-algebra techniques, gradient descent only approximates it, but to any desired level of accuracy. We show that computing the gradient of $f$ amounts to computing the Singular Value Decomposition (SVD) of a certain matrix associated with the input set, making the method simple to implement. We believe it to be superior to other approximate techniques (mainly the ellipsoid algorithm) used previously to find this transformation, and it should run much faster in practice. We prove that $f$ is smooth, which yields a convergence rate proportional to $1/\epsilon$, where $\epsilon$ is the desired approximation accuracy. To complete the analysis, we provide upper bounds on the norm of the optimal solution which depend on new parameters measuring the degeneracy of the input. We believe that our parameters capture degeneracy better than other, seemingly weaker, parameters used in previous works. We then analyze the strong convexity of $f$ and present two worst-case lower bounds on the smallest eigenvalue of its Hessian. This yields another worst-case bound on the convergence rate, for a different variant of gradient descent, that depends only logarithmically on $1/\epsilon$.
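The abstract does not spell out $f$; the minimal sketch below assumes the Barthe-type potential $f(t) = \log\det\big(\sum_i e^{t_i} x_i x_i^T\big) - (d/n)\sum_i t_i$ that is commonly used for computing Forster/radial-isotropic transforms, which may or may not match the paper's exact choice. Its gradient falls out of the thin SVD of the matrix whose rows are the scaled points $e^{t_i/2} x_i$; the step size, iteration count, and function names are illustrative, not the paper's.

```python
# Sketch (under the assumption stated above, not the paper's exact implementation):
# one gradient evaluation of f(t) = log det(sum_i e^{t_i} x_i x_i^T) - (d/n) sum_i t_i
# reduces to row norms of U in the thin SVD of the scaled point matrix.
import numpy as np

def gradient(t, X):
    """X: n x d matrix of unit vectors (rows); t: length-n vector of weights."""
    n, d = X.shape
    A = np.exp(t / 2.0)[:, None] * X                 # rows e^{t_i/2} x_i^T
    U, S, Vt = np.linalg.svd(A, full_matrices=False)
    # d f / d t_j = e^{t_j} x_j^T (A^T A)^{-1} x_j - d/n  =  ||U_j||^2 - d/n
    return np.sum(U**2, axis=1) - d / n

def gradient_descent(X, steps=1000, lr=0.5):
    """Plain fixed-step gradient descent; step size chosen ad hoc for this sketch."""
    t = np.zeros(X.shape[0])
    for _ in range(steps):
        t -= lr * gradient(t, X)
    return t

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(20, 3))
    X /= np.linalg.norm(X, axis=1, keepdims=True)    # points on the unit sphere
    t = gradient_descent(X)
    print("max |gradient entry| at termination:", np.max(np.abs(gradient(t, X))))
```

At the minimum the leverage scores $\|U_j\|^2$ all equal $d/n$, which is exactly the radial-isotropy condition after applying the induced linear map.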
In this position paper we suggest a possible metric approach to shape comparison that is based on a mathematical formalization of the concept of an observer, seen as a collection of suitable operators acting on a metric space of functions. These functions represent the data that are accessible to the observer, while the operators describe the way the observer processes the data and encode the invariances that he/she associates with them. We present this model and illustrate some theoretical reasons that justify its possible use for shape comparison.
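As a purely illustrative instantiation of the observer idea (our own toy setup, not necessarily the paper's construction), one can take the data to be discretized periodic signals, the invariance to be cyclic shifts, and the observer to be a small family of shift-invariant operators; two signals are then compared by the largest disagreement of any operator in the family.

```python
# Toy, hedged illustration: an "observer" as a family of shift-invariant operators
# on discretized periodic signals; the dissimilarity of two signals is the largest
# disagreement of any operator in the family. Operators below are our own examples.
import numpy as np

def op_mean(x):        # invariant under cyclic shifts: the mean value
    return np.array([x.mean()])

def op_sorted(x):      # invariant under cyclic shifts: sorted values forget the phase
    return np.sort(x)

def op_spectrum(x):    # invariant under cyclic shifts: Fourier coefficient magnitudes
    return np.abs(np.fft.rfft(x))

OBSERVER = [op_mean, op_sorted, op_spectrum]

def dissimilarity(x, y, observer=OBSERVER):
    return max(np.max(np.abs(F(x) - F(y))) for F in observer)

if __name__ == "__main__":
    t = np.linspace(0, 2 * np.pi, 128, endpoint=False)
    a = np.sin(t)
    b = np.roll(a, 17)          # a cyclic shift: this observer cannot tell them apart
    c = np.sin(2 * t)           # a genuinely different signal
    print(dissimilarity(a, b))  # ~0 up to floating-point error
    print(dissimilarity(a, c))  # clearly positive
```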
Consider a graph with a rotation system, namely, for every vertex, a circular ordering of the incident edges. Given such a graph, an angle cover maps every vertex to a pair of consecutive edges in that ordering -- an angle -- such that each edge participates in at least one such pair. We show that every graph of maximum degree 4 admits an angle cover, give a polynomial-time algorithm for deciding whether a graph with no degree-3 vertices has an angle cover, and prove that, for graphs of maximum degree 5, deciding whether an angle cover exists is NP-hard. We also consider extensions of the angle cover problem in which every vertex selects a fixed number $a>1$ of angles or in which an angle consists of more than two consecutive edges. We show an application of angle covers to the problem of deciding whether the 2-blowup of a planar graph has isomorphic thickness 2.
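For intuition, here is a brute-force feasibility checker for small instances (exponential in the number of vertices, and not one of the algorithms referred to above): every vertex enumerates its angles, i.e., pairs of consecutive edges in its rotation, and a choice of one angle per vertex is accepted if it covers all edges. The data layout and the degree-1 convention are our own choices.

```python
# Brute-force angle-cover checker for small instances (illustrative only).
from itertools import product

def angles(rotation):
    """rotation: list of incident edges of a vertex in circular order."""
    k = len(rotation)
    if k == 1:
        return [(rotation[0],)]   # convention: a degree-1 vertex covers its only edge
    return [(rotation[i], rotation[(i + 1) % k]) for i in range(k)]

def has_angle_cover(rotation_system):
    """rotation_system: dict vertex -> circular list of incident edges (frozensets)."""
    edges = {e for rot in rotation_system.values() for e in rot}
    vertices = list(rotation_system)
    for choice in product(*(angles(rotation_system[v]) for v in vertices)):
        covered = {e for angle in choice for e in angle}
        if covered == edges:
            return True
    return False

if __name__ == "__main__":
    # K_4 with an arbitrary rotation system; its maximum degree is 3 <= 4, so a cover exists.
    e = lambda u, v: frozenset((u, v))
    rs = {
        1: [e(1, 2), e(1, 3), e(1, 4)],
        2: [e(1, 2), e(2, 4), e(2, 3)],
        3: [e(2, 3), e(1, 3), e(3, 4)],
        4: [e(3, 4), e(2, 4), e(1, 4)],
    }
    print(has_angle_cover(rs))  # True
```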
We consider the continuous Fermat-Weber problem, in which the customers are distributed continuously (uniformly) along the boundary of a convex polygon. We derive a closed-form expression for the average distance from a given point to the continuously distributed customers along the boundary. A Weiszfeld-type procedure is proposed for this model and shown to be linearly convergent. We also derive a closed-form formula for the average distance from a given point to the entire convex polygon, again assuming a uniform distribution. Since the objective function is smooth, convex, and explicitly given, the continuous version of the Fermat-Weber problem over a convex polygon can be solved easily by numerical algorithms.
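The paper's objective is the exact closed-form average distance to the boundary; as a rough numerical stand-in, one can discretize the boundary uniformly by arc length and run the classical Weiszfeld update on the samples. The sketch below does exactly that (sampling density, iteration count, and the guard against zero distances are arbitrary choices of ours, not the paper's procedure).

```python
# Hedged stand-in for the continuous boundary model: sample the polygon boundary
# uniformly by arc length, then apply the classical Weiszfeld update to the samples.
import numpy as np

def sample_boundary(poly, m=2000):
    """poly: (k,2) array of vertices in order; returns m points uniform in arc length."""
    P = np.vstack([poly, poly[:1]])
    seg = np.diff(P, axis=0)
    lengths = np.linalg.norm(seg, axis=1)
    cum = np.concatenate([[0.0], np.cumsum(lengths)])
    s = np.linspace(0.0, cum[-1], m, endpoint=False)
    idx = np.searchsorted(cum, s, side="right") - 1
    frac = (s - cum[idx]) / lengths[idx]
    return P[idx] + frac[:, None] * seg[idx]

def weiszfeld(points, iters=200, eps=1e-12):
    """Classical Weiszfeld iteration: a reweighted average of the sample points."""
    x = points.mean(axis=0)
    for _ in range(iters):
        d = np.linalg.norm(points - x, axis=1)
        w = 1.0 / np.maximum(d, eps)          # guard against hitting a sample exactly
        x = (w[:, None] * points).sum(axis=0) / w.sum()
    return x

if __name__ == "__main__":
    square = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=float)
    print(weiszfeld(sample_boundary(square)))  # approximately (0.5, 0.5) by symmetry
```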
The Voronoi diagram of a finite set of objects is a fundamental geometric structure that subdivides the embedding space into regions, each consisting of the points that are closer to a given object than to the others. Many variants of Voronoi diagrams can be defined, depending on the class of objects, the distance function, and the embedding space. In this paper, we investigate a framework for defining and building Voronoi diagrams for a broad class of distance functions called Bregman divergences. Bregman divergences include not only the traditional (squared) Euclidean distance but also various divergence measures based on entropic functions. Accordingly, Bregman Voronoi diagrams make it possible to define information-theoretic Voronoi diagrams in statistical parametric spaces based on the relative entropy of distributions. We define several types of Bregman diagrams, establish correspondences between them (using the Legendre transformation), and show how to compute them efficiently. We also introduce extensions of these diagrams, e.g., k-order and k-bag Bregman Voronoi diagrams, and introduce Bregman triangulations of a set of points and their connection with Bregman Voronoi diagrams. We show that these triangulations capture many of the properties of the celebrated Delaunay triangulation. Finally, we give some applications of Bregman Voronoi diagrams that are of interest in the context of computational geometry and machine learning.
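To make the notion concrete, the following sketch evaluates the Bregman divergence $D_F(x,y) = F(x) - F(y) - \langle \nabla F(y),\, x - y\rangle$ for two classical generators (the squared Euclidean norm, and the negative Shannon entropy, which yields the generalized Kullback-Leibler divergence) and labels a query point by its closest site, i.e., the first-type Bregman Voronoi cell it lies in. This is a brute-force illustration, not the efficient constructions discussed in the paper.

```python
# Generic Bregman divergence and brute-force first-type Voronoi labelling (illustrative).
import numpy as np

def bregman(F, gradF, x, y):
    """D_F(x, y) = F(x) - F(y) - <grad F(y), x - y>."""
    return F(x) - F(y) - np.dot(gradF(y), x - y)

# Generator F(x) = ||x||^2 recovers the squared Euclidean distance.
sq = (lambda x: np.dot(x, x), lambda x: 2.0 * x)
# Negative Shannon entropy recovers the generalized Kullback-Leibler divergence.
kl = (lambda x: np.sum(x * np.log(x)), lambda x: np.log(x) + 1.0)

def voronoi_label(query, sites, F, gradF):
    """Index of the site p minimizing D_F(query, p) -- its first-type Bregman cell."""
    return int(np.argmin([bregman(F, gradF, query, p) for p in sites]))

if __name__ == "__main__":
    sites = np.array([[0.2, 0.8], [0.5, 0.5], [0.7, 0.3]])
    q = np.array([0.4, 0.6])
    print(voronoi_label(q, sites, *sq))  # labelling under squared Euclidean distance
    print(voronoi_label(q, sites, *kl))  # labelling under the generalized KL divergence
```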
We consider the problem of approximating a two-dimensional shape contour (or curve segment) using discrete assembly systems, which make it possible to build geometric structures from limited sets of node and edge types subject to edge length and orientation restrictions. We show that already deciding feasibility of such approximation problems is NP-hard, and that it remains intractable even for very simple setups. We then devise an algorithmic framework that combines shape sampling with exact cardinality minimization to obtain good approximations using few components. As a particular application and showcase example, we discuss approximating shape contours using the classical Zometool construction kit and provide promising computational results, demonstrating that our algorithm is capable of obtaining good shape representations within reasonable time, in spite of the problem's general intractability. We conclude the paper with an outlook on possible extensions of the developed methodology, in particular regarding 3D shape approximation tasks.
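To give the flavour of such discrete assembly constraints (and emphatically not the exact cardinality-minimization pipeline described above), the toy sketch below traces a sampled contour while restricting every step to a small, made-up vocabulary of edge lengths and directions, snapping each required displacement to the nearest vocabulary edge.

```python
# Toy, hedged sketch of building with a fixed edge vocabulary (not the paper's method).
import numpy as np

ANGLES = np.deg2rad(np.arange(0, 360, 30))                        # twelve directions
EDGES = [L * np.array([np.cos(a), np.sin(a)]) for L in (0.5, 1.0) for a in ANGLES]

def snap(v):
    """Vocabulary edge closest in Euclidean norm to the desired displacement v."""
    return EDGES[int(np.argmin([np.linalg.norm(v - e) for e in EDGES]))]

def trace(contour, step=20):
    """Every `step` samples, append the vocabulary edge that best moves the current
    endpoint toward the next target sample on the contour."""
    cur, path = contour[0].copy(), [contour[0].copy()]
    for j in range(step, len(contour), step):
        cur = cur + snap(contour[j] - cur)
        path.append(cur.copy())
    return np.array(path)

if __name__ == "__main__":
    t = np.linspace(0, 2 * np.pi, 400, endpoint=False)
    circle = np.column_stack([3 * np.cos(t), 3 * np.sin(t)])       # the target contour
    path = trace(circle)
    gap = np.linalg.norm(path - circle[::20], axis=1).max()
    print(f"{len(path) - 1} edges used, worst deviation from target samples: {gap:.2f}")
```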