
MLSEB: Edge Bundling using Moving Least Squares Approximation

Published by Jieting Wu
Publication date: 2017
Research field: Informatics Engineering
Paper language: English





Edge bundling methods can effectively alleviate visual clutter and reveal high-level graph structures in large graph visualization. Researchers have devoted significant effort to improving edge bundling according to different metrics. As the edge bundling family evolves rapidly, the quality of edge bundles has accordingly received increasing attention in the literature. In this paper, we present MLSEB, a novel method for generating edge bundles based on moving least squares (MLS) approximation. In comparison with previous edge bundling methods, we argue that our MLSEB approach generates better results under a quantitative quality metric while also ensuring the scalability and efficiency needed for visualizing large graphs.
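As a rough illustration of the MLS idea behind this kind of bundling, the Python sketch below samples each edge into a polyline and repeatedly projects every interior sample point onto a line fitted by weighted least squares to nearby sample points from all edges. This is a minimal sketch under assumed parameters (Gaussian weights, radius h, fixed sample count), not the authors' actual MLSEB implementation or its quality metric.

```python
# Minimal sketch of MLS-style edge bundling (not the paper's exact method):
# edge sample points are iteratively pulled onto locally fitted lines.
import numpy as np

def sample_edges(edges, n_samples=20):
    """Sample each straight-line edge (p0, p1) into n_samples points."""
    t = np.linspace(0.0, 1.0, n_samples)[:, None]
    return [np.asarray(p0) * (1 - t) + np.asarray(p1) * t for p0, p1 in edges]

def mls_project(point, cloud, h):
    """Project `point` onto a weighted least-squares line fit of `cloud`."""
    d2 = np.sum((cloud - point) ** 2, axis=1)
    w = np.exp(-d2 / (h * h))                      # Gaussian weights
    if w.sum() < 1e-12:
        return point
    mean = (w[:, None] * cloud).sum(axis=0) / w.sum()
    centered = cloud - mean
    cov = (w[:, None] * centered).T @ centered
    # Principal direction of the weighted covariance = local MLS line direction.
    _, vecs = np.linalg.eigh(cov)
    direction = vecs[:, -1]
    return mean + np.dot(point - mean, direction) * direction

def bundle(edges, h=0.1, n_iters=5, n_samples=20):
    """Iteratively pull interior edge sample points toward the local MLS fit."""
    polylines = sample_edges(edges, n_samples)
    for _ in range(n_iters):
        cloud = np.vstack(polylines)
        for line in polylines:
            # Keep endpoints fixed so edges stay anchored at their nodes.
            for i in range(1, len(line) - 1):
                line[i] = mls_project(line[i], cloud, h)
    return polylines

if __name__ == "__main__":
    edges = [((0.0, 0.0), (1.0, 0.05)), ((0.0, 0.1), (1.0, 0.0))]
    for pl in bundle(edges):
        print(np.round(pl[:3], 3))
```

Since each projection depends only on a local neighborhood of sample points, the inner loop can in principle be accelerated with a spatial index, which is one reason MLS-style local fitting is attractive for large graphs.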


Read also

61 - Barak Sober, David Levin 2016
In order to avoid the curse of dimensionality frequently encountered in Big Data analysis, the field of linear and nonlinear dimension reduction techniques has developed rapidly in recent years. These techniques (sometimes referred to as manifold learning) assume that the scattered input data lie on a lower dimensional manifold, so the high dimensionality problem can be overcome by learning the lower dimensional behavior. However, in real life applications, data is often very noisy. In this work, we propose a method to approximate $\mathcal{M}$, a $d$-dimensional $C^{m+1}$ smooth submanifold of $\mathbb{R}^n$ ($d \ll n$), based upon noisy scattered data points (i.e., a data cloud). We assume that the data points are located near the lower dimensional manifold and suggest a non-linear moving least-squares projection onto an approximating $d$-dimensional manifold. Under some mild assumptions, the resulting approximant is shown to be infinitely smooth and of high approximation order (i.e., $O(h^{m+1})$, where $h$ is the fill distance and $m$ is the degree of the local polynomial approximation). The method presented here assumes no analytic knowledge of the approximated manifold, and the approximation algorithm is linear in the large dimension $n$. Furthermore, the approximating manifold can serve as a framework to perform operations directly on the high dimensional data in a computationally efficient manner. This way, the preparatory step of dimension reduction, which induces distortions to the data, can be avoided altogether.
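To make the projection step concrete, the hedged Python sketch below illustrates the moving least-squares idea in the simplest case of a 1-dimensional manifold in the plane: a local coordinate frame is obtained from a weighted PCA of nearby samples, a weighted degree-$m$ polynomial is fitted in that frame, and the query point is projected onto the fitted graph. The synthetic data, kernel, and parameter values are illustrative assumptions; the paper's construction handles general $d$-dimensional manifolds in $\mathbb{R}^n$.

```python
# Minimal 1-D-in-2-D sketch of a moving least-squares projection onto a
# noisy curve (illustrative only, not the paper's full d-dimensional method).
import numpy as np

def mmls_project(q, data, h=0.3, m=2):
    """Project query point q (shape (2,)) onto an MLS fit of noisy `data`."""
    d2 = np.sum((data - q) ** 2, axis=1)
    w = np.exp(-d2 / h**2)                              # Gaussian weights
    mean = (w[:, None] * data).sum(0) / w.sum()
    centered = data - mean
    cov = (w[:, None] * centered).T @ centered
    _, vecs = np.linalg.eigh(cov)
    tangent, normal = vecs[:, -1], vecs[:, 0]           # local coordinate frame
    s, t = centered @ tangent, centered @ normal        # local coordinates
    coeffs = np.polyfit(s, t, deg=m, w=np.sqrt(w))      # weighted LSQ polynomial
    s_q = (q - mean) @ tangent
    return mean + s_q * tangent + np.polyval(coeffs, s_q) * normal

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.linspace(-1, 1, 200)
    data = np.c_[x, np.sin(2 * x)] + 0.05 * rng.standard_normal((200, 2))
    print(mmls_project(np.array([0.1, 0.3]), data))     # lands near the noisy curve
```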
Edge bundling techniques cluster edges with similar attributes (i.e. similarity in direction and proximity) together to reduce the visual clutter. All edge bundling techniques to date implicitly or explicitly cluster groups of individual edges, or parts of them, together based on these attributes. These clusters can result in ambiguous connections that do not exist in the data. Confluent drawings of networks do not have these ambiguities, but require the layout to be computed as part of the bundling process. We devise a new bundling method, Edge-Path bundling, to simplify edge clutter while greatly reducing ambiguities compared to previous bundling techniques. Edge-Path bundling takes a layout as input and clusters each edge along a weighted, shortest path to limit its deviation from a straight line. Edge-Path bundling does not incur independent edge ambiguities typically seen in all edge bundling methods, and the level of bundling can be tuned through shortest path distances, Euclidean distances, and combinations of the two. Also, directed edge bundling naturally emerges from the model. Through metric evaluations, we demonstrate the advantages of Edge-Path bundling over other techniques.
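A hedged sketch of the Edge-Path bundling idea follows, using networkx: each edge is rerouted along a weighted shortest path between its endpoints (computed with the edge itself removed), and the reroute is kept only when the path length stays within a factor k of the straight-line edge. The threshold k and the use of Euclidean lengths as weights are assumptions for illustration, not the paper's exact parameterization.

```python
# Minimal sketch of Edge-Path-style bundling: route each edge along a
# weighted shortest path, falling back to a straight edge for long detours.
import math
import networkx as nx

def edge_path_bundle(positions, edges, k=2.0):
    """Return, for every edge, the list of node ids it is routed through."""
    dist = lambda u, v: math.dist(positions[u], positions[v])
    G = nx.Graph()
    for u, v in edges:
        G.add_edge(u, v, weight=dist(u, v))
    routes = {}
    for u, v in edges:
        w = G[u][v]["weight"]
        G.remove_edge(u, v)                       # exclude the trivial path u-v
        try:
            path = nx.shortest_path(G, u, v, weight="weight")
            path_len = sum(dist(a, b) for a, b in zip(path, path[1:]))
            # Keep the detour only if it is at most k times the straight edge.
            routes[(u, v)] = path if path_len <= k * w else [u, v]
        except nx.NetworkXNoPath:
            routes[(u, v)] = [u, v]
        G.add_edge(u, v, weight=w)
    return routes

if __name__ == "__main__":
    pos = {0: (0, 0), 1: (1, 0), 2: (0.5, 0.1), 3: (0.5, -0.1)}
    edges = [(0, 2), (2, 1), (0, 3), (3, 1), (0, 1)]
    print(edge_path_bundle(pos, edges))
```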
We present an algorithm for approximating a function defined over a $d$-dimensional manifold utilizing only noisy function values at locations sampled from the manifold with noise. To produce the approximation we do not require any knowledge regarding the manifold other than its dimension $d$. We use the Manifold Moving Least-Squares approach of (Sober and Levin 2016) to reconstruct the atlas of charts, and the approximation is built on top of those charts. The resulting approximant is shown to be a function defined over a neighborhood of a manifold, approximating the originally sampled manifold. In other words, given a new point located near the manifold, the approximation can be evaluated directly at that point. We prove that our construction yields a smooth function, and in the case of noiseless samples the approximation order is $\mathcal{O}(h^{m+1})$, where $h$ is a local density-of-samples parameter (i.e., the fill distance) and $m$ is the degree of the local polynomial approximation used in our algorithm. In addition, the proposed algorithm has linear time complexity with respect to the ambient space's dimension. Thus, we are able to avoid the computational complexity commonly encountered in high dimensional approximations, without having to perform non-linear dimension reduction, which inevitably introduces distortions to the geometry of the data. Additionally, we show through numerical experiments that the proposed approach compares favorably to statistical approaches for regression over manifolds and demonstrate its potential.
The medial axis transform has applications in numerous fields including visualization, computer graphics, and computer vision. Unfortunately, traditional medial axis transformations are usually brittle in the presence of outliers, perturbations and/or noise along the boundary of objects. To overcome this limitation, we introduce a new formulation of the medial axis transform which is naturally robust in the presence of these artifacts. Unlike previous work, which has approached the medial axis from a computational geometry angle, we consider it from a numerical optimization perspective. In this work, we follow the definition of the medial axis transform as the set of maximally inscribed spheres. We show how this definition can be formulated as a least squares relaxation where the transform is obtained by minimizing a continuous optimization problem. The proposed approach is inherently parallelizable, performing independent optimization of each sphere using Gauss-Newton, and its least-squares form allows it to be significantly more robust compared to traditional computational geometry approaches. Extensive experiments on 2D and 3D objects demonstrate that our method provides superior results to the state of the art on both synthetic and real data.
In this paper we consider two sources of enhancement for the meshfree Lagrangian particle method smoothed particle hydrodynamics (SPH) by improving the accuracy of the particle approximation. Namely, we will consider shape functions constructed using: moving least-squares approximation (MLS); radial basis functions (RBF). Using MLS approximation is appealing because polynomial consistency of the particle approximation can be enforced. RBFs further appeal as they allow one to dispense with the smoothing-length -- the parameter in the SPH method which governs the number of particles within the support of the shape function. Currently, only ad hoc methods for choosing the smoothing-length exist. We ensure that any enhancement retains the conservative and meshfree nature of SPH. In doing so, we derive a new set of variationally-consistent hydrodynamic equations. Finally, we demonstrate the performance of the new equations on the Sod shock tube problem.
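The sketch below illustrates, in one dimension, how MLS shape functions with a linear basis restore polynomial consistency of a particle approximation: the resulting shape functions form a partition of unity and reproduce linear fields exactly at any query point. This is a generic MLS construction for illustration (with an assumed Gaussian kernel and smoothing length h), not the paper's variationally-consistent SPH equations.

```python
# Minimal 1-D sketch of MLS shape functions with a linear basis, of the kind
# used to enforce polynomial consistency in particle approximations like SPH.
import numpy as np

def mls_shape_functions(x, particles, h=0.15):
    """Shape functions phi_i(x) built from the linear basis p = [1, xi]."""
    w = np.exp(-((particles - x) / h) ** 2)            # Gaussian kernel weights
    P = np.c_[np.ones_like(particles), particles]      # basis at particle sites
    M = (w[:, None] * P).T @ P                          # moment matrix
    p_x = np.array([1.0, x])
    return w * (P @ np.linalg.solve(M, p_x))            # phi_i(x)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    xi = np.sort(rng.uniform(0.0, 1.0, 30))             # particle positions
    fi = 2.0 * xi + 1.0                                  # a linear field
    phi = mls_shape_functions(0.5, xi)
    # Linear consistency: partition of unity and exact reproduction of 2x + 1.
    print(phi.sum(), phi @ xi, phi @ fi)                 # ~1.0, ~0.5, ~2.0
```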