
Fast algorithms for anti-distance matrices as a generalization of Boolean matrices

Published by: Michiel de Bondt
Publication date: 2017
Research field: Informatics Engineering
Language: English
Author: Michiel de Bondt





We show that Boolean matrix multiplication, computed as a sum of products of column vectors with row vectors, is essentially the same as Warshall's algorithm for computing the transitive closure matrix of a graph from its adjacency matrix. Warshall's algorithm can be generalized to Floyd's algorithm for computing the distance matrix of a graph with weighted edges. We generalize Boolean matrices in the same way, keeping matrix multiplication essentially equivalent to the Floyd-Warshall algorithm. In this way, we obtain matrices over a semiring, which are similar to the so-called funny matrices. We discuss our implementation of operations on Boolean matrices and on their generalization, which makes use of vector instructions.
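To make the correspondence concrete, the following sketch (not the paper's vectorized implementation) shows Warshall's transitive-closure update and Floyd's shortest-path update side by side: for each index k, both combine column k with row k, once over the Boolean semiring (or, and) and once over the tropical semiring (min, +).

INF = float("inf")

def warshall(adj):
    # Transitive closure of a Boolean adjacency matrix (list of lists of bools).
    n = len(adj)
    reach = [row[:] for row in adj]
    for k in range(n):
        for i in range(n):
            if reach[i][k]:                              # entry i of column k ...
                for j in range(n):
                    reach[i][j] = reach[i][j] or reach[k][j]   # ... combined with row k
    return reach

def floyd_warshall(w):
    # Distance matrix over the (min, +) semiring; w[i][j] is an edge weight or INF.
    n = len(w)
    d = [row[:] for row in w]
    for k in range(n):
        for i in range(n):
            if d[i][k] < INF:                            # entry i of column k ...
                for j in range(n):
                    d[i][j] = min(d[i][j], d[i][k] + d[k][j])  # ... combined with row k
    return d

adj = [[False, True, False], [False, False, True], [False, False, False]]
print(warshall(adj))        # vertex 0 now reaches vertex 2 via vertex 1

w = [[0, 3, INF], [INF, 0, 4], [INF, INF, 0]]
print(floyd_warshall(w))    # d[0][2] == 7

Replacing (or, and) with (min, +) is exactly the semiring generalization the abstract describes; a production implementation would pack the Boolean case into machine words and use vector instructions, as the paper discusses.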


Read also

We present a parallel algorithm for computing the approximate factorization of an $N$-by-$N$ kernel matrix. Once this factorization has been constructed (with $N \log^2 N$ work), we can solve linear systems with this matrix with $N \log N$ work. Kernel matrices represent pairwise interactions of points in metric spaces. They appear in machine learning, approximation theory, and computational physics. Kernel matrices are typically dense (matrix multiplication scales quadratically with $N$) and ill-conditioned (solves can require 100s of Krylov iterations). Thus, fast algorithms for matrix multiplication and factorization are critical for scalability. Recently we introduced ASKIT, a new method for approximating a kernel matrix that resembles N-body methods. Here we introduce INV-ASKIT, a factorization scheme based on ASKIT. We describe the new method, derive complexity estimates, and conduct an empirical study of its accuracy and scalability. We report results on real-world datasets including COVTYPE ($0.5$M points in 54 dimensions), SUSY ($4.5$M points in 8 dimensions) and MNIST (2M points in 784 dimensions) using shared and distributed memory parallelism. In our largest run we approximately factorize a dense matrix of size 32M $\times$ 32M (generated from points in 64 dimensions) on 4,096 Sandy-Bridge cores. To our knowledge these results improve the state of the art by several orders of magnitude.
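For orientation, the sketch below builds the kind of dense kernel matrix such methods approximate and solves a linear system with it directly; the Gaussian kernel, the small ridge term and the $O(N^3)$ direct solve are illustrative assumptions, not part of INV-ASKIT, whose point is to replace this cubic cost with roughly $N \log^2 N$ work.

import numpy as np

def gaussian_kernel_matrix(X, bandwidth=1.0):
    # Dense N-by-N matrix of pairwise kernel values K[i, j] = exp(-||x_i - x_j||^2 / (2 h^2)).
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq_dists / (2.0 * bandwidth ** 2))

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 8))                      # 500 points in 8 dimensions
K = gaussian_kernel_matrix(X) + 1e-6 * np.eye(500)     # small ridge for the ill-conditioned matrix
b = rng.standard_normal(500)
x = np.linalg.solve(K, b)                              # O(N^3): the cost fast factorizations avoid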
Euclidean distance matrices (EDM) are matrices of squared distances between points. The definition is deceivingly simple: thanks to their many useful properties they have found applications in psychometrics, crystallography, machine learning, wireless sensor networks, acoustics, and more. Despite the usefulness of EDMs, they seem to be insufficiently known in the signal processing community. Our goal is to rectify this mishap in a concise tutorial. We review the fundamental properties of EDMs, such as rank or (non)definiteness. We show how various EDM properties can be used to design algorithms for completing and denoising distance data. Along the way, we demonstrate applications to microphone position calibration, ultrasound tomography, room reconstruction from echoes and phase retrieval. By spelling out the essential algorithms, we hope to fast-track the readers in applying EDMs to their own problems. Matlab code for all the described algorithms, and to generate the figures in the paper, is available online. Finally, we suggest directions for further research.
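As a concrete illustration of the properties mentioned above (a generic sketch, not the paper's Matlab package), the following builds an EDM from a point set and recovers the centered Gram matrix via the classical multidimensional-scaling identity $G = -\frac{1}{2} J D J$, whose rank is bounded by the embedding dimension.

import numpy as np

def edm(X):
    # Squared pairwise distances: D[i, j] = ||x_i - x_j||^2 for the rows of X.
    g = np.sum(X * X, axis=1)
    return g[:, None] + g[None, :] - 2.0 * X @ X.T

rng = np.random.default_rng(1)
X = rng.standard_normal((6, 3))              # 6 points in 3 dimensions
D = edm(X)

n = D.shape[0]
J = np.eye(n) - np.ones((n, n)) / n          # geometric centering matrix
G = -0.5 * J @ D @ J                         # centered Gram matrix (classical MDS)
print(np.linalg.matrix_rank(G))              # at most 3, the embedding dimension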
Yuan Sun, Shiwei Ye, Yi Sun (2015)
An arbitrary $m \times n$ Boolean matrix $M$ can be decomposed exactly as $M = U \circ V$, where $U$ (resp. $V$) is an $m \times k$ (resp. $k \times n$) Boolean matrix and $\circ$ denotes the Boolean matrix multiplication operator. We first prove an exact formula for the Boolean matrix $J$ such that $M = M \circ J^T$ holds, where $J$ is maximal in the sense that if any 0 element in $J$ is changed to a 1 then this equality no longer holds. Since minimizing $k$ is NP-hard, we propose two heuristic algorithms for finding suboptimal but good decompositions. We measure the performance (in minimizing $k$) of our algorithms on several real datasets in comparison with other representative heuristic algorithms for Boolean matrix decomposition (BMD). The results on some popular benchmark datasets demonstrate that one of our proposed algorithms performs as well as or better than the others on most of them. Our algorithms have a number of other advantages: they are based on an exact mathematical formula, which can be interpreted intuitively; they can also be used for approximation with competitive coverage; and, last but not least, they run very fast. Due to interpretability issues in data mining, we impose the condition, called the column use condition, that the columns of the factor matrix $U$ must form a subset of the columns of $M$.
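The sketch below only illustrates what an exact Boolean matrix decomposition is; the small matrices are made up and the paper's formula for the maximal matrix $J$ is not reproduced here. Note that the columns of $U$ are drawn from the columns of $M$, in line with the column use condition.

import numpy as np

def bool_matmul(A, B):
    # Boolean product: (A o B)[i, j] = OR over k of (A[i, k] AND B[k, j]).
    return (A.astype(int) @ B.astype(int)) > 0

M = np.array([[1, 1, 0],
              [1, 1, 1],
              [0, 0, 1]], dtype=bool)
U = np.array([[1, 0],
              [1, 1],
              [0, 1]], dtype=bool)          # m-by-k factor; its columns are columns of M
V = np.array([[1, 1, 0],
              [0, 0, 1]], dtype=bool)       # k-by-n factor
print(np.array_equal(bool_matmul(U, V), M)) # True: an exact Boolean decomposition with k = 2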
Euclidean distance matrices (EDMs) are a major tool for localization from distances, with applications ranging from protein structure determination to global positioning and manifold learning. They are, however, static objects which serve to localize points from a snapshot of distances. If the objects move, one expects to do better by modeling the motion. In this paper, we introduce Kinetic Euclidean Distance Matrices (KEDMs), a new kind of time-dependent distance matrix that incorporates motion. The entries of KEDMs become functions of time, the squared time-varying distances. We study two smooth trajectory models, polynomial and bandlimited trajectories, and show that these trajectories can be reconstructed from incomplete, noisy distance observations scattered over multiple time instants. Our main contribution is a semidefinite relaxation (SDR), inspired by SDRs for static EDMs. Similarly to the static case, the SDR is followed by a spectral factorization step; however, because spectral factorization of polynomial matrices is more challenging than for constant matrices, we propose a new factorization method that uses anchor measurements. Extensive numerical experiments show that KEDMs and the new semidefinite relaxation accurately reconstruct trajectories from noisy, incomplete distance data and that, in fact, motion improves rather than degrades localization if properly modeled. This makes KEDMs a promising tool for problems in the geometry of dynamic point sets.
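A small sketch of the observation underlying KEDM entries (the degrees, coefficients and time samples below are illustrative, and this is not the paper's semidefinite relaxation): for polynomial trajectories of degree P, each squared distance is itself a polynomial of degree 2P in time, so finitely many sampled distances determine that entry at all times.

import numpy as np

P = 2                                        # polynomial degree of each trajectory
rng = np.random.default_rng(2)
A = rng.standard_normal((P + 1, 2))          # coefficients of x_1(t) in the plane
B = rng.standard_normal((P + 1, 2))          # coefficients of x_2(t)

def position(coeffs, t):
    # Evaluate a vector-valued polynomial trajectory at time t.
    return sum(c * t ** p for p, c in enumerate(coeffs))

ts = np.linspace(0.0, 1.0, 2 * P + 1)        # 2P + 1 samples pin down a degree-2P polynomial
d_sq = [float(np.sum((position(A, t) - position(B, t)) ** 2)) for t in ts]
poly = np.polynomial.Polynomial.fit(ts, d_sq, deg=2 * P)

t_new = 0.37                                 # interpolate the squared distance at an unseen time
exact = float(np.sum((position(A, t_new) - position(B, t_new)) ** 2))
print(abs(poly(t_new) - exact) < 1e-6)       # True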
In this paper, the interval-valued intuitionistic fuzzy matrix (IVIFM) is introduced. The interval-valued intuitionistic fuzzy determinant is also defined, and some fundamental operations are presented. The need for IVIFMs is illustrated with an example.
