
Graph rigidity, Cyclic Belief Propagation and Point Pattern Matching

Published by: Julian McAuley
Publication date: 2007
Research field: Informatics Engineering
Paper language: English





A recent paper \cite{CaeCaeSchBar06} proposed a provably optimal, polynomial-time method for performing near-isometric point pattern matching by means of exact probabilistic inference in a chordal graphical model. Their fundamental result is that the chordal graph in question is globally rigid, implying that exact inference in it provides the same matching solution as exact inference in a complete graphical model. This implies that the algorithm is optimal when there is no noise in the point patterns. In this paper, we present a new graph which is also globally rigid but has an advantage over the graph proposed in \cite{CaeCaeSchBar06}: its maximal clique size is smaller, rendering inference significantly more efficient. However, our graph is not chordal, and thus standard junction tree algorithms cannot be directly applied. Nevertheless, we show that loopy belief propagation in such a graph converges to the optimal solution. This allows us to retain the optimality guarantee in the noiseless case, while substantially reducing both memory requirements and processing time. Our experimental results show that the accuracy of the proposed solution is indistinguishable from that of \cite{CaeCaeSchBar06} when there is noise in the point patterns.
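To make the message-passing concrete, here is a minimal sketch of loopy max-product belief propagation for near-isometric point matching. The Gaussian distance potential, the parameter `sigma`, and the generic sparse `edges` input are illustrative assumptions; the sketch does not use the paper's specific globally rigid graph construction and does not enforce a one-to-one assignment.

```python
import numpy as np

def pairwise_potential(t_i, t_j, scene, sigma=0.1):
    """Gaussian potential rewarding scene pairs whose distance matches |t_i - t_j|."""
    d_t = np.linalg.norm(t_i - t_j)
    d_s = np.linalg.norm(scene[:, None, :] - scene[None, :, :], axis=2)
    return np.exp(-(d_s - d_t) ** 2 / (2 * sigma ** 2))

def max_product_bp(template, scene, edges, n_iters=50):
    """Loopy max-product BP on a sparse graph over template points; the state
    of node i is the index of the scene point matched to template point i."""
    n_s = len(scene)
    psi = {(i, j): pairwise_potential(template[i], template[j], scene)
           for (i, j) in edges}
    nbrs = {i: [] for i in range(len(template))}
    for (i, j) in edges:
        nbrs[i].append(j)
        nbrs[j].append(i)
    msgs = {(i, j): np.ones(n_s) for e in edges for (i, j) in (e, e[::-1])}
    for _ in range(n_iters):
        new = {}
        for (i, j) in msgs:
            pot = psi[(i, j)] if (i, j) in psi else psi[(j, i)].T  # pot[x_i, x_j]
            inc = [msgs[(k, i)] for k in nbrs[i] if k != j]
            prod = np.prod(inc, axis=0) if inc else np.ones(n_s)
            m = np.max(pot * prod[:, None], axis=0)  # maximize over x_i
            new[(i, j)] = m / m.max()                # normalize for stability
        msgs = new
    # belief at each node = product of incoming messages (assumes a connected graph)
    return np.array([np.prod([msgs[(k, i)] for k in nbrs[i]], axis=0).argmax()
                     for i in range(len(template))])
```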




Read also

Sungsoo Ahn, 2015
Max-product Belief Propagation (BP) is a popular message-passing algorithm for computing a Maximum-A-Posteriori (MAP) assignment over a distribution represented by a Graphical Model (GM). It has been shown that BP can solve a number of combinatorial optimization problems, including minimum weight matching, shortest path, network flow and vertex cover, under the following common assumption: the respective Linear Programming (LP) relaxation is tight, i.e., no integrality gap is present. However, when the LP shows an integrality gap, no model had been known which can be solved systematically via sequential applications of BP. In this paper, we develop the first such algorithm, coined Blossom-BP, for solving the minimum weight matching problem over arbitrary graphs. Each step of the sequential algorithm requires applying BP over a modified graph constructed by contractions and expansions of blossoms, i.e., odd sets of vertices. Our scheme guarantees termination within O(n^2) BP runs, where n is the number of vertices in the original graph. In essence, Blossom-BP offers a distributed version of the celebrated Edmonds blossom algorithm by jumping at once over many sub-steps with a single BP run. Moreover, our result provides an interpretation of the Edmonds algorithm as a sequence of LPs.
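The abstract's premise, that BP succeeds exactly when the matching LP relaxation is tight, can be seen in a few lines. This sketch (assuming SciPy's `linprog`) solves the perfect-matching LP relaxation; on an odd cycle it returns the fractional point x_e = 1/2, the integrality gap that blossom contraction is designed to remove. It illustrates the motivating gap only, not the Blossom-BP algorithm itself.

```python
import numpy as np
from scipy.optimize import linprog

def matching_lp(n, edges, weights):
    """LP relaxation of min-weight perfect matching:
    min w.x  s.t.  sum of x_e over edges at each vertex = 1,  0 <= x_e <= 1."""
    A = np.zeros((n, len(edges)))
    for k, (u, v) in enumerate(edges):
        A[u, k] = A[v, k] = 1.0
    return linprog(weights, A_eq=A, b_eq=np.ones(n), bounds=[(0, 1)] * len(edges)).x

# On a triangle (an odd cycle) the LP optimum is fractional, x_e = 1/2 --
# exactly the integrality gap a blossom contraction removes.
print(matching_lp(3, [(0, 1), (1, 2), (0, 2)], [1.0, 1.0, 1.0]))  # ~[0.5 0.5 0.5]
```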
Inference in continuous-label Markov random fields is a challenging task. We use particle belief propagation (PBP) to solve the inference problem in continuous label spaces. Sampling particles from the belief distribution is typically done using Metropolis-Hastings Markov chain Monte Carlo methods, which involve sampling from a proposal distribution. This proposal distribution has to be carefully designed for the particular model and input data to achieve fast convergence. We propose to avoid the dependence on a proposal distribution by introducing a slice-sampling-based PBP algorithm. The proposed approach shows superior convergence performance on an image denoising toy example. Our findings are validated on a challenging relational 2D feature tracking application.
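The key ingredient is a proposal-free sampler. Below is a minimal univariate slice sampler in the style of Neal (2003), with stepping-out and shrinkage, which is the generic building block such a PBP variant would use to draw particles from a belief; the step size `w` and the test density are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

def slice_sample(log_density, x0, n_samples, w=1.0, rng=None):
    """Univariate slice sampler: needs no tuned proposal, only a step size w."""
    rng = rng or np.random.default_rng(0)
    x, samples = x0, []
    for _ in range(n_samples):
        log_y = log_density(x) + np.log(rng.uniform())  # vertical slice level
        # step out to bracket the slice
        left = x - w * rng.uniform()
        right = left + w
        while log_density(left) > log_y:
            left -= w
        while log_density(right) > log_y:
            right += w
        # shrink the bracket until a point inside the slice is found
        while True:
            x_new = rng.uniform(left, right)
            if log_density(x_new) > log_y:
                x = x_new
                break
            if x_new < x:
                left = x_new
            else:
                right = x_new
        samples.append(x)
    return np.array(samples)

# e.g. draw from a standard normal "belief":
draws = slice_sample(lambda x: -0.5 * x**2, x0=0.0, n_samples=1000)
```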
A low-density parity-check (LDPC) code is a linear block code described by a sparse parity-check matrix, which can be efficiently represented by a bipartite Tanner graph. The standard iterative decoding algorithm, known as belief propagation, passes messages along the edges of this Tanner graph. Density evolution is an efficient method for analyzing the performance of the belief propagation decoding algorithm for a particular LDPC code ensemble, enabling the determination of a decoding threshold. The basic problem addressed in this work is how to optimize the Tanner graph so that the decoding threshold is as large as possible. We introduce a new code optimization technique that constrains the search-space range, which can be thought of as minimizing randomness in differential evolution or limiting the search range in an exhaustive search. This technique is applied to the design of good irregular LDPC codes and multi-edge-type LDPC codes.
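For the simplest channel, density evolution has a closed form. The sketch below tracks the erasure probability of variable-to-check messages for a (dv, dc)-regular ensemble on the binary erasure channel and bisects for the decoding threshold; the BEC restriction and the regular ensemble are simplifying assumptions relative to the irregular and multi-edge-type codes the paper optimizes.

```python
def converges(eps, dv, dc, n_iters=2000, tol=1e-6):
    """BEC density evolution for a (dv, dc)-regular LDPC ensemble:
    x is the erasure probability of a variable-to-check message."""
    x = eps
    for _ in range(n_iters):
        x = eps * (1.0 - (1.0 - x) ** (dc - 1)) ** (dv - 1)
        if x < tol:
            return True
    return False

def threshold(dv, dc, lo=0.0, hi=1.0, iters=40):
    """Bisection on the channel erasure probability for the decoding threshold."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if converges(mid, dv, dc) else (lo, mid)
    return lo

print(threshold(3, 6))  # ~0.429 for the (3,6)-regular ensemble
```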
Zhao Li, Yixin Liu, Zhen Zhang, 2020
Graph neural networks (GNNs) have emerged as effective approaches for graph analysis, especially in the scenario of semi-supervised learning. Despite their success, GNNs often suffer from over-smoothing and over-fitting problems, which affect their performance on node classification tasks. We observe that an alternative method, the label propagation algorithm (LPA), avoids the aforementioned problems and is thus a promising choice for graph semi-supervised learning. Nevertheless, the intrinsic limitations of LPA in feature exploitation and relation modeling make label propagation less effective. To overcome these limitations, we introduce a novel framework for graph semi-supervised learning termed Cyclic Label Propagation (CycProp for short), which integrates GNNs into the process of label propagation in a cyclic and mutually reinforcing manner to exploit the advantages of both GNNs and LPA. In particular, our proposed CycProp updates the node embeddings learned by the GNN module with information augmented by label propagation, while in turn fine-tuning the weighted graph of label propagation with the help of the node embeddings. After the model converges, reliably predicted labels and informative node embeddings are obtained from the LPA and GNN modules respectively. Extensive experiments on various real-world datasets are conducted, and the experimental results empirically demonstrate that the proposed CycProp model achieves significant gains over state-of-the-art methods.
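For reference, the plain LPA baseline that the framework builds on is a few lines of linear algebra. The sketch below follows the standard normalized label-spreading iteration F <- alpha * S @ F + (1 - alpha) * Y; this is the generic formulation, not CycProp's fine-tuned weighted graph, and `alpha` and the dense-matrix representation are illustrative assumptions.

```python
import numpy as np

def label_propagation(W, y, labeled_mask, alpha=0.99, n_iters=100):
    """Graph label propagation on adjacency W (assumes no isolated nodes):
    iterate F <- alpha * S @ F + (1 - alpha) * Y with seeds clamped via Y."""
    d = W.sum(axis=1)
    S = W / np.sqrt(np.outer(d, d))          # D^-1/2 W D^-1/2 normalization
    n_classes = y.max() + 1
    Y = np.zeros((len(y), n_classes))
    Y[labeled_mask, y[labeled_mask]] = 1.0   # one-hot seeds on labeled nodes
    F = Y.copy()
    for _ in range(n_iters):
        F = alpha * S @ F + (1 - alpha) * Y
    return F.argmax(axis=1)                  # predicted label per node
```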
We propose a nonparametric generalization of belief propagation, Kernel Belief Propagation (KBP), for pairwise Markov random fields. Messages are represented as functions in a reproducing kernel Hilbert space (RKHS), and message updates are simple linear operations in the RKHS. KBP makes none of the assumptions commonly required by classical BP algorithms: the variables need not arise from a finite domain or a Gaussian distribution, nor must their relations take any particular parametric form. Rather, the relations between variables are represented implicitly and are learned nonparametrically from training data. KBP has the advantage that it may be used on any domain where kernels are defined (R^d, strings, groups), even where explicit parametric models are not known or closed-form expressions for the BP updates do not exist. The computational cost of message updates in KBP is polynomial in the training data size. We also propose a constant-time approximate message update procedure that represents messages using a small number of basis functions. In experiments, we apply KBP to image denoising, depth prediction from still images, and protein configuration prediction: KBP is faster than competing classical and nonparametric approaches (by orders of magnitude in some cases), while providing significantly more accurate results.
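To give a feel for "messages as RKHS functions, updates as linear operations", here is a schematic sketch: a message is a weight vector over the n training samples at a node, incoming messages combine by an elementwise product of their evaluations at those samples, and the outgoing message comes from a regularized Gram-matrix solve. Every name here (`rbf_gram`, `kbp_message_update`, `reg`) is a hypothetical illustration; the actual KBP update in the paper uses conditional embedding operators and differs in detail.

```python
import numpy as np

def rbf_gram(X, Y, gamma=1.0):
    """RBF kernel Gram matrix between sample sets X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kbp_message_update(K_t, K_ts, incoming, reg=1e-3):
    """Schematic kernel-style message update: combine incoming messages by an
    elementwise product of their evaluations at the n training samples, then
    map through the edge Gram matrix with a regularized linear solve."""
    n = K_t.shape[0]
    evals = np.ones(n)
    for beta in incoming:                    # evaluate each incoming message
        evals *= K_t @ beta
    return np.linalg.solve(K_t + reg * np.eye(n), K_ts @ evals / n)
```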