
On the Similarity between von Neumann Graph Entropy and Structural Information: Interpretation, Computation, and Applications

Posted by Xuecheng Liu
Publication date: 2021
Research field: Informatics Engineering
Paper language: English





The von Neumann graph entropy is a measure of graph complexity based on the Laplacian spectrum. It has recently found applications in various learning tasks driven by networked data. However, it is computationally demanding and hard to interpret using simple structural patterns. Owing to the close relation between the Laplacian spectrum and the degree sequence, we conjecture that the structural information, defined as the Shannon entropy of the normalized degree sequence, might be a good approximation of the von Neumann graph entropy that is both scalable and interpretable. In this work, we therefore study the difference between the structural information and the von Neumann graph entropy, which we call the entropy gap. Based on the fact that the degree sequence is majorized by the Laplacian spectrum, we prove for the first time that the entropy gap lies between $0$ and $\log_2 e$ for any undirected unweighted graph. Consequently, we certify that the structural information is an approximation of the von Neumann graph entropy that achieves provable accuracy, scalability, and interpretability simultaneously. We further apply this approximation to two entropy-related tasks, network design and graph similarity measurement, for which we propose a novel graph similarity measure and fast algorithms. Our experimental results on graphs of various scales and types show that the very small entropy gap extends to a wide range of graphs, including weighted graphs. Among the prominent approximations of the von Neumann graph entropy, the structural information is the only one that achieves both high efficiency and high accuracy; it is at least two orders of magnitude faster than SLaQ with comparable accuracy. Our structural-information-based methods also exhibit superior performance on the two entropy-related tasks.
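To make the two quantities and the entropy gap concrete, here is a minimal sketch (assuming NetworkX and NumPy; the function names and the test graph are illustrative, not the authors' code) that computes the structural information, the exact von Neumann graph entropy, and checks that their difference falls in $[0, \log_2 e]$:

```python
import numpy as np
import networkx as nx

def structural_information(G):
    """Shannon entropy (base 2) of the normalized degree sequence."""
    degrees = np.array([d for _, d in G.degree()], dtype=float)
    p = degrees / degrees.sum()                  # normalize by the volume 2|E|
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def von_neumann_graph_entropy(G):
    """Exact VNGE: Shannon entropy (base 2) of the spectrum of L / tr(L)."""
    L = nx.laplacian_matrix(G).toarray().astype(float)
    lam = np.linalg.eigvalsh(L) / np.trace(L)    # eigenvalues sum to 1
    lam = lam[lam > 1e-12]
    return -np.sum(lam * np.log2(lam))

G = nx.erdos_renyi_graph(200, 0.05, seed=0)      # illustrative test graph
hs, hvn = structural_information(G), von_neumann_graph_entropy(G)
gap = hs - hvn
print(f"structural info = {hs:.4f}, VNGE = {hvn:.4f}, entropy gap = {gap:.4f}")
assert 0.0 <= gap <= np.log2(np.e) + 1e-9        # bound for undirected unweighted graphs
```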



Read also

We conjecture that all connected graphs of order $n$ have von Neumann entropy at least as great as the star $K_{1,n-1}$ and prove this for almost all graphs of order $n$. We show that connected graphs of order $n$ have Rényi 2-entropy at least as great as that of $K_{1,n-1}$, and that for $\alpha>1$, $K_n$ maximizes the Rényi $\alpha$-entropy over graphs of order $n$. We also show that adding an edge to a graph can lower its von Neumann entropy.
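A small-scale numerical illustration of the star conjecture (a sanity check on connected graphs of order 7 only, using the base-2 spectral entropy of $L/\mathrm{tr}(L)$; the choice of logarithm base does not affect the comparison):

```python
import numpy as np
import networkx as nx

def vn_entropy(G):
    """Von Neumann entropy of a graph: Shannon entropy of the spectrum of L / tr(L)."""
    L = nx.laplacian_matrix(G).toarray().astype(float)
    lam = np.linalg.eigvalsh(L) / np.trace(L)
    lam = lam[lam > 1e-12]
    return -np.sum(lam * np.log2(lam))

n = 7
ent_star = vn_entropy(nx.star_graph(n - 1))      # K_{1,n-1} on n vertices
connected = (g for g in nx.graph_atlas_g() if len(g) == n and nx.is_connected(g))
min_ent = min(vn_entropy(g) for g in connected)
print(f"star K_1,{n-1}: {ent_star:.4f}; minimum over connected graphs of order {n}: {min_ent:.4f}")
```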
How do social networks differ across platforms? How do information networks change over time? Answering questions like these requires us to compare two or more graphs. This task is commonly treated as a measurement problem, but numerical answers give limited insight. Here, we argue that if the goal is to gain understanding, we should treat graph similarity assessment as a description problem instead. We formalize this problem as a model selection task using the Minimum Description Length principle, capturing the similarity of the input graphs in a common model and the differences between them in transformations to individual models. To discover good models, we propose Momo, which breaks the problem into two parts and introduces efficient algorithms for each. Through an extensive set of experiments on a wide range of synthetic and real-world graphs, we confirm that Momo works well in practice.
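As an illustration of the underlying principle (the generic two-part MDL objective, not Momo's specific encoding), model selection amounts to the following, with a schematic instantiation for two input graphs:

```latex
% Generic two-part MDL: pick the model M that minimizes the total number of
% bits needed to describe the model and the data given the model.
\[
  M^{\ast} \;=\; \operatorname*{arg\,min}_{M \in \mathcal{M}} \; \bigl[\, L(M) + L(D \mid M) \,\bigr].
\]
% One schematic way to instantiate this for two graphs (illustrative, not the
% paper's exact cost): a common model M for the shared structure, plus
% transformations \Delta_1, \Delta_2 encoding the differences of each graph.
\[
  L(G_1, G_2) \;=\; L(M) + L(\Delta_1) + L(\Delta_2)
  + L\bigl(G_1 \mid M, \Delta_1\bigr) + L\bigl(G_2 \mid M, \Delta_2\bigr).
\]
```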
Romain Duboscq, 2019
We consider in this work the problem of minimizing the von Neumann entropy under the constraints that the density of particles, the current, and the kinetic energy of the system are fixed at each point of space. The unique minimizer is a self-adjoint, positive, trace-class operator, and our objective is to characterize its form. We show that this minimizer is the solution to a self-consistent nonlinear eigenvalue problem. One of the main difficulties in the proof is to parametrize the feasible set in order to derive the Euler-Lagrange equation, and we proceed by constructing an appropriate form of perturbations of the minimizer. The question of deriving quantum statistical equilibria is at the heart of the quantum hydrodynamical models introduced by Degond and Ringhofer in [5]. An original feature of the problem is the local nature of the constraints, i.e. they depend on position, whereas more classical models fix the total number of particles, the total current, and the total energy of the system.
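Schematically, the constrained problem can be written as follows (notation is illustrative; the precise functional setting and sign conventions are those of the paper):

```latex
% Minimize the entropy functional over self-adjoint, positive, trace-class
% density operators whose local moments match prescribed profiles n_0, j_0, k_0.
\[
  \min_{\substack{\rho = \rho^{\ast} \,\ge\, 0 \\ \mathrm{Tr}\,\rho < \infty}}
  \; \mathcal{S}(\rho)
  \qquad \text{subject to} \qquad
  n[\rho](x) = n_0(x), \quad j[\rho](x) = j_0(x), \quad k[\rho](x) = k_0(x)
  \ \text{ for a.e. } x,
\]
% where \mathcal{S} is the von Neumann entropy functional (of the form
% \mathrm{Tr}(\rho \log \rho), up to sign and normalization conventions) and
% n[\rho], j[\rho], k[\rho] denote the local particle density, current, and
% kinetic energy associated with \rho.
```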
Given an ensemble of systems in an unknown state, as well as an observable $\hat A$ and a physical apparatus that performs a measurement of $\hat A$ on the ensemble, whose detailed working is unknown (a black box), how can one test whether the Lüders or the von Neumann reduction rule applies?
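For reference, these are the two non-selective reduction rules in their standard textbook form, for an observable $\hat A = \sum_i a_i P_i$ with spectral projectors $P_i$ onto possibly degenerate eigenspaces (the statement of the rules, not the paper's test procedure):

```latex
% Luders rule: project the state onto the eigenspaces of \hat A as they are.
\[
  \rho \;\longmapsto\; \rho_{\mathrm{L}} \;=\; \sum_{i} P_i \,\rho\, P_i .
\]
% von Neumann rule: first refine each degenerate eigenspace into a rank-one
% orthonormal eigenbasis \{ |k\rangle \} of \hat A; the outcome depends on the
% chosen refinement when \hat A is degenerate.
\[
  \rho \;\longmapsto\; \rho_{\mathrm{vN}} \;=\; \sum_{k} |k\rangle\langle k| \,\rho\, |k\rangle\langle k| .
\]
% The two rules coincide when the spectrum of \hat A is non-degenerate.
```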
The von Neumann graph entropy (VNGE) can be used as a measure of graph complexity and as the basis of information divergence and distance measures between graphs. However, computing the VNGE is computationally demanding for large-scale graphs. We propose novel quadratic approximations for fast computation of the VNGE. Various inequalities bounding the error between the quadratic approximations and the exact VNGE are derived. Our methods reduce the cubic complexity of VNGE computation to linear complexity. Computational simulations on random graph models and various real network datasets demonstrate superior performance.
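A minimal illustration of the idea (not necessarily the authors' exact approximant): the exact VNGE requires the full spectrum of the density matrix $\rho = L/\mathrm{tr}(L)$, whereas a quadratic surrogate such as the purity-based lower bound $1 - \mathrm{tr}(\rho^2)$ can be computed from the degree sequence in linear time.

```python
import numpy as np
import networkx as nx

def vnge_exact(G):
    """Exact von Neumann graph entropy (natural log): O(n^3) eigendecomposition."""
    L = nx.laplacian_matrix(G).toarray().astype(float)
    lam = np.linalg.eigvalsh(L) / np.trace(L)          # spectrum of rho = L / tr(L)
    lam = lam[lam > 1e-12]
    return -np.sum(lam * np.log(lam))

def vnge_quadratic(G):
    """Purity-based quadratic surrogate 1 - tr(rho^2), computed from degrees in O(n + m).

    For a simple unweighted graph, tr(L^2) = sum_i d_i^2 + 2m, so no spectrum is
    needed. This is a lower bound on the exact entropy, used here only to
    illustrate a quadratic-style approximation, not the authors' specific method.
    """
    degrees = np.array([d for _, d in G.degree()], dtype=float)
    two_m = degrees.sum()
    trace_rho_sq = (np.sum(degrees ** 2) + two_m) / two_m ** 2
    return 1.0 - trace_rho_sq

G = nx.barabasi_albert_graph(500, 3, seed=1)           # illustrative test graph
print(f"exact VNGE      : {vnge_exact(G):.4f}")
print(f"quadratic bound : {vnge_quadratic(G):.4f}")
```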