
Random Rank-Based, Hierarchical or Trivial: Which Dynamic Graph Algorithm Performs Best in Practice?

Added by Alexander Noe
Publication date: 2021
Language: English





Fully dynamic graph algorithms that achieve polylogarithmic or better time per operation use either a hierarchical graph decomposition or a random-rank-based approach. So far, there are two graph properties for which efficient algorithms of both types exist, namely fully dynamic (Delta + 1) coloring and fully dynamic maximal matching. In this paper we present an extensive experimental study of these two types of algorithms for these two problems, together with very simple baseline algorithms, to determine which of these algorithms is the fastest. Our results indicate that the data structures used by the different algorithms dominate their performance.
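To illustrate what a "very simple baseline" for fully dynamic (Delta + 1) coloring can look like, the following sketch greedily recolors one endpoint of a newly inserted conflicting edge with the smallest color unused in its neighborhood. The class and method names are assumptions for illustration only, not the implementation evaluated in the paper.

```python
# Hypothetical sketch of a trivial baseline for fully dynamic (Delta+1) coloring:
# on every edge insertion that creates a conflict, greedily recolor one endpoint
# with the smallest color unused in its neighborhood.
from collections import defaultdict

class TrivialDynamicColoring:
    def __init__(self):
        self.adj = defaultdict(set)    # adjacency lists
        self.color = defaultdict(int)  # current color of each vertex (0-based)

    def _recolor(self, v):
        # Smallest color not used by any neighbor; needs at most deg(v)+1 <= Delta+1 colors.
        used = {self.color[u] for u in self.adj[v]}
        c = 0
        while c in used:
            c += 1
        self.color[v] = c

    def insert_edge(self, u, v):
        self.adj[u].add(v)
        self.adj[v].add(u)
        if self.color[u] == self.color[v]:
            self._recolor(v)           # resolve the conflict at one endpoint

    def delete_edge(self, u, v):
        self.adj[u].discard(v)         # deletions never invalidate a proper coloring
        self.adj[v].discard(u)
```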




Related research

Digital contact tracing of an infected person, testing the contacted persons for possible infection, and isolation play a crucial role in alleviating an outbreak. Here, we design a dynamic graph streaming algorithm that can trace contacts under the control of the Public Health Authorities (PHA). Our algorithm receives proximity data from mobile devices as contact data streams and uses a sliding window model to construct a dynamic contact graph sketch. Prominently, we introduce the edge label of the contact graph as a binary contact vector, which acts like a sliding window and holds the latest D days (incubation period) of temporal social interactions. Notably, the algorithm prepares the direct and indirect (multilevel) contact list from the contact graph sketch for a given set of infected persons. Finally, the algorithm also uses a disjoint-set data structure to construct the infection pathways for the trace list. The present study offers the design of algorithms with underlying data structures for digital contact tracing relevant to the proximity data produced by Bluetooth-enabled mobile devices. Our analysis reveals that, for COVID-19 close contact parameters, maintaining the contact graph of ten million users with 14 days of close-contact data on the PHA server takes 55 gigabytes of memory, and that preparing the contact list for a given set of infected persons depends on the size of the infected list. Our centralized digital contact tracing framework is also applicable to other relevant diseases parameterized by an incubation period and a proximity duration of contacts.
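A minimal sketch of the binary contact vector idea described above, assuming a D-bit window per edge that is shifted once per day; the class, method names, and the 14-day default are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch (not the paper's code) of the binary contact vector idea:
# each edge of the contact graph stores a D-bit window, one bit per day, and the
# window slides as days pass.
from collections import defaultdict

D = 14  # assumed incubation period in days, following the COVID-19 setting above

class ContactGraphSketch:
    def __init__(self, window=D):
        self.window = window
        self.edges = defaultdict(int)  # (u, v) with u < v  ->  D-bit contact vector

    def record_contact(self, u, v):
        # Set today's bit (bit 0) on the edge between u and v.
        key = (u, v) if u < v else (v, u)
        self.edges[key] |= 1

    def advance_day(self):
        # Slide the window: shift every vector by one day and drop stale edges.
        mask = (1 << self.window) - 1
        stale = []
        for key in self.edges:
            self.edges[key] = (self.edges[key] << 1) & mask
            if self.edges[key] == 0:
                stale.append(key)
        for key in stale:
            del self.edges[key]

    def direct_contacts(self, infected):
        # Direct (level-1) contact list for a given set of infected persons.
        infected = set(infected)
        out = set()
        for (u, v), vec in self.edges.items():
            if vec:
                if u in infected:
                    out.add(v)
                if v in infected:
                    out.add(u)
        return out - infected
```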
The fully dynamic transitive closure problem asks to maintain reachability information in a directed graph between arbitrary pairs of vertices while the graph undergoes a sequence of edge insertions and deletions. The problem has been thoroughly investigated in theory, and many specialized algorithms for solving it have been proposed in the last decades. In two large studies [Frigioni et al., 2001; Krommidas and Zaroliagis, 2008], a number of these algorithms were evaluated experimentally against simple static algorithms for graph traversal, showing the competitiveness and even superiority of the simple algorithms in practice, except for very dense random graphs or very high ratios of queries. A major drawback of those studies is that only small and mostly randomly generated graphs are considered. In this paper, we engineer new algorithms to maintain all-pairs reachability information which are simple and space-efficient. Moreover, we perform an extensive experimental evaluation on both generated and real-world instances that are several orders of magnitude larger than those in the previous studies. Our results indicate that our new algorithms considerably outperform all state-of-the-art algorithms on all types of input in practice.
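For context, the "simple static algorithms for graph traversal" used as baselines in the earlier studies can be pictured roughly as follows: keep only adjacency lists under updates and answer each pairwise reachability query with a fresh breadth-first search. This is an assumed illustration, not code from any of the cited papers.

```python
# Assumed illustration of a traversal-based baseline for transitive closure queries:
# constant-time updates, linear-time queries via BFS.
from collections import defaultdict, deque

class TraversalTransitiveClosure:
    def __init__(self):
        self.out = defaultdict(set)      # outgoing adjacency lists

    def insert_edge(self, u, v):
        self.out[u].add(v)

    def delete_edge(self, u, v):
        self.out[u].discard(v)

    def reaches(self, u, v):
        # O(n + m) per query: BFS from u until v is found or the search is exhausted.
        if u == v:
            return True
        seen = {u}
        queue = deque([u])
        while queue:
            x = queue.popleft()
            for y in self.out[x]:
                if y == v:
                    return True
                if y not in seen:
                    seen.add(y)
                    queue.append(y)
        return False
```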
There is an ongoing debate in computer science about how algorithms should best be studied. Some scholars have argued that experimental evaluations should be conducted, while others emphasize the benefits of formal analysis. We believe that this debate is less a question of either-or, because both views can be integrated into an overarching framework. It is the ambition of this paper to develop such a framework of algorithm engineering with a theoretical foundation in the philosophy of science. We take the empirical nature of algorithm engineering as a starting point. Our theoretical framework builds on three areas discussed in the philosophy of science: ontology, epistemology and methodology. In essence, ontology describes algorithm engineering as being concerned with algorithmic problems, algorithmic tasks, algorithm designs and algorithm implementations. Epistemology describes the body of knowledge of algorithm engineering as a collection of prescriptive and descriptive knowledge, residing in World 3 of Popper's Three Worlds model. Methodology refers to the steps by which we can systematically enhance our knowledge of specific algorithms. In this context, we identify seven validity concerns and discuss how researchers can respond to falsification. Our framework has important implications for researching algorithms in various areas of computer science.
Given a directed graph and a source vertex, the fully dynamic single-source reachability problem is to maintain the set of vertices that are reachable from the given vertex, subject to edge deletions and insertions. It is one of the most fundamental problems on graphs and appears directly or indirectly in many and varied applications. While there has been theoretical work on this problem, showing both linear conditional lower bounds for the fully dynamic problem and insertions-only and deletions-only upper bounds beating these conditional lower bounds, there has been no experimental study that compares the performance of fully dynamic reachability algorithms in practice. Previous experimental studies in this area concentrated only on the more general all-pairs reachability or transitive closure problem and did not use real-world dynamic graphs. In this paper, we bridge this gap by empirically studying an extensive set of algorithms for the single-source reachability problem in the fully dynamic setting. In particular, we design several fully dynamic variants of well-known approaches to obtain and maintain reachability information with respect to a distinguished source. Moreover, we extend the existing insertions-only or deletions-only upper bounds into fully dynamic algorithms. Even though the worst-case time per operation of all the fully dynamic algorithms we evaluate is at least linear in the number of edges in the graph (as is to be expected given the conditional lower bounds), we show in our extensive experimental evaluation that their performance differs greatly, both on generated as well as on real-world instances.
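As a hedged sketch of how an insertions-only approach can be extended to the fully dynamic setting, the following maintains the set of vertices reachable from the source, grows it incrementally on insertions, and conservatively recomputes it after a deletion between two reachable vertices. Names and details are assumptions for illustration, not the algorithms engineered in the paper.

```python
# Assumed sketch of a simple fully dynamic single-source reachability scheme.
# Worst-case cost per operation stays linear in the number of edges, consistent
# with the conditional lower bounds discussed above.
from collections import defaultdict, deque

class DynamicSSReach:
    def __init__(self, source):
        self.source = source
        self.out = defaultdict(set)      # outgoing adjacency lists
        self.reach = {source}            # vertices currently reachable from source

    def _bfs_from(self, starts):
        queue = deque(starts)
        while queue:
            x = queue.popleft()
            for y in self.out[x]:
                if y not in self.reach:
                    self.reach.add(y)
                    queue.append(y)

    def insert_edge(self, u, v):
        self.out[u].add(v)
        if u in self.reach and v not in self.reach:
            self.reach.add(v)
            self._bfs_from([v])          # only newly reachable vertices are explored

    def delete_edge(self, u, v):
        self.out[u].discard(v)
        if u in self.reach and v in self.reach:
            self.reach = {self.source}   # conservative: rebuild the set from scratch
            self._bfs_from([self.source])

    def query(self, v):
        return v in self.reach
```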
In this paper we study the number of vertex recolorings that an algorithm needs to perform in order to maintain a proper coloring of a graph under insertion and deletion of vertices and edges. We present two algorithms that achieve different trade-offs between the number of recolorings and the number of colors used. For any $d>0$, the first algorithm maintains a proper $O(\mathcal{C} d N^{1/d})$-coloring while recoloring at most $O(d)$ vertices per update, where $\mathcal{C}$ and $N$ are the maximum chromatic number and maximum number of vertices, respectively. The second algorithm reverses the trade-off, maintaining an $O(\mathcal{C} d)$-coloring with $O(d N^{1/d})$ recolorings per update. The two converge when $d = \log N$, maintaining an $O(\mathcal{C} \log N)$-coloring with $O(\log N)$ recolorings per update. We also present a lower bound, showing that any algorithm that maintains a $c$-coloring of a $2$-colorable graph on $N$ vertices must recolor at least $\Omega(N^{\frac{2}{c(c-1)}})$ vertices per update, for any constant $c \geq 2$.
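To make the convergence point explicit, here is the short calculation (assuming base-2 logarithms) behind the claim that the two trade-offs meet at $d = \log N$:

```latex
% At d = \log N the term N^{1/d} becomes a constant:
N^{1/d} \Big|_{d = \log N} = N^{1/\log N} = 2^{\log N / \log N} = 2 .
% Hence the first trade-off gives O(\mathcal{C} d N^{1/d}) = O(\mathcal{C} \log N)
% colors with O(d) = O(\log N) recolorings per update, matching the second
% trade-off's O(\mathcal{C} d) = O(\mathcal{C} \log N) colors with
% O(d N^{1/d}) = O(\log N) recolorings per update.
```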