In this paper we show that approximation can help reduce the space used for self-stabilization. In the classic \emph{state model}, where the nodes of a network communicate by reading the states of their neighbors, an important measure of efficiency is the space: the number of bits used at each node to encode its state. In this model, a classic requirement is that the algorithm be \emph{silent}, that is, after stabilization the states should not change anymore. We design a silent self-stabilizing algorithm for the minimum spanning tree problem that offers a trade-off between the quality of the solution and the space needed to compute it.
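To make the setting concrete, here is a minimal sketch of the state model and of the silence condition, illustrated on a toy distance-to-root computation rather than on the paper's minimum spanning tree algorithm; the graph representation and the update rule below are illustrative assumptions.

```python
import math

def silent_stabilize(adj, root):
    # State of each node: its current distance estimate to the root.
    # (A truly self-stabilizing execution must converge from ANY initial
    # configuration; this fixed initialization is only for illustration.)
    state = {v: 0 if v == root else math.inf for v in adj}

    def rule(v):
        # In the state model, a node only reads the states of its neighbors.
        if v == root:
            return 0
        return 1 + min(state[u] for u in adj[v])

    # Repeat until no node wants to change its state: this fixed point
    # is exactly the "silent" condition after stabilization.
    changed = True
    while changed:
        changed = False
        for v in adj:
            new = rule(v)
            if new != state[v]:
                state[v], changed = new, True
    return state

# Example: a path a-b-c rooted at a stabilizes to distances 0, 1, 2.
print(silent_stabilize({"a": ["b"], "b": ["a", "c"], "c": ["b"]}, "a"))
```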
Imagine a large graph that is being processed by a cluster of computers, e.g., as described by the $k$-machine model or the Massively Parallel Computation (MPC) model. The graph, however, is not static; instead, it receives a constant stream of updates. Ho
Diameter, radius and eccentricities are fundamental graph parameters, which are extensively studied in various computational settings. Typically, computing approximate answers can be much more efficient than computing exact solutions. In thi
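As a concrete (and purely centralized) illustration of the exact-versus-approximate gap, a single BFS already yields a 2-approximation of the diameter, since ecc(v) <= diam <= 2*ecc(v) for every vertex v of a connected graph; the sketch below shows only this folklore baseline, not the algorithms of the paper.

```python
from collections import deque

def eccentricity(adj, src):
    """BFS from `src`; returns the largest hop distance reached."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        v = queue.popleft()
        for u in adj[v]:
            if u not in dist:
                dist[u] = dist[v] + 1
                queue.append(u)
    return max(dist.values())

def approx_diameter(adj):
    # ecc(v) <= diameter <= 2 * ecc(v) for any v, so one BFS already gives
    # a 2-approximation, whereas the exact diameter naively needs a BFS
    # from every node.
    v = next(iter(adj))
    return eccentricity(adj, v)

# Example: on the path 1-2-3 the estimate from node 1 is 2, the exact diameter.
print(approx_diameter({1: [2], 2: [1, 3], 3: [2]}))
```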
We introduce the Adaptive Massively Parallel Computation (AMPC) model, which is an extension of the Massively Parallel Computation (MPC) model. At a high level, the AMPC model strengthens the MPC model by storing all messages sent within a round in a
We study the cost of distributed MST construction in the setting where each edge has a latency and a capacity, along with its weight. Edge latencies capture the delay on the links of the communication network, while capacities capture their throughput
The Minimum Dominating Set (MDS) problem is not only one of the most fundamental problems in distributed computing, but also one of the most challenging. While it is well-known that minimum dominating sets cannot be approximated locally on gen
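For contrast with the distributed setting, the classic centralized greedy algorithm gives an O(log n)-approximation for MDS; the sketch below shows only that baseline (the adjacency-list input format is an assumption), not the distributed algorithm studied here.

```python
def greedy_mds(adj):
    """Classic sequential greedy: repeatedly pick the vertex that dominates
    the most not-yet-dominated vertices. This yields an O(log n)-approximation
    of the minimum dominating set; shown only as a centralized baseline."""
    undominated = set(adj)
    dominating_set = set()
    while undominated:
        # Coverage of v = v itself plus its neighbors, restricted to the
        # vertices that are still undominated.
        best = max(adj, key=lambda v: len(({v} | set(adj[v])) & undominated))
        dominating_set.add(best)
        undominated -= {best} | set(adj[best])
    return dominating_set

# Example: on a star graph the greedy picks the center.
print(greedy_mds({0: [1, 2, 3], 1: [0], 2: [0], 3: [0]}))
```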