Many graph problems are locally checkable: a solution is globally feasible if it looks valid in all constant-radius neighborhoods. This idea is formalized in the concept of locally checkable labelings (LCLs), introduced by Naor and Stockmeyer (1995). Recently, Chang et al. (2016) showed that in bounded-degree graphs, every LCL problem belongs to one of the following classes:
- Easy: solvable in $O(\log^* n)$ rounds with both deterministic and randomized distributed algorithms.
- Hard: requires at least $\Omega(\log n)$ rounds with deterministic and $\Omega(\log \log n)$ rounds with randomized distributed algorithms.
Hence for any parameterized LCL problem, when we move from local problems towards global problems, there is some point at which complexity suddenly jumps from easy to hard. For example, for vertex coloring in $d$-regular graphs it is now known that this jump is at precisely $d$ colors: coloring with $d+1$ colors is easy, while coloring with $d$ colors is hard. However, it is currently poorly understood where this jump takes place when one looks at defective colorings. To study this question, we define $k$-partial $c$-coloring as follows: nodes are labeled with numbers between $1$ and $c$, and every node is incident to at least $k$ properly colored edges. It is known that $1$-partial $2$-coloring (a.k.a. weak $2$-coloring) is easy for any $d \ge 1$. As our main result, we show that $k$-partial $2$-coloring becomes hard as soon as $k \ge 2$, no matter how large a $d$ we have. We also show that this is fundamentally different from $k$-partial $3$-coloring: no matter which $k \ge 3$ we choose, the problem is always hard for $d = k$ but it becomes easy when $d \gg k$. The same was known previously for partial $c$-coloring with $c \ge 4$, but the case of $c < 4$ was open.
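To make the definition concrete, here is a minimal Python sketch of a verifier for the $k$-partial $c$-coloring condition; the adjacency-dict representation and function name are illustrative assumptions, not the authors' notation.

```python
def is_k_partial_c_coloring(adj, color, k, c):
    """Verify the k-partial c-coloring condition: every node has a
    color in {1, ..., c} and is incident to at least k properly
    colored edges (edges whose two endpoints get different colors)."""
    for v, neighbors in adj.items():
        if not 1 <= color[v] <= c:
            return False
        if sum(1 for u in neighbors if color[u] != color[v]) < k:
            return False
    return True

# Weak 2-coloring (k = 1, c = 2) of a 4-cycle.
cycle = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
assert is_k_partial_c_coloring(cycle, {0: 1, 1: 2, 2: 1, 3: 2}, k=1, c=2)
```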
Nonlocality plays a fundamental role in quantum information science. Recently, it has been theoretically predicted and experimentally demonstrated that the nonlocality of an entangled pair may be shared among multiple observers using weak measurements of moderate strength. Here we devise an optimal protocol for nonlocality sharing among three observers and show experimentally that nonlocality sharing may also be achieved using weak measurements of near-maximum strength. Our result sheds light on the interplay between nonlocality and quantum measurements and may find applications in quantum steering, unbounded randomness certification, and quantum communication networks.
Golovach, Paulusma and Song (Inf. Comput. 2014) asked to determine the parameterized complexity of the following problems parameterized by $k$: (1) Given a graph $G$, a clique modulator $D$ (a clique modulator is a set of vertices whose removal results in a clique) of size $k$ for $G$, and a list $L(v)$ of colors for every $v \in V(G)$, decide whether $G$ has a proper list coloring; (2) Given a graph $G$, a clique modulator $D$ of size $k$ for $G$, and a pre-coloring $\lambda_P: X \rightarrow Q$ for $X \subseteq V(G)$, decide whether $\lambda_P$ can be extended to a proper coloring of $G$ using only colors from $Q$. For Problem 1 we design an $O^*(2^k)$-time randomized algorithm and for Problem 2 we obtain a kernel with at most $3k$ vertices. Banik et al. (IWOCA 2019) proved that the following problem is fixed-parameter tractable and asked whether it admits a polynomial kernel: Given a graph $G$, an integer $k$, and a list $L(v)$ of exactly $n-k$ colors for every $v \in V(G)$, decide whether there is a proper list coloring for $G$. We obtain a kernel with $O(k^2)$ vertices and colors and a compression to a variation of the problem with $O(k)$ vertices and $O(k^2)$ colors.
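For concreteness, the two notions just defined (clique modulator and proper list coloring) can be spelled out as small Python checkers; the dict-of-sets graph representation and names below are illustrative assumptions, not the authors' formulation.

```python
from itertools import combinations

def is_clique_modulator(adj, D):
    """Check that D is a clique modulator: deleting D leaves a clique,
    i.e., every remaining pair of vertices is adjacent."""
    rest = [v for v in adj if v not in D]
    return all(u in adj[v] for u, v in combinations(rest, 2))

def is_proper_list_coloring(adj, L, coloring):
    """Check that `coloring` is proper (adjacent vertices get different
    colors) and respects the color lists L(v)."""
    return all(
        coloring[v] in L[v]
        and all(coloring[u] != coloring[v] for u in adj[v])
        for v in adj
    )

# Removing D = {0} from this graph leaves the triangle {1, 2, 3}.
adj = {0: {1}, 1: {0, 2, 3}, 2: {1, 3}, 3: {1, 2}}
assert is_clique_modulator(adj, {0})
assert is_proper_list_coloring(
    adj, {0: {1}, 1: {2}, 2: {1, 3}, 3: {1}}, {0: 1, 1: 2, 2: 3, 3: 1}
)
```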
Cloud Computing (CC) is a model for enabling on-demand access to a shared pool of configurable computing resources. Testing and evaluating the performance of cloud environments with respect to allocation, provisioning, scheduling, and data allocation policies has received considerable attention. Using a cloud simulator saves time and money and provides a flexible environment for evaluating new research work. Unfortunately, current simulators (e.g., CloudSim, NetworkCloudSim, GreenCloud, etc.) treat data only in terms of its size, without any consideration of data allocation policy or locality. On the other hand, NetworkCloudSim is one of the most commonly used simulators because it includes modules supporting the functions needed to simulate a cloud environment, and it can be extended with new modules. In this paper, the NetworkCloudSim simulator has been extended and modified to support data locality. The modified simulator is called LocalitySim. The accuracy of the proposed LocalitySim simulator has been validated against a mathematical model. The proposed simulator has also been used to test the performance of a three-tier data center as a case study, taking the data locality feature into account.
The \textit{$k$-weak-dynamic number} of a graph $G$ is the smallest number of colors we need to color the vertices of $G$ in such a way that each vertex $v$ of degree $d(v)$ sees at least $\min\{k, d(v)\}$ colors on its neighborhood. We use reducible configurations and list coloring of graphs to prove that all planar graphs have 3-weak-dynamic number at most 6.
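A short Python sketch of a checker for this condition (the graph representation and names are illustrative, not from the paper):

```python
def is_k_weak_dynamic(adj, color, k):
    """Check the k-weak-dynamic condition: every vertex v sees at
    least min(k, d(v)) distinct colors on its neighborhood."""
    return all(
        len({color[u] for u in adj[v]}) >= min(k, len(adj[v]))
        for v in adj
    )

# Star K_{1,3}: the center sees 3 distinct colors, each leaf sees 1,
# so this 4-coloring satisfies the 3-weak-dynamic condition.
star = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0]}
assert is_k_weak_dynamic(star, {0: 1, 1: 2, 2: 3, 3: 4}, k=3)
```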
SN 2001em is a peculiar supernova, originally classified as Type Ib/c. About two years after the SN it was detected in the radio, showing a rising radio flux with an optically thin spectral slope, and it also displayed a large X-ray luminosity (~10^{41} erg/s). Thus it was suspected to harbor a decelerating (by then, mildly) relativistic jet pointing away from us. About 3 years after its discovery the optical spectrum of SN 2001em showed a broad H-alpha line, and it was therefore reclassified as Type IIn. Here we constrain its proper motion and expansion velocity by analyzing four epochs of VLBI observations, extending out to 5.4 years after the SN. The supernova is still unresolved 5.4 years after the explosion. For the proper motion we obtain (23,000 +/- 30,000) km/s while our 2-sigma upper limit on the expansion velocity is 6000 km/s. These limits are somewhat tighter than those derived by Bietenholz & Bartel, and confirm their conclusion that late time emission from SN 2001em, a few years after the explosion, is not driven by a relativistic jet. VLA observations of the radio flux density, at 8.46 GHz, show a decay as t^{-1.23 +/- 0.40} starting ~2.7 years after the SN. Collectively, the observations suggest interaction of the SN ejecta with a very dense circumstellar medium, though the implied opacity constraints still present a challenge.