We consider the randomized decision tree complexity of the recursive 3-majority function. We prove a lower bound of $(1/2-\delta)\cdot 2.57143^h$ for the two-sided-error randomized decision tree complexity of evaluating height $h$ formulae with error $\delta \in [0,1/2)$. This improves the lower bound of $(1-2\delta)(7/3)^h$ given by Jayram, Kumar, and Sivakumar (STOC '03), and the one of $(1-2\delta)\cdot 2.55^h$ given by Leonardos (ICALP '13). Second, we improve the upper bound by giving a new zero-error randomized decision tree algorithm that has complexity at most $(1.007)\cdot 2.64944^h$. The previous best known algorithm achieved complexity $(1.004)\cdot 2.65622^h$. The new lower bound follows from a better analysis of the base case of the recursion of Jayram et al. The new algorithm uses a novel interleaving of two recursive algorithms.
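For background, the natural recursive strategy for evaluating a 3-majority formula queries two randomly chosen children and consults the third only when they disagree; the improved algorithms above refine this idea. The following Python sketch of that basic strategy assumes the formula is given as nested lists with 0/1 leaves; it is an illustration only, not the interleaved algorithm of this paper.

    import random

    def eval_3maj(node):
        # Basic randomized evaluation of a recursive 3-majority formula:
        # a node is either a leaf bit (0/1) or a list of three subformulae.
        if isinstance(node, int):          # reading a leaf costs one query
            return node
        a, b, c = random.sample(node, 3)   # visit the children in random order
        va, vb = eval_3maj(a), eval_3maj(b)
        if va == vb:                       # two equal values already form a majority
            return va
        return eval_3maj(c)                # they disagree, so the third child decides

    # Example: the height-1 formula MAJ(1, 0, 1) evaluates to 1.
    assert eval_3maj([1, 0, 1]) == 1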
An assignment of colours to the vertices of a graph is stable if any two vertices of the same colour have identically coloured neighbourhoods. The goal of colour refinement is to find a stable colouring that uses a minimum number of colours. This is
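To make the definition concrete: a stable colouring with the fewest colours can be obtained by the textbook refinement iteration, which starts with all vertices identically coloured and repeatedly splits colour classes by the multiset of neighbouring colours until nothing changes. The Python sketch below assumes the graph is given as a dictionary mapping each vertex to its set of neighbours; it only illustrates the naive procedure, not an efficient implementation.

    def colour_refinement(adj):
        # Naive colour refinement: start with one colour class and repeatedly
        # split classes by the multiset of neighbour colours until stable.
        colours = {v: 0 for v in adj}
        while True:
            signatures = {v: (colours[v], tuple(sorted(colours[u] for u in adj[v])))
                          for v in adj}
            palette = {sig: i for i, sig in enumerate(sorted(set(signatures.values())))}
            new_colours = {v: palette[signatures[v]] for v in adj}
            if new_colours == colours:     # no class was split: the colouring is stable
                return colours
            colours = new_colours

    # Example: a path on three vertices gets two colours (endpoints vs. middle).
    path = {0: {1}, 1: {0, 2}, 2: {1}}
    print(colour_refinement(path))         # {0: 0, 1: 1, 2: 0}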
Given $n$ colored balls, we want to detect if more than $\lfloor n/2\rfloor$ of them have the same color, and if so find one ball with such majority color. We are only allowed to choose two balls and compare their colors, and the goal is to minimize the number of comparisons.
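For intuition, a linear number of comparisons always suffices via the classic Boyer–Moore pairing strategy: find the only candidate that could be a majority colour, then verify it. The Python sketch below uses a hypothetical oracle same_colour(i, j) to stand in for one comparison; it illustrates this baseline, not the comparison-optimal procedure studied here.

    def majority_ball(n, same_colour):
        # Phase 1 (Boyer-Moore voting): keep a candidate ball and a counter;
        # only the majority colour, if one exists, can survive as the candidate.
        candidate, count = None, 0
        for i in range(n):
            if count == 0:
                candidate, count = i, 1
            elif same_colour(candidate, i):
                count += 1
            else:
                count -= 1
        # Phase 2: verify that the candidate's colour class exceeds n // 2.
        matches = sum(1 for i in range(n) if same_colour(candidate, i))
        return candidate if matches > n // 2 else None

    # Example with a hypothetical list of colours standing in for the balls.
    colours = ["red", "blue", "red", "red", "green", "red"]
    print(majority_ball(len(colours), lambda i, j: colours[i] == colours[j]))  # 2 (a red ball)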
We investigate the parameterized complexity in $a$ and $b$ of determining whether a graph $G$ has a subset of $a$ vertices and $b$ edges whose removal disconnects $G$, or disconnects two prescribed vertices $s, t \in V(G)$.
This paper initiates the study of I/O algorithms (minimizing cache misses) from the perspective of fine-grained complexity (conditional polynomial lower bounds). Specifically, we aim to answer why sparse graph problems are so hard, and why the Longes
We give tight cell-probe bounds for the time to compute convolution, multiplication and Hamming distance in a stream. The cell probe model is a particularly strong computational model and subsumes, for example, the popular word RAM model. We first