Bit retrieval is the problem of reconstructing a binary sequence from its periodic autocorrelation, with applications in cryptography and x-ray crystallography. After defining the problem, with and without noise, we describe and compare various algorithms for solving it. A geometrical constraint satisfaction algorithm, relaxed-reflect-reflect, is currently the best algorithm for noisy bit retrieval.
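As a minimal illustration of the forward map that bit retrieval inverts (a sketch under the common assumption that the binary sequence is encoded over $\pm 1$; the sequence used below is only an illustrative example), the periodic autocorrelation can be computed as follows.

# Periodic autocorrelation of a +/-1 sequence s of length n:
# A[k] = sum_i s[i] * s[(i+k) mod n]. Bit retrieval asks to recover s
# (up to trivial symmetries) from A, possibly corrupted by noise.
def periodic_autocorrelation(s):
    n = len(s)
    return [sum(s[i] * s[(i + k) % n] for i in range(n)) for k in range(n)]

s = [1, 1, 1, -1, 1, -1, -1]   # illustrative example sequence
print(periodic_autocorrelation(s))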
We consider the problem of scattering $n$ robots in a two-dimensional continuous space. As this problem cannot be solved deterministically, all solutions must be probabilistic. We investigate the amount of randomness (that is, the number of random bits used by the robots) that is required to achieve scattering. We first prove that $n \log n$ random bits are necessary to scatter $n$ robots in any setting. We also give a sufficient condition for a scattering algorithm to be random bit optimal. As previous solutions for scattering turn out to satisfy our condition, they are hence proved random bit optimal for the scattering problem. Then, we investigate the time complexity of scattering when strong multiplicity detection is not available. We prove that such algorithms cannot converge in constant time in the general case, nor in $o(\log \log n)$ rounds for random bit optimal scattering algorithms. However, we present a family of scattering algorithms that converge as fast as needed without using multiplicity detection. We also put forward a specific protocol of this family that is random bit optimal ($n \log n$ random bits are used) and time optimal ($\log \log n$ rounds are used). This improves the time complexity of previous results in the same setting by a $\log n$ factor. Aside from characterizing the random bit complexity of mobile robot scattering, our study also closes its time complexity gap with and without strong multiplicity detection (that is, $O(1)$ time complexity is only achievable when strong multiplicity detection is available, and it is possible to approach it as closely as needed otherwise).
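To give rough intuition for the $n \log n$ random-bit budget (a toy calculation only, not one of the scattering protocols discussed above; it ignores multiplicity detection and round structure entirely), one can check numerically that $\Theta(\log n)$ random bits per robot already avoid collisions with high probability:

import math
import random

# Toy illustration: n robots each draw about 4*log2(n) random bits to pick a
# cell of an n^2-by-n^2 grid. By a birthday bound, the probability that any
# two robots pick the same cell is at most roughly 1/(2*n^2), so Theta(n log n)
# random bits in total already scatter the robots with high probability.
def toy_scatter(n, trials=10_000):
    k = n * n
    bits_per_robot = 2 * math.ceil(math.log2(k))
    collisions = 0
    for _ in range(trials):
        cells = [(random.randrange(k), random.randrange(k)) for _ in range(n)]
        if len(set(cells)) < n:
            collisions += 1
    return collisions / trials, n * bits_per_robot

print(toy_scatter(16))  # small collision rate, 256 random bits in total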
We revisit the complexity of online computation in the cell probe model. We consider a class of problems where we are first given a fixed pattern or vector $F$ of $n$ symbols, and then one symbol arrives at a time in a stream. After each symbol has arrived we must output some function of $F$ and the $n$-length suffix of the arriving stream. Cell probe bounds of $\Omega(\delta \lg{n}/w)$ have previously been shown for both convolution and Hamming distance in this setting, where $\delta$ is the size of a symbol in bits and $w \in \Omega(\lg{n})$ is the cell size in bits. However, when $\delta$ is a constant, as it is in many natural situations, these previous results no longer give us non-trivial bounds. We introduce a new lop-sided information transfer proof technique which enables us to prove meaningful lower bounds even for constant-size input alphabets. We use our new framework to prove an amortised cell probe lower bound of $\Omega(\lg^2 n/(w \cdot \lg \lg n))$ time per arriving bit for an online version of a well-studied problem known as pattern matching with address errors. This is the first non-trivial cell probe lower bound for any online problem on bit streams that still holds when the cell sizes are large. We also show the same bound for online convolution, conditioned on a new combinatorial conjecture related to Toeplitz matrices.
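For concreteness, the online problem template above can be instantiated with convolution under one natural alignment convention (a naive $O(n)$-per-arrival sketch of the problem statement only; none of the lower-bound machinery is reflected here):

from collections import deque

# After each arriving symbol, output the inner product of the fixed vector F
# with the n most recent stream symbols (the stream is implicitly zero-padded
# before it starts).
def online_convolution(F, stream):
    n = len(F)
    window = deque([0] * n, maxlen=n)
    for symbol in stream:
        window.append(symbol)
        yield sum(f * x for f, x in zip(F, window))

print(list(online_convolution([1, 0, 1], [1, 1, 0, 1])))  # [1, 1, 1, 2]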
We initiate the study of computing (near-)optimal contracts in succinctly representable principal-agent settings. Here optimality means maximizing the principal's expected payoff over all incentive-compatible contracts, known in economics as second-best solutions. We also study a natural relaxation to approximately incentive-compatible contracts. We focus on principal-agent settings with succinctly described (and exponentially large) outcome spaces. We show that the computational complexity of computing a near-optimal contract depends fundamentally on the number of agent actions. For settings with a constant number of actions, we present a fully polynomial-time approximation scheme (FPTAS) for the separation oracle of the dual of the problem of minimizing the principal's payment to the agent, and use this subroutine to efficiently compute a $\delta$-incentive-compatible ($\delta$-IC) contract whose expected payoff matches or surpasses that of the optimal IC contract. With an arbitrary number of actions, we prove that the problem is hard to approximate within any constant $c$. This inapproximability result holds even for $\delta$-IC contracts where $\delta$ is a sufficiently rapidly decaying function of $c$. On the positive side, we show that simple linear $\delta$-IC contracts with constant $\delta$ are sufficient to achieve a constant-factor approximation of the first-best (full-welfare-extracting) solution, and that such a contract can be computed in polynomial time.
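A toy principal-agent instance (hypothetical numbers, not drawn from the paper) illustrates the objects involved: actions with costs and outcome distributions, a linear contract paying the agent a fraction alpha of the realized reward, the agent's best response, and the principal's resulting expected payoff.

# Hypothetical instance: 3 agent actions, 3 outcomes with rewards to the principal.
rewards = [0.0, 5.0, 10.0]
actions = [
    {"cost": 0.0, "dist": [0.8, 0.1, 0.1]},
    {"cost": 1.0, "dist": [0.3, 0.4, 0.3]},
    {"cost": 2.5, "dist": [0.1, 0.2, 0.7]},
]

def expected_reward(a):
    return sum(p * r for p, r in zip(a["dist"], rewards))

def agent_utility(a, alpha):
    # Linear contract: the agent receives a fraction alpha of the realized reward.
    return alpha * expected_reward(a) - a["cost"]

def principal_payoff(alpha):
    # The agent best-responds to the contract; the principal keeps the rest.
    best = max(actions, key=lambda a: agent_utility(a, alpha))
    return (1 - alpha) * expected_reward(best)

# Printed payoffs: 1.2 for alpha=0.2, 3.0 for alpha=0.4, 3.2 for alpha=0.6.
for alpha in (0.2, 0.4, 0.6):
    print(alpha, round(principal_payoff(alpha), 3))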
We consider the {\em vector partition problem}, where $n$ agents, each with a $d$-dimensional attribute vector, are to be partitioned into $p$ parts so as to minimize a cost that is a given function of the sums of the attribute vectors in each part. The problem has applications in a variety of areas including clustering, logistics, and health care. We consider the complexity and parameterized complexity of the problem under various assumptions on the natural parameters $p, d, a, t$ of the problem, where $a$ is the maximum absolute value of any attribute and $t$ is the number of agent types, and we raise some of the many remaining open problems.
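The following brute force (exponential in $n$, and intended only to make the problem definition concrete rather than to reflect any of the complexity results) enumerates all assignments of agents to parts and evaluates the given cost function on the part-sums.

from itertools import product

# Assign each of n agents (with d-dimensional attribute vectors) to one of p
# parts, minimizing a user-supplied cost function of the p part-sums.
def vector_partition_bruteforce(vectors, p, cost):
    n, d = len(vectors), len(vectors[0])
    best = None
    for assignment in product(range(p), repeat=n):
        sums = [[0] * d for _ in range(p)]
        for agent, part in enumerate(assignment):
            for j in range(d):
                sums[part][j] += vectors[agent][j]
        value = cost(sums)
        if best is None or value < best[0]:
            best = (value, assignment)
    return best

# Example: balance 2-dimensional loads across p = 2 parts by minimizing the
# sum of squared coordinates of the part-sums.
vectors = [(1, 2), (3, 1), (2, 2), (1, 1)]
print(vector_partition_bruteforce(vectors, 2,
                                  lambda sums: sum(x * x for s in sums for x in s)))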
We investigate the parameterized complexity, in $a$ and $b$, of determining whether a graph~$G$ has a set of $a$ vertices and $b$ edges whose removal disconnects $G$, or disconnects two prescribed vertices $s, t \in V(G)$.
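A brute-force check (exponential, and only meant to make the decision problem precise; it is not a parameterized algorithm) can be sketched as follows for the first variant; the variant with prescribed $s, t$ would instead test whether $s$ and $t$ end up in different components.

from itertools import combinations

# Does removing some a vertices and b edges disconnect G? The graph is a dict
# mapping each vertex to the set of its neighbours (undirected).
def is_connected(graph):
    vertices = list(graph)
    if len(vertices) <= 1:
        return True
    seen, stack = {vertices[0]}, [vertices[0]]
    while stack:
        for w in graph[stack.pop()]:
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return len(seen) == len(vertices)

def can_disconnect(graph, a, b):
    edges = {frozenset((u, v)) for u in graph for v in graph[u]}
    for vs in combinations(graph, a):
        removed_v = set(vs)
        for es in combinations(edges, b):
            removed_e = set(es)
            h = {u: {v for v in graph[u]
                     if v not in removed_v and frozenset((u, v)) not in removed_e}
                 for u in graph if u not in removed_v}
            if not is_connected(h):
                return True
    return False

# Example: a 4-cycle can be disconnected by deleting one vertex and one edge.
cycle = {1: {2, 4}, 2: {1, 3}, 3: {2, 4}, 4: {1, 3}}
print(can_disconnect(cycle, 1, 1))  # True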