
Deterministic Randomness Extraction from Generalized and Distributed Santha-Vazirani Sources

 Added by Omid Etesami
 Publication date 2014
Research language: English





A Santha-Vazirani (SV) source is a sequence of random bits where the conditional distribution of each bit, given the previous bits, can be partially controlled by an adversary. Santha and Vazirani showed that deterministic randomness extraction from these sources is impossible. In this paper, we study the generalization of SV sources to non-binary sequences. We show that, unlike the binary case, deterministic randomness extraction in the generalized case is sometimes possible. We present a necessary condition and a sufficient condition for the possibility of deterministic randomness extraction. These two conditions coincide in non-degenerate cases. Next, we turn to a distributed setting. In this setting the SV source consists of a random sequence of pairs $(a_1, b_1), (a_2, b_2), \ldots$ distributed between two parties, where the first party receives the $a_i$'s and the second one receives the $b_i$'s. The goal of the two parties is to extract common randomness without communication. Using the notion of maximal correlation, we prove a necessary condition and a sufficient condition for the possibility of common randomness extraction from these sources. Based on these two conditions, the problem of common randomness extraction essentially reduces to the problem of randomness extraction from (non-distributed) SV sources. This result generalizes results of Gács and Körner, and of Witsenhausen, about common randomness extraction from i.i.d. sources to adversarial sources.
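
For readers less familiar with the two notions the abstract relies on, the standard definitions can be stated as follows (the notation is ours and is given as background, not as the paper's generalized formulation). A bias-$\delta$ binary SV source is a sequence of bits $X_1, X_2, \ldots$ such that for every $i$ and every prefix $x_1, \ldots, x_{i-1}$,
$$ \tfrac{1}{2} - \delta \;\le\; \Pr[X_i = 1 \mid X_1 = x_1, \ldots, X_{i-1} = x_{i-1}] \;\le\; \tfrac{1}{2} + \delta, $$
and the (Hirschfeld-Gebelein-Rényi) maximal correlation of a pair of random variables $(A, B)$ is
$$ \rho_m(A; B) \;=\; \sup \big\{ \mathbb{E}[f(A)\,g(B)] \;:\; \mathbb{E}[f(A)] = \mathbb{E}[g(B)] = 0,\ \mathbb{E}[f(A)^2] = \mathbb{E}[g(B)^2] = 1 \big\}. $$
A classical fact due to Witsenhausen is that $\rho_m(A; B) = 1$ exactly when there exist non-constant functions $f, g$ with $f(A) = g(B)$ almost surely, i.e., when the pair already contains a common random "bit"; this is why maximal correlation is the natural yardstick for common randomness extraction.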



Related research


The notion of semi-random sources, also known as Santha-Vazirani (SV) sources, stands for a sequence of $n$ bits where the dependence of the $i$th bit on the previous $i-1$ bits is limited for every $i \in [n]$. If the dependence of the $i$th bit on the remaining $n-1$ bits is limited, then this is a strong SV-source. Even strong SV-sources are known not to admit (universal) deterministic extractors, but they have seeded extractors, as their min-entropy is $\Omega(n)$. It is intuitively obvious that strong SV-sources are more than just high-min-entropy sources, and this work explores that intuition. Deterministic condensers are known not to exist for general high-min-entropy sources; we construct, for any constants $\epsilon, \delta \in (0,1)$, a deterministic condenser that maps $n$ bits coming from a strong SV-source with bias at most $\delta$ to $\Omega(n)$ bits of min-entropy rate at least $1-\epsilon$. In conclusion we observe that deterministic condensers are closely related to very strong extractors - a proposed strengthening of the notion of strong (seeded) extractors: in particular, our constructions can be viewed as very strong extractors for the family of strong Santha-Vazirani distributions. The notion of very strong extractors requires that the output remain unpredictable even to someone who knows not only the seed value (as in the case of strong extractors), but also the extractor's outputs corresponding to the same input value with each of the preceding seed values (say, under the lexicographic ordering). Very strong extractors closely resemble the original notion of SV-sources, except that the bits must satisfy the unpredictability requirement only on average.
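For reference, the min-entropy of a random variable $X$ is defined (standard notation, not quoted from the work above) as
$$ H_\infty(X) \;=\; -\log_2 \max_x \Pr[X = x], $$
and the min-entropy rate of an $n$-bit source is $H_\infty(X)/n$. The $\Omega(n)$ claim follows because in a bias-$\delta$ SV-source each bit, conditioned on the preceding ones, takes either value with probability at most $\tfrac12 + \delta$, so every string $x \in \{0,1\}^n$ has probability at most $(\tfrac12 + \delta)^n$, and hence $H_\infty(X) \ge n \log_2 \frac{1}{1/2 + \delta}$, which is linear in $n$ for any constant $\delta < 1/2$. A condenser, unlike an extractor, is only required to raise this min-entropy rate, not to produce output close to uniform.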
The communication complexity of many fundamental problems reduces greatly when the communicating parties share randomness that is independent of the inputs to the communication task. Natural communication processes (say, between humans), however, often involve large amounts of shared correlations among the communicating players, but rarely allow for perfect sharing of randomness. Can communication complexity benefit from shared correlations as well as it does from shared randomness? This question was considered mainly in the context of simultaneous communication by Bavarian et al. (ICALP 2014). In this work we study this problem in the standard interactive setting and give some general results. In particular, we show that every problem with communication complexity of $k$ bits with perfectly shared randomness has a protocol using imperfectly shared randomness with complexity $\exp(k)$ bits. We also show that this is best possible by exhibiting a promise problem with complexity $k$ bits under perfectly shared randomness which requires $\exp(k)$ bits when the randomness is imperfectly shared. Along the way we also highlight some other basic problems, such as compression and agreement distillation, where shared randomness plays a central role, and analyze the complexity of these problems in the imperfectly shared randomness model. The technical highlight of this work is the lower bound that goes into the result showing the tightness of our general connection. This result builds on the intuition that communication with imperfectly shared randomness needs to be less sensitive to its random inputs than communication with perfectly shared randomness. The formal proof invokes results about the small-set expansion of the noisy hypercube and an invariance principle to convert this intuition into a proof, thus giving a new application domain for these fundamental results.
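For context on the model, one standard formalization of imperfectly shared randomness in this line of work (stated here as background, not quoted from the paper) gives Alice a string $r$ and Bob a string $r'$ whose coordinate pairs $(r_i, r'_i)$ are i.i.d., with each $r_i$ a uniform bit and $\Pr[r'_i = r_i] = \frac{1+\rho}{2}$ for a correlation parameter $\rho \in [0,1]$; $\rho = 1$ recovers perfectly shared randomness, while $\rho = 0$ leaves the parties with independent private coins.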
Two parties wish to carry out certain distributed computational tasks, and they are given access to a source of correlated random bits. It allows the parties to act in a correlated manner, which can be quite useful. But what happens if the shared randomness is not perfect? In this work, we initiate the study of the power of different sources of shared randomness in communication complexity. This is done in the setting of simultaneous message passing (SMP) model of communication complexity, which is one of the most suitable models for studying the resource of shared randomness. Toward characterising the power of various sources of shared randomness, we introduce a measure for the quality of a source - we call it collision complexity. Our results show that the collision complexity tightly characterises the power of a (shared) randomness resource in the SMP model. Of independent interest is our demonstration that even the weakest sources of shared randomness can in some cases increase the power of SMP substantially: the equality function can be solved very efficiently with virtually any nontrivial shared randomness.
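To make the last observation concrete, below is a minimal Python sketch of the textbook SMP protocol for the equality function with perfectly shared randomness; the function name and parameters are illustrative, and this is standard background rather than the collision-complexity machinery or the weak-source protocols studied in the paper.

import random

def smp_equality(x: str, y: str, repetitions: int = 20) -> bool:
    """Textbook SMP protocol for EQUALITY with perfectly shared randomness.

    Alice holds x and Bob holds y (equal-length bit strings).  In each round
    they read a fresh shared random string r; Alice sends the referee the
    single bit <x, r> mod 2 and Bob sends <y, r> mod 2.  The referee accepts
    iff the two bits agree in every round.
    """
    n = len(x)
    for _ in range(repetitions):
        r = [random.randint(0, 1) for _ in range(n)]   # the shared randomness
        alice_bit = sum(int(xi) & ri for xi, ri in zip(x, r)) % 2
        bob_bit = sum(int(yi) & ri for yi, ri in zip(y, r)) % 2
        if alice_bit != bob_bit:
            return False   # a disagreement proves x != y
    # If x != y, each round detects the difference with probability 1/2,
    # so the protocol errs with probability at most 2 ** -repetitions.
    return True

# Equal inputs are always accepted; unequal inputs are rejected w.h.p.
print(smp_equality("101100", "101100"))   # True
print(smp_equality("101100", "101101"))   # False except with prob. 2**-20

Each party sends only `repetitions` bits to the referee, independent of the input length $n$; the abstract's point is that a large part of this advantage for equality survives even with very weak shared-randomness sources.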
Algorithmic randomness theory starts with a notion of an individual random object. To be reasonable, this notion should have some natural properties; in particular, an object should be random with respect to the image distribution if and only if it has a random preimage. This result (for computable distributions and mappings, and Martin-Löf randomness) has been known for a long time (folklore); in this paper we prove its natural generalization for layerwise computable mappings, and discuss the related quantitative results.
Do completely unpredictable events exist in nature? Classical theory, being fully deterministic, completely excludes fundamental randomness. On the contrary, quantum theory allows for randomness within its axiomatic structure. Yet the fact that a theory makes predictions only in probabilistic terms does not imply the existence of any form of randomness in nature. The question then remains whether one can certify randomness independently of the physical framework used. While standard Bell tests approach this question from this perspective, they require prior perfect randomness, which renders the approach circular. Recently, it has been shown that it is possible to certify full randomness using almost perfect random bits. Here, we prove that full randomness can indeed be certified using quantum non-locality under the minimal possible assumptions: the existence of a source of arbitrarily weak (but non-zero) randomness and the impossibility of instantaneous signalling. Thus we are left with a strict dichotomic choice: either our world is fully deterministic or there exist in nature events that are fully random. Apart from the foundational implications, our results represent a quantum protocol for full randomness amplification, an information task known to be impossible classically. Finally, they open a new path for device-independent protocols under minimal assumptions.
