We develop the theory of algorithmic randomness for the space $A^G$, where $A$ is a finite alphabet and $G$ is a computable amenable group. We give an effective version of the Shannon-McMillan-Breiman theorem in this setting. We also extend a result of Simpson equating topological entropy and Hausdorff dimension. The proof makes use of work of Ornstein and Weiss, which we also present.
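For orientation, the following is a sketch of the classical (non-effective) theorem being effectivized, stated under the standard assumptions of an ergodic shift-invariant measure $\mu$ on $A^G$ and a tempered Følner sequence $(F_n)$ in $G$; this formulation follows the Ornstein-Weiss line of work and is not quoted from the paper:
\[
  \lim_{n \to \infty} -\frac{1}{|F_n|} \log \mu\bigl([x|_{F_n}]\bigr) = h(\mu)
  \qquad \text{for } \mu\text{-a.e. } x \in A^G,
\]
where $[x|_{F_n}]$ is the cylinder set of configurations agreeing with $x$ on $F_n$ and $h(\mu)$ is the entropy of $\mu$. An effective version, roughly, asserts this convergence at every algorithmically (Martin-Löf) random point rather than merely almost everywhere.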
The objective of this study is a better understanding of the relationships between reduction and continuity. Solovay reduction is a variation of Turing reduction based on the distance between two real numbers. We characterize Solovay reduction by the existence of a certain real function that is computable (in the sense of computable analysis) and Lipschitz continuous. We then ask whether there exists a reducibility concept that corresponds to Hölder continuity; the answer is affirmative. We introduce quasi Solovay reduction and characterize this new reduction via Hölder continuity. In addition, we separate it from Solovay reduction and Turing reduction, and investigate the relationships between its complete sets and partial randomness.
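For reference, here is the classical definition of Solovay reducibility on left-c.e. reals, which the function characterizations above refine (a standard formulation, not quoted from the paper): $\alpha \le_S \beta$ iff there are a constant $c \in \mathbb{N}$ and a partial computable $f : \mathbb{Q} \to \mathbb{Q}$ such that
\[
  f(q) < \alpha \quad \text{and} \quad \alpha - f(q) \le c\,(\beta - q)
  \qquad \text{for every rational } q < \beta .
\]
The Hölder condition generalizing the Lipschitz case is $|g(x) - g(y)| \le C\,|x - y|^{\gamma}$ for some $C > 0$ and exponent $0 < \gamma \le 1$; taking $\gamma = 1$ recovers Lipschitz continuity, matching the characterization of Solovay reduction.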
We describe a new data structure for dynamic nearest neighbor queries in the plane with respect to a general family of distance functions, including $L_p$-norms and additively weighted Euclidean distances. Our data structure supports general (convex, pairwise disjoint) sites of constant description complexity (e.g., points, line segments, disks). It uses $O(n \log^3 n)$ storage and supports updates and queries in polylogarithmic time, improving an earlier data structure of Agarwal, Efrat and Sharir [SICOMP, 1999] that required $O(n^{\varepsilon})$ time for an update and $O(\log n)$ time for a query. Our data structure has numerous applications; in all of them, it gives faster algorithms, typically reducing an $O(n^{\varepsilon})$ factor in the previous bounds to polylogarithmic. In addition, we give two new applications: an efficient construction of a spanner in a disk intersection graph, and a data structure for efficient connectivity queries in a dynamic disk graph.
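The abstract does not spell out the structure's internals, so the sketch below is only a naive linear-scan baseline in Python that fixes the query semantics for one of the supported distance functions (additively weighted Euclidean distance on point sites); all names are illustrative, and the structure described above replaces the $O(n)$ query here with polylogarithmic update and query time.

import math
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class Site:
    """A point site with an additive weight; w = 0 gives plain Euclidean distance."""
    x: float
    y: float
    w: float = 0.0

class NaiveDynamicNN:
    """Linear-scan baseline for dynamic additively weighted nearest-neighbor
    queries: O(1) update and O(n) query, versus polylogarithmic update/query
    with O(n log^3 n) storage in the data structure described above."""

    def __init__(self) -> None:
        self._sites: set[Site] = set()

    def insert(self, s: Site) -> None:
        self._sites.add(s)

    def delete(self, s: Site) -> None:
        self._sites.discard(s)

    def query(self, qx: float, qy: float) -> Optional[Site]:
        # d(q, s) = ||q - (s.x, s.y)|| + s.w
        return min(self._sites,
                   key=lambda s: math.hypot(qx - s.x, qy - s.y) + s.w,
                   default=None)

nn = NaiveDynamicNN()
nn.insert(Site(0.0, 0.0, w=10.0))
nn.insert(Site(3.0, 4.0))
print(nn.query(0.0, 0.0))  # Site(x=3.0, y=4.0, w=0.0): distance 5.0 beats 0.0 + 10.0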
The advantages of quantum random number generators (QRNGs) over pseudo-random number generators (PRNGs) are normally attributed to the nature of quantum measurements. This is often taken to imply the superiority of the bit sequences generated by QRNGs themselves, despite the absence of empirical tests supporting this. Nonetheless, one may expect sequences of bits generated by QRNGs to have properties that pseudo-random sequences do not; indeed, pseudo-random sequences are necessarily computable, a highly atypical property of sequences. In this paper, we discuss the differences between QRNGs and PRNGs and the challenges involved in certifying the quality of QRNGs theoretically and testing their output experimentally. While QRNGs are often tested with standard suites of statistical tests, such suites are designed for PRNGs: they verify statistical properties of a QRNG's output but are insensitive to many of the supposed advantages of QRNGs. We discuss the possibility of testing the incomputability and algorithmic complexity of QRNG output. While such properties cannot be directly verified with certainty, we show how one can construct indirect tests that may provide evidence for the incomputability of QRNGs. We use these tests to compare various PRNGs to a QRNG based on superconducting transmon qutrits and certified by the Kochen-Specker theorem, to see whether such evidence can be found in practice. While our tests fail to observe a strong advantage of the quantum random sequences due to algorithmic properties, the results are nonetheless informative: some of the test results are ambiguous and require further study, while others highlight difficulties that can guide the development of future tests of algorithmic randomness and incomputability.
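Since the abstract does not enumerate its concrete tests, the Python snippet below shows only one standard indirect proxy, not the paper's method: no computable procedure can certify incomputability, but one can check incompressibility under a fixed compressor, a necessary (though far from sufficient) symptom of high algorithmic complexity. The QRNG file path is hypothetical.

import random
import zlib

def compression_ratio(data: bytes) -> float:
    """Compressed size over original size; ratios near 1.0 mean the
    compressor found no structure to exploit."""
    if not data:
        raise ValueError("empty input")
    return len(zlib.compress(data, 9)) / len(data)

# Mersenne Twister output: computable, hence non-random in the algorithmic
# sense, yet it still looks incompressible to a generic compressor.
prng_bytes = random.Random(42).randbytes(1_000_000)
print(f"PRNG ratio: {compression_ratio(prng_bytes):.4f}")

# For a QRNG, read recorded device output from disk (hypothetical path):
# qrng_bytes = open("qrng_output.bin", "rb").read()
# print(f"QRNG ratio: {compression_ratio(qrng_bytes):.4f}")

A good PRNG passes this check just as a QRNG does, which is precisely the difficulty the paper highlights: compressor-based proxies cannot separate computable pseudo-randomness from genuine incomputability, so they supply only weak indirect evidence.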