In this paper we introduce a design for an optical topological cluster state computer constructed exclusively from a single quantum component. Unlike previous efforts, we eliminate the need for on-demand, high-fidelity photon sources and detectors and replace them with the same device utilised to create photon-photon entanglement. This introduces highly probabilistic elements into the optical architecture while maintaining a completely specified structure and mode of operation for a large-scale computer. Photons in this system are continually recycled back into the preparation network, allowing an arbitrarily deep 3D cluster to be prepared using a comparatively small number of photonic qubits and, consequently, eliminating the need for high-frequency, deterministic photon sources.
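As a toy illustration of the resource argument (all parameters here are hypothetical, not taken from the paper): because photons that fail a probabilistic entangling attempt are recycled rather than consumed, the achievable cluster depth grows with the number of cycles, not with the size of the photon pool.

```python
import random

# Toy Monte Carlo: a fixed pool of photons is cycled through a probabilistic
# entangling device; failures re-enter the pool, successes extend the cluster.
# pool_size, p_success, and per_layer are illustrative values only.
def layers_prepared(pool_size=100, p_success=0.25, cycles=10_000, per_layer=50):
    successes = 0
    for _ in range(cycles):
        # Every photon attempts an entangling operation this cycle; failed
        # photons are recycled, so the pool is never depleted.
        successes += sum(random.random() < p_success for _ in range(pool_size))
    return successes // per_layer  # completed cluster layers

random.seed(1)
print(layers_prepared())  # depth scales with cycles, not with photon count
```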
Performing efficient quantum computer tuneup and calibration is essential for growth in system complexity. In this work we explore the link between facilitating such capabilities and the underlying architecture of the physical hardware. We focus on the specific challenge of measuring (``mapping'') spatially inhomogeneous quasi-static calibration errors using spectator qubits dedicated to the task of sensing and calibration. We introduce a novel architectural concept for such spectator qubits: arranging them spatially according to prescriptions from optimal 2D approximation theory. We show that this insight allows for efficient reconstruction of inhomogeneities in qubit calibration, focusing on the specific example of frequency errors, which may arise from fabrication variances or ambient magnetic fields. Our results demonstrate that optimal interpolation techniques display near-optimal error scaling in cases where the measured characteristic (here the qubit frequency) varies smoothly, and we probe the limits of these benefits as a function of measurement uncertainty. For more complex spatial variations, we demonstrate that the NMQA formalism for adaptive measurement and noise filtering outperforms optimal interpolation techniques in isolation and, crucially, can be combined with insights from optimal interpolation theory to produce a general-purpose protocol.
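A minimal sketch of the node-placement idea, assuming a smooth, hypothetical frequency field and Chebyshev nodes as one standard optimal-approximation prescription (the paper's actual protocol, including the NMQA filter, is more involved): spectator qubits placed on a 2D Chebyshev grid allow a polynomial reconstruction of the field from noisy measurements.

```python
import numpy as np

# Hypothetical smooth qubit-frequency field over a chip mapped to [-1, 1]^2.
def freq_field(x, y):
    return 5.0 + 0.02 * np.sin(2 * x) * np.cos(3 * y)  # GHz, illustrative

n = 8
cheb = np.cos((2 * np.arange(n) + 1) * np.pi / (2 * n))  # Chebyshev nodes
X, Y = np.meshgrid(cheb, cheb)  # spectator-qubit positions

# Noisy spectator measurements at the optimal node locations.
rng = np.random.default_rng(0)
meas = freq_field(X, Y) + rng.normal(0, 1e-4, X.shape)

# Fit a 2D Chebyshev polynomial to the measurements.
V = np.polynomial.chebyshev.chebvander2d(X.ravel(), Y.ravel(), [n - 1, n - 1])
coef, *_ = np.linalg.lstsq(V, meas.ravel(), rcond=None)
coef = coef.reshape(n, n)

# Evaluate the reconstruction error on a dense test grid.
t = np.linspace(-1, 1, 101)
TX, TY = np.meshgrid(t, t)
recon = np.polynomial.chebyshev.chebval2d(TX, TY, coef)
print(f"max error: {np.max(np.abs(recon - freq_field(TX, TY))):.2e} GHz")
```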
We introduce the concept of hypergraphs to describe quantum optical experiments with probabilistic multi-photon sources. Every hyperedge represents a correlated photon source, and every vertex stands for an optical output path. Such a general graph description provides new insights for producing complex high-dimensional multi-photon quantum entangled states, which go beyond the limitations imposed by pair creation via spontaneous parametric down-conversion. Furthermore, properties of hypergraphs can be investigated experimentally. For example, the NP-complete problem of deciding whether a hypergraph has a perfect matching can be answered by experimentally detecting multi-photon events in quantum experiments. By introducing complex weights in hypergraphs, we show how general many-particle quantum interference and the manipulation of entanglement can be described in a pictorial way. Our work paves the way for the development of multi-photon high-dimensional state generation and might inspire new applications of quantum computation using hypergraph mappings.
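To make the matching connection concrete, the following brute-force check (exponential time, consistent with the problem being NP-complete) decides whether a small hypergraph admits a perfect matching; vertices play the role of output paths and hyperedges the role of correlated photon sources. The toy instance is illustrative, not an example from the paper.

```python
from itertools import combinations

def has_perfect_matching(vertices, hyperedges):
    """Does some subset of hyperedges partition the vertex set?

    In the optical picture, a perfect matching corresponds to a detected
    multi-photon event with exactly one photon in every output path.
    """
    vset = set(vertices)
    for k in range(1, len(hyperedges) + 1):
        for subset in combinations(hyperedges, k):
            covered = [v for edge in subset for v in edge]
            # A perfect matching covers every vertex exactly once.
            if len(covered) == len(vset) and set(covered) == vset:
                return True
    return False

paths = ["a", "b", "c", "d"]
sources = [("a", "b"), ("c", "d"), ("a", "c", "d")]
print(has_perfect_matching(paths, sources))  # True: ("a","b") + ("c","d")
```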
An ideal source of entangled photon pairs combines the perfect symmetry of an atom with the convenient electrical trigger of light sources based on semiconductor quantum dots. We create a naturally symmetric quantum dot cascade that emits highly entangled photon pairs on demand. Our source consists of strain-free GaAs dots self-assembled on a triangular-symmetric (111)A surface. The emitted photons strongly violate Bell's inequality and reveal a fidelity to the Bell state as high as (86 ± 2)% without postselection. This result is an important step towards scalable quantum-communication applications with efficient sources.
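For context on how such a fidelity translates into a Bell violation: assuming, as a back-of-envelope estimate rather than the paper's actual analysis, a Werner-like state with fidelity F to a Bell state, the maximal CHSH parameter is S = 2√2 (4F − 1)/3, which exceeds the local-realism bound of 2 whenever F ≳ 0.78.

```python
import math

# Back-of-envelope CHSH value implied by Bell-state fidelity F, under the
# simplifying Werner-state assumption (illustrative, not the paper's analysis).
def chsh_from_fidelity(F):
    visibility = (4 * F - 1) / 3          # Werner-state visibility
    return 2 * math.sqrt(2) * visibility  # maximal CHSH parameter S

print(f"S = {chsh_from_fidelity(0.86):.2f}  (local realism bound: 2)")
```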
A research frontier has emerged in scientific computation, wherein numerical error is regarded as a source of epistemic uncertainty that can be modelled. This raises several statistical challenges, including the design of statistical methods that enable the coherent propagation of probabilities through a (possibly deterministic) computational work-flow. This paper examines the case for probabilistic numerical methods in routine statistical computation. Our focus is on numerical integration, where a probabilistic integrator is equipped with a full distribution over its output that reflects the presence of an unknown numerical error. Our main technical contribution is to establish, for the first time, rates of posterior contraction for these methods. These show that probabilistic integrators can in principle enjoy the best of both worlds, leveraging the sampling efficiency of Monte Carlo methods whilst providing a principled route to assess the impact of numerical error on scientific conclusions. Several substantial applications are provided for illustration and critical evaluation, including examples from statistical modelling, computer graphics and a computer model for an oil reservoir.
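A minimal sketch of one such probabilistic integrator, Bayesian quadrature: a Gaussian-process prior with an RBF kernel and the uniform measure on [0, 1] yields a full posterior distribution over the integral's value, with the posterior variance modelling the unknown numerical error. The kernel, length scale, and test integrand below are illustrative choices, not those of the paper.

```python
import numpy as np
from scipy.special import erf

def rbf(a, b, ell):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

def kernel_mean(x, ell):
    # Closed form for \int_0^1 k(x, t) dt with the RBF kernel.
    return ell * np.sqrt(np.pi / 2) * (
        erf((1 - x) / (np.sqrt(2) * ell)) + erf(x / (np.sqrt(2) * ell))
    )

def bayesian_quadrature(f, x, ell=0.2, jitter=1e-10):
    K = rbf(x, x, ell) + jitter * np.eye(len(x))
    z = kernel_mean(x, ell)
    w = np.linalg.solve(K, z)
    t = np.linspace(0.0, 1.0, 5001)
    k_double = kernel_mean(t, ell).mean()  # \int\int k(s,t) ds dt, numerically
    mean = w @ f(x)
    var = k_double - z @ w  # posterior variance = modelled numerical error
    return mean, var

f = lambda s: np.sin(3 * s) + s**2
mean, var = bayesian_quadrature(f, np.linspace(0.05, 0.95, 10))
print(f"integral ~ {mean:.6f} +/- {np.sqrt(var):.1e}")
```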
A new approach to efficient quantum computation with probabilistic gates is proposed and analyzed in both local and non-local settings. It combines heralded gates previously studied for atomic or atom-like qubits with logical encoding from linear optical quantum computation in order to perform high-fidelity quantum gates across a quantum network. The error-detecting properties of the heralded operations ensure high fidelity, while the encoding makes it possible to correct for failed attempts so that deterministic, high-quality gates can be achieved. Importantly, this approach is robust to photon loss, which is typically the main obstacle to photonics-based quantum information processing. Overall, this approach opens a novel path towards quantum networks with atomic nodes and photonic links.
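A minimal sketch of the repeat-until-success logic implicit here (the per-attempt success probability and retry model are illustrative assumptions, not the paper's): because failures are heralded rather than silent, the encoding can absorb a failed attempt and simply retry, so the residual failure probability falls exponentially with the attempt budget.

```python
# Heralded gate succeeds with probability p per attempt; a heralded failure is
# flagged and retried. p = 0.5 is an illustrative value only.
def residual_failure(attempts, p_success):
    return (1 - p_success) ** attempts  # chance every attempt is flagged failed

p = 0.5
for n in (1, 5, 10, 20):
    print(f"{n:2d} attempts: residual failure {residual_failure(n, p):.2e}")
```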