The hopes for scalable quantum computing rely on the threshold theorem: once the error per qubit per gate is below a certain value, the methods of quantum error correction allow indefinitely long quantum computations. The proof is based on a number of assumptions, which are supposed to be satisfied exactly, like axioms (e.g., zero undesired interactions between qubits). However, in the physical world no continuous quantity can be exactly zero; it can only be more or less small. Thus the error-per-qubit-per-gate threshold must be complemented by the required precision with which each assumption should be fulfilled. This issue has never been addressed. In the absence of this crucial information, the prospects of scalable quantum computing remain uncertain.
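To make the threshold claim concrete, the textbook scaling for concatenated codes can be sketched as follows; the recursion and the numerical values are illustrative assumptions, not figures from the abstract above.

    # Illustrative sketch of threshold-theorem scaling for concatenated codes
    # (standard textbook recursion; the numbers are assumptions, not from the text).
    # A level-(k+1) block fails roughly when two of its level-k sub-blocks fail:
    #     p_{k+1} ~ p_k**2 / p_th,   i.e.   p_k = p_th * (p0 / p_th)**(2**k)
    p_th = 1e-4  # assumed threshold error per qubit per gate

    for p0 in (5e-5, 2e-4):  # one physical error rate below threshold, one above
        p = p0
        for k in range(1, 6):
            p = p ** 2 / p_th
            print(f"p0={p0:.0e}, level {k}: logical error ~ {p:.3e}")

Below threshold the logical error is suppressed doubly exponentially in the concatenation level; above it, the recursion diverges. That dichotomy is the content of the theorem whose assumptions the abstract questions.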
In topological quantum computation, quantum information is stored in states which are intrinsically protected from decoherence, and quantum gates are carried out by dragging particle-like excitations (quasiparticles) around one another in two space dimensions.
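As a concrete, standard instance of gates-by-braiding, the two braid generators for three Fibonacci anyons can be written down explicitly. The following numerical sketch uses the conventional F and R matrices; it is a generic illustration, not the specific proposal of the abstract.

    # Minimal sketch: braid generators for three Fibonacci anyons
    # (standard F and R matrices; an illustration of gates-by-braiding).
    import numpy as np

    phi = (1 + np.sqrt(5)) / 2  # golden ratio
    F = np.array([[1 / phi, 1 / np.sqrt(phi)],
                  [1 / np.sqrt(phi), -1 / phi]])   # fusion-basis change; F is its own inverse
    R = np.diag([np.exp(-4j * np.pi / 5),
                 np.exp(3j * np.pi / 5)])          # exchange phases of the two fusion channels

    sigma1 = R           # braid (exchange) anyons 1 and 2
    sigma2 = F @ R @ F   # braid anyons 2 and 3

    print(np.allclose(sigma1 @ sigma1.conj().T, np.eye(2)))  # True: braids act as unitary gates
    print(np.allclose(sigma1 @ sigma2, sigma2 @ sigma1))     # False: braiding is non-abelian
    print(np.allclose(sigma1 @ sigma2 @ sigma1,
                      sigma2 @ sigma1 @ sigma2))             # True: braid-group relation holds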
The primary resource for quantum computation is Hilbert-space dimension. Whereas Hilbert space itself is an abstract construction, the number of dimensions available to a system is a physical quantity that requires physical resources. Avoiding a demand for an exponential amount of these resources places strong constraints on quantum computer architectures.
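The scaling behind this claim is simple to state: n two-level subsystems supply dimension 2^n with hardware that grows only linearly in n, whereas encoding the same dimension in a single system (e.g., in oscillator levels) demands exponentially many distinguishable states. A toy comparison, with illustrative numbers:

    # Toy comparison of how Hilbert-space dimension scales with hardware
    # (illustrative numbers; the general argument, not this paper's specific model).
    for n in (10, 20, 40):
        dim = 2 ** n
        print(f"{n} qubits: dimension {dim:.2e} from ~{n} subsystems; "
              f"a single-system encoding needs {dim:.2e} distinguishable levels")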
We introduce the concept of embedding quantum simulators, a paradigm allowing the efficient quantum computation of a class of bipartite and multipartite entanglement monotones. It consists in the suitable encoding of a simulated quantum dynamics in the enlarged Hilbert space of an embedding quantum simulator.
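The mechanism can be illustrated numerically. In the standard embedding, a state psi = r + i*s (r, s real) is encoded as Psi = |0>⊗r + |1>⊗s in a space enlarged by one ancilla qubit; antilinear quantities such as the pure-state concurrence C = |psi^T (sigma_y⊗sigma_y) psi| then become combinations of ordinary expectation values. A minimal sketch, assuming this encoding:

    # Minimal sketch of an embedding quantum simulator for the two-qubit
    # concurrence (assumes the standard real/imaginary-part encoding).
    import numpy as np

    rng = np.random.default_rng(1)
    psi = rng.normal(size=4) + 1j * rng.normal(size=4)  # random two-qubit pure state
    psi /= np.linalg.norm(psi)

    sy = np.array([[0, -1j], [1j, 0]])
    M = np.real(np.kron(sy, sy))         # sigma_y ⊗ sigma_y: real and symmetric

    C_direct = abs(psi @ M @ psi)        # concurrence |psi^T M psi| (antilinear in psi)

    # Embedded state: the ancilla tags the real and imaginary parts of psi.
    Psi = np.concatenate([psi.real, psi.imag])
    sz = np.diag([1.0, -1.0])
    sx = np.array([[0.0, 1.0], [1.0, 0.0]])
    re = Psi @ np.kron(sz, M) @ Psi      # <Psi| sz⊗M |Psi> = Re(psi^T M psi)
    im = Psi @ np.kron(sx, M) @ Psi      # <Psi| sx⊗M |Psi> = Im(psi^T M psi)

    print(C_direct, abs(re + 1j * im))   # the two values agree

The antilinear quantity, inaccessible as a single observable on the original system, is recovered from two expectation values on the embedding simulator, which is the tomography-avoiding point of the construction.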
The digital quantum computing paradigm offers highly desirable features such as universality, scalability, and quantum error correction. However, physical resource requirements to implement useful error-corrected quantum algorithms are prohibitive in the current era of noisy intermediate-scale quantum (NISQ) devices.
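The scale of those resource requirements can be conveyed with commonly quoted surface-code rules of thumb; the formulas and numbers below are standard illustrative estimates, not figures from the abstract.

    # Rough surface-code overhead estimate (commonly quoted rules of thumb;
    # all numbers are assumptions, not taken from the text above).
    # Logical error per round ~ 0.1 * (p / p_th)**((d + 1) / 2); roughly 2*d^2
    # physical qubits per logical qubit at code distance d.
    p, p_th = 1e-3, 1e-2          # assumed physical error rate and threshold
    target = 1e-12                # assumed target logical error rate
    n_logical = 1000              # assumed logical qubits for a useful algorithm

    d = 3
    while 0.1 * (p / p_th) ** ((d + 1) / 2) > target:
        d += 2                    # surface-code distances are odd
    print(f"distance d = {d}, physical qubits ~ {n_logical * 2 * d * d:,}")

With these assumptions the estimate lands near a million physical qubits for a thousand logical ones, which conveys why the requirements are described as prohibitive.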
Several proposals for quantum computation utilize a lattice-type architecture with qubits trapped by a periodic potential. For systems undergoing many-body interactions described by the Bose-Hubbard Hamiltonian, the ground state of the system carries…
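For reference, the Bose-Hubbard Hamiltonian mentioned here is H = -J Σ_(i,j) (b†_i b_j + h.c.) + (U/2) Σ_i n_i (n_i - 1). A minimal exact-diagonalization sketch for two sites, with illustrative J, U, and Fock-space cutoff, shows how the ground state is obtained:

    # Minimal exact diagonalization of a two-site Bose-Hubbard model
    # (J, U, and the Fock-space cutoff n_max are illustrative assumptions).
    import numpy as np

    n_max = 4                                          # local Fock-space cutoff
    b = np.diag(np.sqrt(np.arange(1, n_max + 1)), 1)   # boson annihilation operator
    n = b.T @ b                                        # number operator
    I = np.eye(n_max + 1)
    J, U = 1.0, 4.0                                    # hopping and on-site interaction

    hop = -J * (np.kron(b.T, b) + np.kron(b, b.T))     # -J (b1† b2 + b2† b1)
    onsite = 0.5 * U * (np.kron(n @ (n - I), I) + np.kron(I, n @ (n - I)))
    H = hop + onsite

    E, V = np.linalg.eigh(H)
    print("ground-state energy:", E[0])                # ground state is V[:, 0]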