
Revisiting the hopes for scalable quantum computation

Posted by: M. I. Dyakonov
Publication date: 2012
Research field: Physics
Paper language: English
Author: M. I. Dyakonov





The hopes for scalable quantum computing rely on the threshold theorem: once the error per qubit per gate is below a certain value, the methods of quantum error correction allow indefinitely long quantum computations. The proof is based on a number of assumptions that are supposed to be satisfied exactly, like axioms, e.g. zero undesired interactions between qubits. However, in the physical world no continuous quantity can be exactly zero; it can only be more or less small. Thus the error-per-qubit-per-gate threshold must be complemented by the required precision with which each assumption should be fulfilled. This issue has never been addressed. In the absence of this crucial information, the prospects of scalable quantum computing remain uncertain.
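A minimal sketch of the standard concatenation estimate behind the threshold theorem may make the claim concrete. Assuming a code that corrects a single error, the textbook estimate for the logical error per gate after k levels of concatenation is roughly p_th * (p / p_th)^(2^k); the Python snippet below (with purely illustrative numbers p = 1e-4 and p_th = 1e-3, not taken from the paper) shows the doubly exponential suppression the theorem promises when every assumption is satisfied exactly.

    # A minimal sketch (not from the paper) of the textbook concatenation
    # estimate behind the threshold theorem: for a code correcting one error,
    # the logical error per gate after k levels of concatenation is roughly
    #   p_k ~ p_th * (p / p_th) ** (2 ** k),
    # so any p < p_th can be suppressed arbitrarily, while p > p_th cannot.

    def logical_error(p, p_th, levels):
        """Estimated logical error per gate after `levels` of concatenation."""
        return p_th * (p / p_th) ** (2 ** levels)

    def levels_needed(p, p_th, target):
        """Smallest number of concatenation levels reaching `target`, if any."""
        if p >= p_th:
            return None  # above threshold: concatenation does not help
        k = 0
        while logical_error(p, p_th, k) > target:
            k += 1
        return k

    if __name__ == "__main__":
        p, p_th = 1e-4, 1e-3  # illustrative values, not from the paper
        for k in range(4):
            print(f"levels = {k}: logical error ~ {logical_error(p, p_th, k):.1e}")
        print("levels needed for 1e-15:", levels_needed(p, p_th, 1e-15))

The abstract's point is that this clean picture presupposes idealizations, such as exactly zero unwanted couplings, whose required precision the estimate above never quantifies.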


Read also

In topological quantum computation, quantum information is stored in states which are intrinsically protected from decoherence, and quantum gates are carried out by dragging particle-like excitations (quasiparticles) around one another in two space dimensions. The resulting quasiparticle trajectories define world-lines in three dimensional space-time, and the corresponding quantum gates depend only on the topology of the braids formed by these world-lines. We show how to find braids that yield a universal set of quantum gates for qubits encoded using a specific kind of quasiparticle which is particularly promising for experimental realization.
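As a hedged illustration of how braids translate into gates, the numpy sketch below builds the two elementary braid matrices acting on the fusion space of three Fibonacci anyons, using the standard F and R data from the anyon-model literature (not quoted from this particular paper), and checks that they are unitary and satisfy the braid relation; compiling a target gate then amounts to searching over products of these matrices.

    import numpy as np

    # A minimal sketch, assuming the standard Fibonacci-anyon data: phi is the
    # golden ratio, F the fusion matrix and R the braiding phases on the
    # two-dimensional fusion space of three anyons.
    phi = (1 + np.sqrt(5)) / 2
    F = np.array([[1 / phi, 1 / np.sqrt(phi)],
                  [1 / np.sqrt(phi), -1 / phi]])
    R = np.diag([np.exp(-4j * np.pi / 5), np.exp(3j * np.pi / 5)])

    sigma1 = R          # braid the first pair of anyons
    sigma2 = F @ R @ F  # braid the second pair (F is its own inverse)

    # The elementary braids are unitary ...
    print(np.allclose(sigma1 @ sigma1.conj().T, np.eye(2)))
    print(np.allclose(sigma2 @ sigma2.conj().T, np.eye(2)))

    # ... and satisfy the braid relation sigma1 sigma2 sigma1 = sigma2 sigma1 sigma2,
    # so products of them represent braids, i.e. candidate quantum gates.
    print(np.allclose(sigma1 @ sigma2 @ sigma1, sigma2 @ sigma1 @ sigma2))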
The primary resource for quantum computation is Hilbert-space dimension. Whereas Hilbert space itself is an abstract construction, the number of dimensions available to a system is a physical quantity that requires physical resources. Avoiding a demand for an exponential amount of these resources places a fundamental constraint on the systems that are suitable for scalable quantum computation. To be scalable, the number of degrees of freedom in the computer must grow nearly linearly with the number of qubits in an equivalent qubit-based quantum computer.
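To see the scaling concretely, the short sketch below (with illustrative register sizes only) compares the 2^n-dimensional Hilbert space spanned by n qubits, obtained from just n two-level systems, with the 2^n distinguishable levels a single physical system would need in order to supply the same dimension.

    # Illustrative comparison: n two-level systems give a 2**n-dimensional
    # Hilbert space, whereas packing the same dimension into the levels of a
    # single physical system would require 2**n distinguishable levels, an
    # exponential physical resource.
    for n in (10, 20, 40, 60):
        dim = 2 ** n
        print(f"n = {n:2d} qubits: dimension = {dim:.2e}, "
              f"two-level systems needed = {n}, "
              f"levels of a single system needed = {dim:.2e}")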
We introduce the concept of embedding quantum simulators, a paradigm allowing the efficient quantum computation of a class of bipartite and multipartite entanglement monotones. It consists in the suitable encoding of a simulated quantum dynamics in the enlarged Hilbert space of an embedding quantum simulator. In this manner, entanglement monotones are conveniently mapped onto physical observables, overcoming the necessity of full tomography and reducing drastically the experimental requirements. Furthermore, this method is directly applicable to pure states and, assisted by classical algorithms, to the mixed-state case. Finally, we expect that the proposed embedding framework paves the way for a general theory of enhanced one-to-one quantum simulators.
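As a hedged pointer to the kind of quantity involved, the sketch below evaluates the textbook two-qubit pure-state concurrence C = |<psi| sigma_y x sigma_y |psi*>| classically from the state amplitudes; the embedding simulator described above instead maps such antilinear quantities onto ordinary expectation values in an enlarged Hilbert space, which is what removes the need for full tomography.

    import numpy as np

    # Classical reference calculation (not the embedding protocol itself): the
    # two-qubit pure-state concurrence C = |<psi| sigma_y x sigma_y |psi*>|,
    # an entanglement monotone of the kind the embedding simulator targets.
    sy = np.array([[0, -1j], [1j, 0]])

    def concurrence(psi):
        """Concurrence of a normalized two-qubit pure state (length-4 vector)."""
        psi = np.asarray(psi, dtype=complex)
        return abs(psi.conj() @ np.kron(sy, sy) @ psi.conj())

    # A Bell state is maximally entangled (C = 1); a product state gives C = 0.
    bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
    product = np.array([1, 0, 0, 0])
    print(concurrence(bell), concurrence(product))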
The digital quantum computing paradigm offers highly desirable features such as universality, scalability, and quantum error correction. However, the physical resources required to implement useful error-corrected quantum algorithms are prohibitive in the current era of NISQ devices. As an alternative path to performing universal quantum computation within the NISQ-era limitations, we propose to merge digital single-qubit operations with analog multi-qubit entangling blocks in an approach we call digital-analog quantum computing (DAQC). Along these lines, although the techniques may be extended to any resource, we propose to use unitaries generated by the ubiquitous Ising Hamiltonian for the analog entangling block and we prove its universal character. We construct explicit DAQC protocols for efficient simulations of arbitrary inhomogeneous Ising, two-body, and $M$-body spin Hamiltonian dynamics by means of single-qubit gates and a fixed homogeneous Ising Hamiltonian. Additionally, we compare a sequential approach where the interactions are switched on and off (stepwise DAQC) with an always-on multi-qubit interaction interspersed by fast single-qubit pulses (banged DAQC). Finally, we perform numerical tests comparing purely digital schemes with DAQC protocols, showing a remarkably better performance of the latter. The proposed DAQC approach combines the robustness of analog quantum computing with the flexibility of digital methods.
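A minimal numpy sketch of the two ingredients named above, not the paper's full DAQC construction: an analog block generated by a fixed homogeneous Ising Hamiltonian, sandwiched between digital single-qubit rotations (the coupling, evolution time, and rotation angle are illustrative).

    import numpy as np
    from scipy.linalg import expm

    I2 = np.eye(2)
    Z = np.diag([1.0, -1.0])
    X = np.array([[0.0, 1.0], [1.0, 0.0]])

    def op_on(single_qubit_op, site, n):
        """Embed a single-qubit operator on `site` of an n-qubit register."""
        out = np.array([[1.0]])
        for k in range(n):
            out = np.kron(out, single_qubit_op if k == site else I2)
        return out

    def homogeneous_ising(n, g=1.0):
        """H = g * sum_{i<j} Z_i Z_j with a single homogeneous coupling g."""
        H = np.zeros((2 ** n, 2 ** n))
        for i in range(n):
            for j in range(i + 1, n):
                H += g * op_on(Z, i, n) @ op_on(Z, j, n)
        return H

    n = 3
    analog_block = expm(-1j * homogeneous_ising(n) * 0.3)   # analog entangling block
    digital = op_on(expm(-1j * 0.5 * X), 0, n)              # digital single-qubit gate
    step = digital.conj().T @ analog_block @ digital        # one digital-analog step
    print(np.allclose(step @ step.conj().T, np.eye(2 ** n)))  # the step is unitary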
Several proposals for quantum computation utilize a lattice-type architecture with qubits trapped by a periodic potential. For systems undergoing many-body interactions described by the Bose-Hubbard Hamiltonian, the ground state of the system carries number fluctuations that scale with the number of qubits. This process degrades the initialization of the quantum computer register and can introduce errors during error correction. In an earlier manuscript we proposed a solution to this problem tailored to the loading of cold atoms into an optical lattice via the Mott insulator phase transition. It was shown that by adding an inhomogeneity to the lattice and performing a continuous measurement, the unit-filled state suitable for a quantum computer register can be maintained. Here, we give a more rigorous derivation of the register fidelity in homogeneous and inhomogeneous lattices and provide evidence that the protocol is effective in the finite-temperature regime.
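As a toy illustration of the number fluctuations in question (a small exact diagonalization with illustrative parameters, not the paper's calculation), the sketch below builds a four-site Bose-Hubbard chain at unit filling and compares the on-site number variance deep in the Mott regime (U >> J) with the near-superfluid regime, where the fluctuations that degrade a qubit register are much larger.

    import itertools
    import numpy as np

    # Toy exact diagonalization of a small Bose-Hubbard chain at unit filling;
    # parameters are illustrative, not taken from the paper.

    def fock_basis(n_sites, n_bosons):
        """All occupation-number configurations with the given total boson number."""
        return [c for c in itertools.product(range(n_bosons + 1), repeat=n_sites)
                if sum(c) == n_bosons]

    def bose_hubbard(n_sites, n_bosons, J, U):
        basis = fock_basis(n_sites, n_bosons)
        index = {c: k for k, c in enumerate(basis)}
        H = np.zeros((len(basis), len(basis)))
        for k, c in enumerate(basis):
            # On-site interaction: (U/2) * sum_i n_i (n_i - 1)
            H[k, k] += 0.5 * U * sum(n * (n - 1) for n in c)
            # Nearest-neighbour hopping: -J * sum_i (b_i^dag b_{i+1} + h.c.)
            for i in range(n_sites - 1):
                if c[i + 1] > 0:  # move one boson from site i+1 to site i
                    d = list(c)
                    d[i + 1] -= 1
                    d[i] += 1
                    amp = -J * np.sqrt(c[i + 1] * (c[i] + 1))
                    H[index[tuple(d)], k] += amp
                    H[k, index[tuple(d)]] += amp  # Hermitian conjugate
        return basis, H

    def onsite_number_variance(basis, state, site=0):
        probs = np.abs(state) ** 2
        n = np.array([c[site] for c in basis])
        return float(probs @ n ** 2 - (probs @ n) ** 2)

    for U in (1.0, 20.0):  # J = 1 throughout; U = 20 is deep in the Mott regime
        basis, H = bose_hubbard(n_sites=4, n_bosons=4, J=1.0, U=U)
        ground = np.linalg.eigh(H)[1][:, 0]
        print(f"U/J = {U:4.1f}: on-site number variance = "
              f"{onsite_number_variance(basis, ground):.3f}")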