
Sufficient condition on noise correlations for scalable quantum computing

Added by John Preskill
Publication date: 2012
Fields: Physics
Language: English
Authors: John Preskill





I study the effectiveness of fault-tolerant quantum computation against correlated Hamiltonian noise, and derive a sufficient condition for scalability. Arbitrarily long quantum computations can be executed reliably provided that noise terms acting collectively on k system qubits are sufficiently weak, and decay sufficiently rapidly with increasing k and with increasing spatial separation of the qubits.
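The flavor of such a condition can be made concrete with a toy numerical sketch. Everything below is an illustrative assumption, not the paper's actual bound: we posit that a noise term acting jointly on k qubits spread over distance d on a 1D chain has norm h1 * alpha**(k-1) * exp(-d/d0), and crudely sum the total noise weight felt by a single qubit.

```python
import math

# Toy noise model (illustrative assumption, not the paper's bound):
# a term acting jointly on k qubits spread over distance d has norm
#   h(k, d) = h1 * alpha**(k - 1) * exp(-d / d0),
# i.e. it is weak (h1), decays with cluster size (alpha < 1),
# and decays with spatial extent (d0).
def noise_norm(k, d, h1=1e-4, alpha=0.02, d0=2.0):
    return h1 * alpha ** (k - 1) * math.exp(-d / d0)

def total_weight_at_qubit(max_k=6, max_d=10):
    """Crudely sum the noise strength of all clusters touching one
    fixed qubit on a 1D chain, overcounting the placements of the
    other k - 1 qubits within distance d as (2*d + 1)**(k - 1)."""
    total = 0.0
    for k in range(1, max_k + 1):
        for d in range(0, max_d + 1):
            placements = (2 * d + 1) ** (k - 1)
            total += placements * noise_norm(k, d)
    return total

# With these (hypothetical) decay rates the per-qubit noise weight
# stays small, the qualitative regime the abstract describes.
print(total_weight_at_qubit())
```

With slower decay in k or in d (larger alpha or d0), the sum blows up, which illustrates why both decay requirements appear in the sufficient condition.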



Related research

We propose a novel scheme for a solid-state realization of a quantum computer based on single-spin enhancement-mode quantum dots as building blocks. In enhancement-mode quantum dots, a single electron can be brought into an initially empty dot, in contrast to depletion-mode dots, which rely on expelling electrons from multi-electron dots with gates. Quantum computer architectures based on depletion-mode dots face several challenges that make scalability difficult. These challenges can be met by the enhancement-mode approach, which can produce a square array of dots with versatile functionalities. These functionalities allow transportation of qubits, including teleportation, and error correction based on straightforward one- and two-qubit operations. We describe the physical properties and demonstrate experimental characteristics of enhancement-mode quantum dots and single-electron transistors based on InAs/GaSb composite quantum wells. We discuss the materials aspects of quantum-dot quantum computing, including materials with large spin splitting such as InAs, as well as the prospects of the enhancement-mode approach in materials such as Si.
We solve the problem of whether a set of quantum tests reveals state-independent contextuality and use this result to identify the simplest set of the minimal dimension. We also show that identifying state-independent contextuality graphs [R. Ramanathan and P. Horodecki, Phys. Rev. Lett. 112, 040404 (2014)] is not sufficient for revealing state-independent contextuality.
We show a significant reduction in the number of quantum operations and an improvement in circuit depth for the realization of the Toffoli gate using qudits. This is done by establishing a general relation between the dimensionality of qudits and their topology of connections for a scalable multi-qudit processor, where higher qudit levels are used to substitute for ancillas. The suggested model is important for the realization of quantum algorithms and as a method of quantum error correction for single-qubit operations.
In order to analyze the joint measurability of given measurements, we introduce a Hermitian operator-valued measure, called the $W$-measure, whose marginals are positive operator-valued measures (POVMs). We prove that the $W$-measure is a POVM if and only if its marginal POVMs are jointly measurable. The proof suggests employing the negativity of the $W$-measure as an indicator of non-joint measurability. By applying triangle inequalities to the negativity, we derive joint measurability criteria for dichotomic and trichotomic variables. We also propose an operational test for joint measurability in a sequential measurement scenario.
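The negativity criterion described above can be illustrated with a standard toy example (the specific unsharp qubit observables and candidate joint observable below are textbook choices, not taken from this paper): for noisy measurements A_a = (I + a*eta*sx)/2 and B_b = (I + b*eta*sz)/2, the candidate W-measure W_{ab} = (I + eta*(a*sx + b*sz))/4 has the right marginals, and its negativity appears exactly when the pair stops being jointly measurable.

```python
import numpy as np

# Unsharp qubit POVMs (illustrative choice, not from the paper):
#   A_a = (I + a*eta*sx)/2,  B_b = (I + b*eta*sz)/2,  a, b = +/-1.
I = np.eye(2, dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def w_measure(eta):
    """Candidate joint observable with the correct marginals:
    sum over b of W_{ab} gives A_a, and sum over a gives B_b."""
    return {(a, b): (I + eta * (a * sx + b * sz)) / 4
            for a in (+1, -1) for b in (+1, -1)}

def min_eigenvalue(eta):
    """Most negative eigenvalue over all W_{ab}; negativity signals
    non-joint measurability of the marginal pair."""
    return min(np.linalg.eigvalsh(W).min() for W in w_measure(eta).values())
```

For this pair, positivity fails exactly when eta exceeds 1/sqrt(2), the known joint-measurability boundary for orthogonal unsharp qubit observables.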
The successful implementation of algorithms on quantum processors relies on the accurate control of quantum bits (qubits) to perform logic gate operations. In this era of noisy intermediate-scale quantum (NISQ) computing, systematic miscalibrations, drift, and crosstalk in the control of qubits can lead to a coherent form of error which has no classical analog. Coherent errors severely limit the performance of quantum algorithms in an unpredictable manner, and mitigating their impact is necessary for realizing reliable quantum computations. Moreover, the average error rates measured by randomized benchmarking and related protocols are not sensitive to the full impact of coherent errors, and therefore do not reliably predict the global performance of quantum algorithms, leaving us unprepared to validate the accuracy of future large-scale quantum computations. Randomized compiling is a protocol designed to overcome these performance limitations by converting coherent errors into stochastic noise, dramatically reducing unpredictable errors in quantum algorithms and enabling accurate predictions of algorithmic performance from error rates measured via cycle benchmarking. In this work, we demonstrate significant performance gains under randomized compiling for the four-qubit quantum Fourier transform algorithm and for random circuits of variable depth on a superconducting quantum processor. Additionally, we accurately predict algorithm performance using experimentally-measured error rates. Our results demonstrate that randomized compiling can be utilized to leverage and predict the capabilities of modern-day noisy quantum processors, paving the way forward for scalable quantum computing.
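The core mechanism — converting coherent error into stochastic Pauli noise by randomizing over Pauli frames — can be sketched in a one-qubit toy model (illustrative only; randomized compiling as demonstrated in the work above operates on full circuit cycles). Twirling a coherent Z over-rotation over the Pauli group yields a purely stochastic Z-flip channel.

```python
import numpy as np

# One-qubit toy model of Pauli twirling a coherent over-rotation.
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
PAULIS = [I2, X, Y, Z]

theta = 0.3  # arbitrary coherent over-rotation angle
E = np.cos(theta / 2) * I2 - 1j * np.sin(theta / 2) * Z  # exp(-i*theta*Z/2)

def coherent_error(rho):
    """The bare coherent error channel rho -> E rho E^dag."""
    return E @ rho @ E.conj().T

def twirled_error(rho):
    """Average the error over random Pauli frames: P E P rho P E^dag P."""
    out = np.zeros((2, 2), dtype=complex)
    for P in PAULIS:
        K = P @ E @ P  # Paulis are self-inverse
        out += K @ rho @ K.conj().T
    return out / 4

def stochastic_z_flip(rho):
    """Pauli channel with flip probability p = sin^2(theta/2)."""
    p = np.sin(theta / 2) ** 2
    return (1 - p) * rho + p * Z @ rho @ Z

rho_plus = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)  # |+><+|
```

On the |+> state, the bare error leaves a coherent phase on the off-diagonal element, while the twirled channel coincides exactly with the stochastic Z-flip channel — the coherent-to-stochastic conversion described above.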