
Machine learning of noise-resilient quantum circuits

Added by Mohan Sarovar
Publication date: 2020
Field: Physics
Language: English





Noise mitigation and reduction will be crucial for obtaining useful answers from near-term quantum computers. In this work, we present a general framework based on machine learning for reducing the impact of quantum hardware noise on quantum circuits. Our method, called noise-aware circuit learning (NACL), applies to circuits designed to compute a unitary transformation, prepare a set of quantum states, or estimate an observable of a many-qubit state. Given a task and a device model that captures information about the noise and connectivity of qubits in a device, NACL outputs an optimized circuit to accomplish this task in the presence of noise. It does so by minimizing a task-specific cost function over circuit depths and circuit structures. To demonstrate NACL, we construct circuits resilient to a fine-grained noise model derived from gate set tomography on a superconducting-circuit quantum device, for applications including quantum state overlap, quantum Fourier transform, and W-state preparation.
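As a rough illustration of the NACL idea, the sketch below minimizes a task-specific cost over circuit depths and circuit structures for a single-qubit state-preparation task, using a toy per-gate depolarizing noise model. The gate set, noise rate, and random-search optimizer are illustrative assumptions, not the gate-set-tomography-derived model or optimizer of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def rot(axis, theta):
    """Single-qubit rotation exp(-i*theta/2 * axis) for a Pauli axis."""
    return np.cos(theta / 2) * I2 - 1j * np.sin(theta / 2) * axis

def depolarize(rho, p):
    """Toy noise model: with probability p the state becomes I/2."""
    return (1 - p) * rho + p * I2 / 2

def run_circuit(structure, angles, p):
    """Apply each rotation in the structure, followed by per-gate noise."""
    rho = np.array([[1, 0], [0, 0]], dtype=complex)  # start in |0><0|
    for axis, theta in zip(structure, angles):
        U = rot(axis, theta)
        rho = depolarize(U @ rho @ U.conj().T, p)
    return rho

def cost(structure, angles, target, p):
    """Task-specific cost: infidelity with the target pure state."""
    rho = run_circuit(structure, angles, p)
    return 1 - np.real(target.conj() @ rho @ target)

# Task: prepare |+> = (|0> + |1>)/sqrt(2) on a device with per-gate noise p
target = np.array([1, 1], dtype=complex) / np.sqrt(2)
p = 0.02

best = (np.inf, None, None)
for depth in range(1, 5):                 # search over circuit depths
    structure = [X if k % 2 == 0 else Z for k in range(depth)]
    for _ in range(200):                  # random search over circuit angles
        angles = rng.uniform(0, 2 * np.pi, depth)
        c = cost(structure, angles, target, p)
        if c < best[0]:
            best = (c, depth, angles)

print(f"best infidelity {best[0]:.4f} at depth {best[1]}")
```

Deeper circuits accumulate more noise per gate, so the search naturally trades expressiveness against noise, which is the trade-off NACL automates.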





Current implementations of quantum logic gates can be highly faulty and introduce errors. In order to correct these errors, it is necessary to first identify the faulty gates. We demonstrate a procedure to diagnose where gate faults occur in a circuit by using a hybrid quantum-classical K-Nearest-Neighbours (KNN) machine-learning technique. We accomplish this task using a diagnostic circuit and selected input qubits to obtain the fidelity between a set of output states and reference states. The outcomes of the circuit can then be stored and used by a classical KNN algorithm. We numerically demonstrate the ability to locate a faulty gate in circuits with over 30 gates and up to nine qubits with over 90% accuracy.
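The classical half of this scheme can be sketched in a few lines of numpy: synthetic fidelity "signatures" stand in for the measured fidelities between diagnostic-circuit outputs and reference states, and a plain K-nearest-neighbours vote localizes the faulty gate. The signature model, circuit size, and all parameters below are invented for illustration and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

N_GATES = 8    # gates in the hypothetical diagnostic circuit
N_PROBES = 5   # number of probe fidelities measured per run
N_TRAIN = 400  # labelled training examples
K = 5          # neighbours in the KNN vote

def fidelity_signature(fault_loc):
    """Synthetic stand-in for measured output-state fidelities: a fault
    at a given gate damages a characteristic subset of the probes
    (here, the binary code of its index), plus shot noise."""
    pattern = (fault_loc >> np.arange(N_PROBES)) & 1
    return 1.0 - 0.3 * pattern + rng.normal(0, 0.02, N_PROBES)

# training set: (fidelity signature, fault location) pairs
X_train, y_train = [], []
for _ in range(N_TRAIN):
    g = rng.integers(N_GATES)
    X_train.append(fidelity_signature(g))
    y_train.append(g)
X_train = np.array(X_train)
y_train = np.array(y_train)

def knn_predict(sig):
    """Classical K-nearest-neighbours vote over Euclidean distance."""
    d = np.linalg.norm(X_train - sig, axis=1)
    votes = y_train[np.argsort(d)[:K]]
    return np.bincount(votes, minlength=N_GATES).argmax()

# evaluate fault-localization accuracy on fresh noisy signatures
hits = sum(knn_predict(fidelity_signature(g)) == g
           for g in rng.integers(N_GATES, size=200))
print(f"fault-localization accuracy: {hits / 200:.2f}")
```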
Mingxia Huo, Ying Li (2021)
Computing the ground-state properties of quantum many-body systems is a promising application of near-term quantum hardware with potential impact in many fields. Quantum phase estimation uses deep circuits and is infeasible without fault-tolerant technologies. Many recently developed quantum simulation algorithms work in an inexact, variational manner to exploit the power of shallow circuits, and they rely on the assumption that variational circuits can produce the desired result. Here, we combine quantum Monte Carlo with quantum computing and propose a quasi-exact algorithm for imaginary-time simulation and ground-state computation. Unlike variational algorithms, our algorithm always approaches the exact solution as the Trotter circuit depth increases. Even when the circuit is shallow, it can yield an accurate ground-state energy. Compared with quantum phase estimation, the conventional quasi-exact algorithm, ours can reduce the number of Trotter steps by a factor of thousands. We verify this resilience to Trotterisation errors through numerical simulations of up to 20 qubits and theoretical analysis. Our results demonstrate that non-variational, exact quantum simulation is promising even without a fully fault-tolerant quantum computer.
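The principle this algorithm builds on, that imaginary-time evolution exp(-tau*H) filters out excited states and converges to the exact ground state, can be checked on a tiny example. The sketch below is a purely classical numpy demonstration on a two-spin Heisenberg model, not the authors' quantum Monte Carlo scheme; the model, step size, and step count are illustrative choices.

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# two-spin Heisenberg Hamiltonian H = XX + YY + ZZ; exact ground energy is -3
H = np.kron(X, X) + np.kron(Y, Y) + np.kron(Z, Z)

# exact short-time imaginary-time propagator exp(-dtau*H) via eigendecomposition
dtau = 0.1
w, V = np.linalg.eigh(H)
step = V @ np.diag(np.exp(-dtau * w)) @ V.conj().T

# start from a state with nonzero overlap with the ground state
psi = np.array([0, 1, 0, 0], dtype=complex)  # |01>
energies = []
for _ in range(60):
    psi = step @ psi
    psi /= np.linalg.norm(psi)  # renormalize after each non-unitary step
    energies.append(np.real(psi.conj() @ H @ psi))

print(f"energy after imaginary-time evolution: {energies[-1]:.6f} (exact -3)")
```

The energy estimate decreases monotonically toward the exact ground energy, with the excited-state contamination suppressed exponentially in total imaginary time.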
Realistic quantum computing is subject to noise, and one of the most important frontiers in quantum computing research is implementing noise-resilient quantum control over qubits. Dynamical decoupling can protect qubit coherence. Here we demonstrate non-trivial quantum evolution steered by dynamical decoupling control, which automatically suppresses the effect of noise. We designed and implemented a self-protected controlled-NOT gate on the electron spin of a nitrogen-vacancy centre and a nearby carbon-13 nuclear spin in diamond at room temperature, by employing an engineered dynamical decoupling control on the electron spin. Final-state fidelities of 0.91 and 0.88 were observed even with imperfect initial states, while the qubit coherence time was extended at least 30-fold. The design scheme does not require the dynamical decoupling control to commute with the qubit interaction and works for general systems. This work marks a step toward realistic quantum computing.
A significant problem for current quantum computers is noise. While there are many distinct noise channels, the depolarizing noise model often appropriately describes average noise for large circuits involving many qubits and gates. We present a method to mitigate the depolarizing noise by first estimating its rate with a noise-estimation circuit and then correcting the output of the target circuit using the estimated rate. The method is experimentally validated on the simulation of the Heisenberg model. We find that our approach in combination with readout-error correction, randomized compiling, and zero-noise extrapolation produces results close to exact results even for circuits containing hundreds of CNOT gates.
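A minimal sketch of the mitigation step, under the simplifying assumption of a global depolarizing channel acting on a traceless observable: the noise-estimation circuit has a known ideal output, the measured decay fixes the rate estimate, and the target circuit's output is rescaled by it. The rates, shot counts, and observable values below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

p_true = 0.15       # depolarizing rate of the (simulated) device, unknown a priori
ideal_target = 0.8  # ideal expectation value of the target circuit's observable
ideal_estim = 1.0   # known ideal expectation for the noise-estimation circuit
shots = 100_000

def measure(ideal_value, p, shots):
    """Simulate shot-sampled measurement of a traceless +/-1 observable
    under global depolarizing noise: <O>_noisy = (1 - p) * <O>_ideal."""
    prob_plus = (1 + (1 - p) * ideal_value) / 2
    outcomes = rng.random(shots) < prob_plus
    return 2 * outcomes.mean() - 1

# 1) estimate the depolarizing rate from a circuit with known ideal output
noisy_estim = measure(ideal_estim, p_true, shots)
p_hat = 1 - noisy_estim / ideal_estim

# 2) correct the target circuit's output using the estimated rate
noisy_target = measure(ideal_target, p_true, shots)
corrected = noisy_target / (1 - p_hat)

print(f"estimated rate {p_hat:.3f}, raw {noisy_target:.3f}, "
      f"corrected {corrected:.3f} (ideal {ideal_target})")
```

The rescaling only removes the depolarizing component of the noise, which is why the paper combines it with readout-error correction, randomized compiling, and zero-noise extrapolation.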
Machine learning is seen as a promising application of quantum computation. For near-term noisy intermediate-scale quantum (NISQ) devices, parametrized quantum circuits (PQCs) have been proposed as machine learning models due to their robustness and ease of implementation. However, the cost function is normally calculated classically from repeated measurement outcomes, such that it is no longer encoded in a quantum state. This prevents the value from being directly manipulated by a quantum computer. To solve this problem, we give a routine to embed the cost function for machine learning into a quantum circuit, which accepts a training dataset encoded in superposition or an easily preparable mixed state. We also demonstrate the ability to evaluate the gradient of the encoded cost function in a quantum state.