Classifying single-qubit noise using machine learning

Published by: Travis Scholten
Publication date: 2019
Research language: English

Quantum characterization, validation, and verification (QCVV) techniques are used to probe, characterize, diagnose, and detect errors in quantum information processors (QIPs). An important component of any QCVV protocol is a mapping from experimental data to an estimate of a property of a QIP. Machine learning (ML) algorithms can help automate the development of QCVV protocols, creating such maps by learning them from training data. We identify the critical components of machine-learned QCVV techniques, and present a rubric for developing them. To demonstrate this approach, we focus on the problem of determining whether noise affecting a single qubit is coherent or stochastic (incoherent) using the data sets originally proposed for gate set tomography (GST). We leverage known ML algorithms to train a classifier distinguishing these two kinds of noise. The accuracy of the classifier depends on how well it can approximate the natural geometry of the training data. We find GST data sets generated by a noisy qubit can reliably be separated by linear surfaces, although feature engineering can be necessary. We also show the classifier learned by a support vector machine (SVM) is robust under finite-sample noise.
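
The following Python sketch illustrates the classification task described above, not the paper's actual pipeline: synthetic feature vectors stand in for GST data sets, the class geometry is assumed, and scikit-learn supplies the SVM.

```python
# A minimal sketch, NOT the paper's pipeline: synthetic feature vectors
# stand in for GST data sets, and a linear SVM separates the two classes.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n, d = 400, 20  # hypothetical: 400 data sets per class, 20 features each

# Two linearly separable clusters standing in for coherent vs. stochastic noise.
X = np.vstack([rng.normal(+1.5, 1.0, size=(n, d)),   # "coherent" class
               rng.normal(-1.5, 1.0, size=(n, d))])  # "stochastic" class
y = np.array([1] * n + [0] * n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A linear kernel matches the paper's observation that (possibly
# feature-engineered) GST data sets can be separated by linear surfaces.
clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.3f}")
```

In the paper's setting, the feature-engineering step would replace the synthetic clusters above with functions of measured GST outcome frequencies; the held-out accuracy then probes robustness to finite-sample noise.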




Read also

Spatial modes of light constitute valuable resources for a variety of quantum technologies ranging from quantum communication and quantum imaging to remote sensing. Nevertheless, their vulnerabilities to phase distortions, induced by random media, impose significant limitations on the realistic implementation of numerous quantum-photonic technologies. Unfortunately, this problem is exacerbated at the single-photon level. Over the last two decades, this challenging problem has been tackled through conventional schemes that utilize optical nonlinearities, quantum correlations, and adaptive optics. In this article, we exploit the self-learning and self-evolving features of artificial neural networks to correct the complex spatial profile of distorted Laguerre-Gaussian modes at the single-photon level. Furthermore, we demonstrate the possibility of boosting the performance of an optical communication protocol through the spatial mode correction of single photons using machine learning. Our results have important implications for real-time turbulence correction of structured photons and single-photon images.
Kernel methods are used extensively in classical machine learning, especially in the field of pattern analysis. In this paper, we propose a kernel-based quantum machine learning algorithm that can be implemented on a near-term, intermediate scale quantum device. Our proposal is based on estimating classically intractable kernel functions, using a restricted quantum model known as deterministic quantum computing with one qubit. Our method provides a framework for studying the role of quantum correlations other than quantum entanglement for machine learning applications.
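
The structural idea, plugging externally estimated kernel values into an otherwise classical learner, can be sketched as below; here a classical RBF Gram matrix is a stand-in for the quantum-estimated kernel entries, and the data and labels are synthetic.

```python
# Sketch only: a classical RBF Gram matrix stands in for kernel entries that,
# in the proposal, would be estimated on quantum hardware (DQC1-style circuits).
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X_train = rng.normal(size=(100, 4))
y_train = (X_train[:, 0] * X_train[:, 1] > 0).astype(int)  # toy labels
X_test = rng.normal(size=(25, 4))

# The learner only ever sees the Gram matrices, so any estimator of
# K(x, x'), classical or quantum, can be plugged in here.
K_train = rbf_kernel(X_train, X_train)
K_test = rbf_kernel(X_test, X_train)

clf = SVC(kernel="precomputed")
clf.fit(K_train, y_train)
print(clf.predict(K_test))
```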
Noise mitigation and reduction will be crucial for obtaining useful answers from near-term quantum computers. In this work, we present a general framework based on machine learning for reducing the impact of quantum hardware noise on quantum circuits. Our method, called noise-aware circuit learning (NACL), applies to circuits designed to compute a unitary transformation, prepare a set of quantum states, or estimate an observable of a many-qubit state. Given a task and a device model that captures information about the noise and connectivity of qubits in a device, NACL outputs an optimized circuit to accomplish this task in the presence of noise. It does so by minimizing a task-specific cost function over circuit depths and circuit structures. To demonstrate NACL, we construct circuits resilient to a fine-grained noise model derived from gate set tomography on a superconducting-circuit quantum device, for applications including quantum state overlap, quantum Fourier transform, and W-state preparation.
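
A heavily simplified sketch of the cost-minimization step follows: for a fixed two-gate, single-qubit circuit and a crude depolarizing noise model, it minimizes a state-preparation infidelity over gate angles. Everything here (circuit structure, noise model, target state) is an illustrative assumption; NACL itself also searches over circuit depth and structure.

```python
# Toy version of noise-aware circuit optimization: pick angles for a fixed
# Rx-then-Rz single-qubit circuit so that, under a crude depolarizing noise
# model, the output is as close as possible to the target state |+>.
import numpy as np
from scipy.optimize import minimize

def rx(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -1j * s], [-1j * s, c]])

def rz(phi):
    return np.diag([np.exp(-1j * phi / 2), np.exp(1j * phi / 2)])

def depolarize(rho, p=0.02):
    # Crude noise model: mix with the maximally mixed state after each gate.
    return (1 - p) * rho + p * np.eye(2) / 2

plus = np.array([1, 1]) / np.sqrt(2)
rho_target = np.outer(plus, plus.conj())

def cost(params):
    theta, phi = params
    rho = np.array([[1, 0], [0, 0]], dtype=complex)  # start in |0><0|
    for gate in (rx(theta), rz(phi)):
        rho = depolarize(gate @ rho @ gate.conj().T)
    return 1 - np.real(np.trace(rho_target @ rho))  # state-prep infidelity

result = minimize(cost, x0=[0.1, 0.1], method="Nelder-Mead")
print(f"angles: {result.x}, residual infidelity: {result.fun:.4f}")
```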
A precise measurement of dephasing over a range of timescales is critical for improving quantum gates beyond the error correction threshold. We present a metrological tool, based on randomized benchmarking, capable of greatly increasing the precision of Ramsey and spin echo sequences by the repeated but incoherent addition of phase noise. We find our SQUID-based qubit is not limited by $1/f$ flux noise at short timescales, but instead observe a telegraph noise mechanism that is not amenable to study with standard measurement techniques.
Precise nanofabrication represents a critical challenge to developing semiconductor quantum-dot qubits for practical quantum computation. Here, we design and train a convolutional neural network to interpret in-line scanning electron micrographs and quantify qualitative features affecting device functionality. The high-throughput strategy is exemplified by optimizing a model lithographic process within a five-dimensional design space and by demonstrating a new approach to address lithographic proximity effects. The present results emphasize the benefits of machine learning for developing robust processes, shortening development cycles, and enforcing quality control during qubit fabrication.
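
To make the approach concrete, here is a minimal PyTorch sketch of a small CNN regressing a single quality score from micrograph-like images; the architecture, synthetic tensors, and training loop are generic illustrative choices, not the paper's network or data.

```python
# Illustrative only: a tiny CNN regressing a "quality score" from
# micrograph-like images. Synthetic tensors stand in for SEM data.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(16 * 16 * 16, 1),  # 64x64 input -> 16 channels of 16x16
)

images = torch.randn(32, 1, 64, 64)  # stand-in micrographs
scores = torch.rand(32, 1)           # stand-in quality labels
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(5):  # full-batch training on the toy data
    optimizer.zero_grad()
    loss = loss_fn(model(images), scores)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss = {loss.item():.4f}")
```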
