We present a quantum kernel method for high-dimensional data analysis using Google's universal quantum processor, Sycamore. The method is successfully applied to the cosmological benchmark of supernova classification using real spectral features, with no dimensionality reduction and without vanishing kernel elements. Rather than relying on a low-dimensional synthetic dataset or pre-processing the data with a classical machine learning algorithm to reduce its dimension, this experiment demonstrates that machine learning with real, high-dimensional data is possible on a quantum processor, but requires careful attention to shot statistics and the mean kernel element size when constructing a circuit ansatz. Our experiment uses 17 qubits to classify 67-dimensional data, a significantly higher dimensionality than the largest prior quantum kernel experiments, and achieves classification accuracy competitive with noiseless simulation and with comparable classical techniques.
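To make the quantum kernel pipeline concrete, the following is a minimal sketch, not the circuit or dataset used in the experiment: a hypothetical angle-encoding ansatz (`encoding_circuit`), a shot-based estimate of each kernel element K(x1, x2) = |⟨0|U†(x2)U(x1)|0⟩|², and a precomputed-kernel SVM. The helper names, qubit count, data sizes, and shot count are illustrative assumptions; the shot count matters because sampling noise of order 1/sqrt(shots) limits how small a kernel element can be resolved.

```python
# Minimal sketch of a quantum kernel classifier (assumed ansatz and data,
# NOT the Sycamore circuit from the experiment). Requires cirq, numpy, scikit-learn.
import cirq
import numpy as np
from sklearn.svm import SVC


def encoding_circuit(qubits, x):
    """Hypothetical data-encoding unitary U(x): one Ry rotation per qubit,
    followed by a chain of CZ entanglers."""
    circuit = cirq.Circuit()
    for q, xi in zip(qubits, x):
        circuit.append(cirq.ry(xi)(q))
    for a, b in zip(qubits, qubits[1:]):
        circuit.append(cirq.CZ(a, b))
    return circuit


def kernel_element(qubits, x1, x2, sampler, shots=5000):
    """Estimate K(x1, x2) = |<0| U(x2)^dagger U(x1) |0>|^2 as the empirical
    probability of measuring the all-zeros bitstring."""
    circuit = encoding_circuit(qubits, x1)
    circuit += cirq.inverse(encoding_circuit(qubits, x2))
    circuit.append(cirq.measure(*qubits, key='m'))
    result = sampler.run(circuit, repetitions=shots)
    counts = result.histogram(key='m')
    return counts.get(0, 0) / shots  # all-zeros outcome maps to integer 0


def kernel_matrix(X_a, X_b, qubits, sampler, shots=5000):
    """Fill the Gram matrix entry by entry; shot noise ~1/sqrt(shots) sets the
    smallest kernel element that can be distinguished from zero."""
    K = np.zeros((len(X_a), len(X_b)))
    for i, xa in enumerate(X_a):
        for j, xb in enumerate(X_b):
            K[i, j] = kernel_element(qubits, xa, xb, sampler, shots)
    return K


# Usage with a noiseless simulator and random stand-in features
# (real spectral features would replace X_train / X_test).
n_qubits, n_train, n_test = 4, 20, 8
qubits = cirq.LineQubit.range(n_qubits)
sampler = cirq.Simulator()
rng = np.random.default_rng(0)
X_train = rng.uniform(0, np.pi, (n_train, n_qubits))
y_train = rng.integers(0, 2, n_train)
X_test = rng.uniform(0, np.pi, (n_test, n_qubits))

K_train = kernel_matrix(X_train, X_train, qubits, sampler)
K_test = kernel_matrix(X_test, X_train, qubits, sampler)
clf = SVC(kernel='precomputed').fit(K_train, y_train)
predictions = clf.predict(K_test)
```

In this sketch the `sampler` could equally be a hardware backend; the precomputed-kernel SVM is standard, while the choice of encoding circuit and shot budget is where the shot-statistics and mean-kernel-element considerations described above enter.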