We demonstrate how quantum computation can provide non-trivial improvements in the computational and statistical complexity of the perceptron model. We develop two quantum algorithms for perceptron learning. The first algorithm exploits quantum information processing to determine a separating hyperplane using a number of steps sublinear in the number of data points $N$, namely $O(\sqrt{N})$. The second algorithm illustrates how the classical mistake bound of $O(\frac{1}{\gamma^2})$ can be further improved to $O(\frac{1}{\sqrt{\gamma}})$ through quantum means, where $\gamma$ denotes the margin. Such improvements are achieved through the application of quantum amplitude amplification to the version space interpretation of the perceptron model.
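For context, the classical baseline that the quantum algorithms above improve upon is the standard perceptron update rule, whose number of updates on data separable with margin $\gamma$ is bounded by $O(\frac{1}{\gamma^2})$. A minimal sketch of that classical baseline (the quantum algorithms themselves are not reproduced here; the toy data set is an illustrative assumption):

```python
import numpy as np

def perceptron_train(X, y, max_epochs=100):
    """Classical perceptron: update w on each misclassified point.

    X: (N, d) data matrix, y: labels in {-1, +1}.
    For data separable with margin gamma (and ||x|| <= 1), the number of
    updates is bounded by O(1/gamma^2) -- the mistake bound that the
    quantum version-space algorithm improves to O(1/sqrt(gamma)).
    """
    w = np.zeros(X.shape[1])
    for _ in range(max_epochs):
        mistakes = 0
        for xi, yi in zip(X, y):
            if yi * np.dot(w, xi) <= 0:   # misclassified (or on the boundary)
                w += yi * xi              # perceptron update
                mistakes += 1
        if mistakes == 0:                 # converged: separating hyperplane found
            break
    return w

# toy linearly separable data: label is the sign of the first coordinate
X = np.array([[1.0, 0.2], [0.8, -0.5], [-1.0, 0.3], [-0.7, -0.4]])
y = np.array([1, 1, -1, -1])
w = perceptron_train(X, y)
print(all(np.sign(X @ w) == y))  # True once a separating hyperplane is found
```

The quantum speed-ups in the abstract come from applying amplitude amplification on top of this scheme, not from changing the update rule itself.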
Quantum machine learning algorithms could provide significant speed-ups over their classical counterparts; however, whether they can also achieve good generalization remains unclear. Recently, two quantum perceptron models that give a quadratic improvement
Perceptrons, which perform binary classification, are the fundamental building blocks of neural networks. Given a data set of size~$N$ and margin~$\gamma$ (how well the given data are separated), the query complexity of the best-known quantum training
We demonstrate that it is possible to implement a quantum perceptron with a sigmoid activation function as an efficient, reversible many-body unitary operation. When inserted in a neural network, the perceptron's response is parameterized by the poten
The quantum perceptron is a fundamental building block for quantum machine learning. This is a multidisciplinary field that incorporates capabilities of quantum computing, such as state superposition and entanglement, into classical machine learning schemes
Near-term quantum devices can be used to build quantum machine learning models, such as quantum kernel methods and quantum neural networks (QNNs), to perform classification tasks. There have been many proposals for how to use variational quantum circuits a