Perceptrons, which perform binary classification, are the fundamental building blocks of neural networks. Given a data set of size~$N$ and margin~$\gamma$ (a measure of how well the given data are separated), the query complexity of the best-known quantum training algorithm scales as either $(\nicefrac{\sqrt{N}}{\gamma^2})\log(\nicefrac{1}{\gamma^2})$ or $\nicefrac{N}{\sqrt{\gamma}}$, which is achieved by a hybrid of classical and quantum search. In this paper, we improve the version-space quantum training method for perceptrons so that the query complexity of our algorithm scales as $\sqrt{\nicefrac{N}{\gamma}}$. This is achieved by constructing an oracle for the perceptrons using quantum counting of the number of data elements that are correctly classified. We show that the query complexity of constructing such an oracle has a quadratic improvement over classical methods. Once such an oracle is constructed, bounded-error quantum search can be used to search over the hyperplane instances. The optimality of our algorithm is proven by reducing the evaluation of a two-level AND-OR tree (for which the query complexity lower bound is known) to a multi-criterion search. Our quantum training algorithm can be generalized to train more complex machine learning models such as neural networks, which are built on a large number of perceptrons.
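To make the structure of the version-space approach concrete, the following is a minimal classical sketch in Python: candidate hyperplanes are sampled, an "oracle" marks those that classify every training point correctly, and a search is run over the candidates. The quantum algorithm summarized above would replace the linear counting and search steps with quantum counting and bounded-error quantum search to obtain the $\sqrt{\nicefrac{N}{\gamma}}$-type scaling; this sketch only mirrors the classical structure, and all function names, parameters, and constants (e.g. `make_data`, `oracle`, `n_candidates`) are illustrative assumptions, not definitions from the paper.

```python
import numpy as np

# Classical sketch of the version-space idea: sample candidate hyperplanes,
# mark those lying in the version space (zero training mistakes), and search
# over the candidates. The quantum algorithm would estimate the correct-count
# with quantum counting and replace the linear scan with bounded-error
# quantum search; this code is purely illustrative.

rng = np.random.default_rng(0)

def make_data(n_points=100, dim=3, margin=0.2):
    """Generate linearly separable unit-norm data with at least the given margin."""
    w_true = rng.normal(size=dim)
    w_true /= np.linalg.norm(w_true)
    X, y = [], []
    while len(X) < n_points:
        x = rng.normal(size=dim)
        x /= np.linalg.norm(x)
        s = x @ w_true
        if abs(s) >= margin:          # keep only points respecting the margin
            X.append(x)
            y.append(np.sign(s))
    return np.array(X), np.array(y)

def correct_count(w, X, y):
    """Number of points classified correctly (the quantity quantum counting estimates)."""
    return int(np.sum(np.sign(X @ w) == y))

def oracle(w, X, y):
    """Marks w iff it lies in the version space, i.e. makes zero mistakes."""
    return correct_count(w, X, y) == len(y)

def classical_version_space_search(X, y, n_candidates=5000):
    """Linear scan over sampled hyperplanes; a quantum search over the same
    candidate set would need roughly sqrt(n_candidates) oracle calls."""
    dim = X.shape[1]
    for _ in range(n_candidates):
        w = rng.normal(size=dim)
        w /= np.linalg.norm(w)
        if oracle(w, X, y):
            return w
    return None

X, y = make_data()
w = classical_version_space_search(X, y)
print("found separating hyperplane:", w is not None)
```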
We demonstrate how quantum computation can provide non-trivial improvements in the computational and statistical complexity of the perceptron model. We develop two quantum algorithms for perceptron learning. The first algorithm exploits quantum information …
A recent breakthrough by Ambainis, Balodis, Iraids, Kokainis, Prūsis and Vihrovs (SODA19) showed how to construct faster quantum algorithms for the Traveling Salesman Problem and a few other NP-hard problems by combining in a novel way quantum search …
One of the main milestones in quantum information science is to realise quantum devices that exhibit an exponential computational advantage over classical ones without being universal quantum computers, a state of affairs dubbed quantum speedup, or …
Quantum machine learning algorithms could provide significant speed-ups over their classical counterparts; however, whether they could also achieve good generalization remains unclear. Recently, two quantum perceptron models which give a quadratic improvement …
With quantum computers of significant size now on the horizon, we should understand how to best exploit their initially limited abilities. To this end, we aim to identify a practical problem that is beyond the reach of current classical computers, but …