
Quantum Speedup in Adaptive Boosting of Binary Classification

Published by Min-Hsiu Hsieh
Publication date: 2019
Research language: English





In classical machine learning, a set of weak classifiers can be adaptively combined to form a strong classifier for improving the overall performance, a technique called adaptive boosting (or AdaBoost). However, constructing the strong classifier for a large data set is typically resource-consuming. Here we propose a quantum extension of AdaBoost, demonstrating a quantum algorithm that can output the optimal strong classifier with a quadratic speedup in the number of queries of the weak classifiers. Our results also include a generalization of the standard AdaBoost to the cases where the output of each classifier may be probabilistic even for the same input. We prove that the update rules and the query complexity of the non-deterministic classifiers are the same as those of deterministic classifiers, which may be of independent interest to the classical machine-learning community. Furthermore, the AdaBoost algorithm can also be applied to data encoded in the form of quantum states; we show how the training set can be simplified by using the tools of t-design. Our approach describes a model of quantum machine learning where quantum speedup is achieved in finding the optimal classifier, which can then be applied for classical machine-learning applications.
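
For reference, the sketch below shows the standard classical AdaBoost update rules that the paper generalizes and speeds up. It is a minimal illustration, assuming weak classifiers are callables returning labels in {-1, +1}; the data X, labels y, and classifier list are hypothetical placeholders, and the quantum speedup described above concerns the number of queries made to these weak classifiers, not this bookkeeping.

```python
# Minimal classical AdaBoost sketch for binary labels in {-1, +1}.
# X, y, and weak_classifiers are hypothetical placeholders.
import numpy as np

def adaboost(X, y, weak_classifiers, n_rounds):
    n = len(y)
    w = np.full(n, 1.0 / n)                  # uniform initial weights over the data
    alphas, chosen = [], []
    for _ in range(n_rounds):
        # pick the weak classifier with the smallest weighted error on current weights
        errors = [np.sum(w * (h(X) != y)) for h in weak_classifiers]
        t = int(np.argmin(errors))
        eps = max(float(errors[t]), 1e-12)   # guard against division by zero
        if eps >= 0.5:                        # no weak learner better than random
            break
        alpha = 0.5 * np.log((1 - eps) / eps)
        pred = weak_classifiers[t](X)
        # reweight: misclassified points gain weight, correctly classified ones lose it
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        alphas.append(alpha)
        chosen.append(weak_classifiers[t])
    # strong classifier: sign of the weighted vote over the chosen weak classifiers
    return lambda Xnew: np.sign(sum(a * h(Xnew) for a, h in zip(alphas, chosen)))
```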




Read also

Boosting is a general method to convert a weak learner (which generates hypotheses that are just slightly better than random) into a strong learner (which generates hypotheses that are much better than random). Recently, Arunachalam and Maity gave the first quantum improvement for boosting, by combining Freund and Schapire's AdaBoost algorithm with a quantum algorithm for approximate counting. Their booster is faster than classical boosting as a function of the VC-dimension of the weak learner's hypothesis class, but worse as a function of the quality of the weak learner. In this paper we give a substantially faster and simpler quantum boosting algorithm, based on Servedio's SmoothBoost algorithm.
Perceptrons, which perform binary classification, are the fundamental building blocks of neural networks. Given a data set of size~$N$ and margin~$\gamma$ (how well the given data are separated), the query complexity of the best-known quantum training algorithm scales as either $(\nicefrac{\sqrt{N}}{\gamma^2})\log(\nicefrac{1}{\gamma^2})$ or $\nicefrac{N}{\sqrt{\gamma}}$, which is achieved by a hybrid of classical and quantum search. In this paper, we improve the version space quantum training method for perceptrons such that the query complexity of our algorithm scales as $\sqrt{\nicefrac{N}{\gamma}}$. This is achieved by constructing an oracle for the perceptrons using quantum counting of the number of data elements that are correctly classified. We show that the query complexity to construct such an oracle has a quadratic improvement over classical methods. Once such an oracle is constructed, bounded-error quantum search can be used to search over the hyperplane instances. The optimality of our algorithm is proven by reducing the evaluation of a two-level AND-OR tree (for which the query complexity lower bound is known) to a multi-criterion search. Our quantum training algorithm can be generalized to train more complex machine learning models such as neural networks, which are built on a large number of perceptrons.
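
For intuition, the sketch below is a classical analogue of the oracle construction described above, assuming points X with labels y in {-1, +1} and candidate hyperplanes drawn at random; the function and variable names are hypothetical. The quantum algorithm replaces the per-candidate counting with quantum counting and the linear scan over candidates with bounded-error quantum search, each of which yields a quadratic saving in queries.

```python
# Classical analogue of the perceptron oracle: for a candidate hyperplane w,
# count how many of the N points are classified correctly. X, y, and the
# candidate sampler are hypothetical placeholders for illustration only.
import numpy as np

def correct_count(w, X, y):
    # labels y in {-1, +1}; a point is correctly classified if y * <w, x> > 0
    return int(np.sum(y * (X @ w) > 0))

def search_version_space(X, y, n_candidates, rng=None):
    # classically, scan candidate hyperplanes and accept the first one that
    # classifies every point; the quantum algorithm replaces this linear scan
    # with bounded-error quantum search over the same candidates
    rng = rng or np.random.default_rng(0)
    for _ in range(n_candidates):
        w = rng.normal(size=X.shape[1])
        if correct_count(w, X, y) == len(y):
            return w
    return None
```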
Quantum speed limit time (QSLT) can be used to characterize the intrinsic minimal time interval for a quantum system evolving from an initial state to a target state. We investigate the QSLT of the open system in Schwarzschild space-time. We show that, in some typical noisy channels, the Hawking effect can be beneficial to the evolution of the system. For an initial entangled state, the evolution speed of the system can be enhanced in the depolarizing, bit flip, and bit-phase flip channels as the Hawking temperature increases, which is in sharp contrast to the phase flip channel. Moreover, the optimal initial entanglement exists in other noise channels except the phase flip channel, which minimizes the QSLT of the system and thus leads to the maximum evolution speed of the system.
With quantum computers of significant size now on the horizon, we should understand how to best exploit their initially limited abilities. To this end, we aim to identify a practical problem that is beyond the reach of current classical computers, but that requires the fewest resources for a quantum computer. We consider quantum simulation of spin systems, which could be applied to understand condensed matter phenomena. We synthesize explicit circuits for three leading quantum simulation algorithms, employing diverse techniques to tighten error bounds and optimize circuit implementations. Quantum signal processing appears to be preferred among algorithms with rigorous performance guarantees, whereas higher-order product formulas prevail if empirical error estimates suffice. Our circuits are orders of magnitude smaller than those for the simplest classically-infeasible instances of factoring and quantum chemistry.
Machine-learning classification models learn the relation between input features and output classes in order to predict the class of a new input. Quantum Mechanics (QM) has already shown its effectiveness in many fields, and researchers have obtained several interesting results that cannot be reproduced by classical theory. In recent years, researchers have been investigating whether QM can help improve classical machine learning algorithms. It is believed that the theory of QM may also inspire an effective algorithm if it is implemented properly. From this inspiration, we propose a quantum-inspired binary classifier.
