
Exploring Quantum Perceptron and Quantum Neural Network structures with a teacher-student scheme

Posted by Aikaterini (Katerina) Gratsea
Publication date: 2021
Research field: Physics
Paper language: English





Near-term quantum devices can be used to build quantum machine learning models, such as quantum kernel methods and quantum neural networks (QNNs), to perform classification tasks. There have been many proposals on how to use variational quantum circuits as quantum perceptrons or as QNNs. The aim of this work is to systematically compare different QNN architectures and to evaluate their relative expressive power with a teacher-student scheme. Specifically, the teacher model generates datasets mapping random inputs to outputs, which the student models then have to learn. This way, we avoid training on arbitrary datasets and can compare the learning capacity of different models directly via the loss, the prediction map, the accuracy, and the relative entropy between the prediction maps. We focus particularly on a quantum perceptron model inspired by the recent work of Tacchino et al. \cite{Tacchino1} and compare it to the data re-uploading scheme originally introduced by Perez-Salinas et al. \cite{data_re-uploading}. We discuss alterations of the perceptron model and the formation of deep QNNs to better understand the role of hidden units and non-linearities in these architectures.
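As a rough illustration of the teacher-student scheme described in the abstract, the sketch below (plain NumPy, not the authors' code) uses a randomly initialised single-qubit data re-uploading circuit as the teacher, fits a student circuit of the same family to the teacher's labels, and compares the two via the loss, the accuracy, and the relative entropy between their prediction maps. All function names, layer counts, and hyperparameters are illustrative assumptions.

```python
# Minimal teacher-student sketch with a single-qubit data re-uploading circuit.
import numpy as np

rng = np.random.default_rng(0)

def ry(theta):
    """Single-qubit rotation RY(theta)."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def z_expectation(x, params):
    """<Z> after data re-uploading layers RY(a_l * x + b_l) applied to |0>."""
    state = np.array([1.0, 0.0], dtype=complex)
    for a, b in params:
        state = ry(a * x + b) @ state
    probs = np.abs(state) ** 2
    return probs[0] - probs[1]

def class1_prob(x, params):
    """Map <Z> in [-1, 1] to a class-1 probability in [0, 1]."""
    return 0.5 * (1.0 - z_expectation(x, params))

# Teacher: fixed random parameters generate the dataset (random inputs -> labels).
n_layers = 3
teacher = rng.uniform(-np.pi, np.pi, size=(n_layers, 2))
xs = rng.uniform(-1.0, 1.0, size=100)
labels = np.array([class1_prob(x, teacher) > 0.5 for x in xs], dtype=float)

# Student: same circuit family, trained by finite-difference gradient descent on MSE.
student = rng.uniform(-np.pi, np.pi, size=(n_layers, 2))

def loss(params):
    preds = np.array([class1_prob(x, params) for x in xs])
    return np.mean((preds - labels) ** 2)

lr, eps = 0.5, 1e-4
for _ in range(200):
    grad = np.zeros_like(student)
    for idx in np.ndindex(*student.shape):
        shifted = student.copy()
        shifted[idx] += eps
        grad[idx] = (loss(shifted) - loss(student)) / eps
    student -= lr * grad

# Compare teacher and student via accuracy and the relative entropy (KL divergence)
# between their prediction maps on a dense grid of inputs.
grid = np.linspace(-1.0, 1.0, 200)
p_t = np.clip([class1_prob(x, teacher) for x in grid], 1e-9, 1 - 1e-9)
p_s = np.clip([class1_prob(x, student) for x in grid], 1e-9, 1 - 1e-9)
accuracy = np.mean((p_s > 0.5) == (p_t > 0.5))
rel_entropy = np.mean(p_t * np.log(p_t / p_s) + (1 - p_t) * np.log((1 - p_t) / (1 - p_s)))
print(f"loss={loss(student):.4f}  accuracy={accuracy:.2f}  KL={rel_entropy:.4f}")
```

Richer teacher or student models (multi-qubit perceptrons, deeper circuits) slot into the same loop by swapping out the circuit functions; the comparison metrics stay the same.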




Read also

Fei Ye, Adrian G. Bors (2021)
A unique cognitive capability of humans consists in their ability to acquire new knowledge and skills from a sequence of experiences. Meanwhile, artificial intelligence systems are good at learning only the last given task, without being able to remember the databases learnt in the past. We propose a novel lifelong learning methodology by employing a Teacher-Student network framework. While the Student module is trained with a new given database, the Teacher module reminds the Student of the information learnt in the past. The Teacher, implemented by a Generative Adversarial Network (GAN), is trained to preserve and replay past knowledge corresponding to the probabilistic representations of previously learnt databases. Meanwhile, the Student module is implemented by a Variational Autoencoder (VAE), which infers its latent variable representation from both the output of the Teacher module and the newly available database. Moreover, the Student module is trained to capture both continuous and discrete underlying data representations across different domains. The proposed lifelong learning framework is applied in supervised, semi-supervised and unsupervised training. The code is available at: https://github.com/dtuzi123/Lifelong-Teacher-Student-Network-Learning
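For orientation only, here is a compact PyTorch sketch of the replay idea described above: a frozen generator stands in for the GAN Teacher and replays past samples, while a small VAE Student is trained on a mix of replayed and newly arriving data. The architectures, dimensions, and training loop are arbitrary assumptions, not the authors' implementation.

```python
# Toy Teacher-Student replay loop: frozen generator (Teacher) + small VAE (Student).
import torch
import torch.nn as nn

class Teacher(nn.Module):
    """Stand-in for a trained GAN generator that replays past data."""
    def __init__(self, z_dim=8, x_dim=32):
        super().__init__()
        self.z_dim = z_dim
        self.net = nn.Sequential(nn.Linear(z_dim, 64), nn.ReLU(), nn.Linear(64, x_dim))
    def replay(self, n):
        with torch.no_grad():                      # Teacher stays frozen during Student updates
            return self.net(torch.randn(n, self.z_dim))

class Student(nn.Module):
    """A minimal VAE trained on both replayed and new data."""
    def __init__(self, x_dim=32, z_dim=4):
        super().__init__()
        self.enc = nn.Linear(x_dim, 2 * z_dim)
        self.dec = nn.Linear(z_dim, x_dim)
    def forward(self, x):
        mu, logvar = self.enc(x).chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterisation trick
        return self.dec(z), mu, logvar

def vae_loss(x, recon, mu, logvar):
    rec = ((recon - x) ** 2).mean()                               # reconstruction term
    kl = -0.5 * torch.mean(1 + logvar - mu ** 2 - logvar.exp())   # KL regulariser
    return rec + 1e-3 * kl

teacher, student = Teacher(), Student()
opt = torch.optim.Adam(student.parameters(), lr=1e-3)
new_data = torch.randn(256, 32)                   # placeholder for the newly given database

for epoch in range(50):
    replay = teacher.replay(256)                  # past knowledge replayed by the Teacher
    batch = torch.cat([new_data, replay])         # Student sees both old and new data
    recon, mu, logvar = student(batch)
    loss = vae_loss(batch, recon, mu, logvar)
    opt.zero_grad(); loss.backward(); opt.step()
```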
We demonstrate how quantum computation can provide non-trivial improvements in the computational and statistical complexity of the perceptron model. We develop two quantum algorithms for perceptron learning. The first algorithm exploits quantum information processing to determine a separating hyperplane using a number of steps sublinear in the number of data points $N$, namely $O(\sqrt{N})$. The second algorithm illustrates how the classical mistake bound of $O(\frac{1}{\gamma^2})$ can be further improved to $O(\frac{1}{\sqrt{\gamma}})$ through quantum means, where $\gamma$ denotes the margin. Such improvements are achieved through the application of quantum amplitude amplification to the version space interpretation of the perceptron model.
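As a quick back-of-the-envelope illustration of the quoted bounds (the margin value here is chosen arbitrarily), a margin of $\gamma = 10^{-2}$ gives

$$\frac{1}{\gamma^{2}}\bigg|_{\gamma=10^{-2}} = 10^{4}, \qquad \frac{1}{\sqrt{\gamma}}\bigg|_{\gamma=10^{-2}} = 10,$$

i.e. the quantum mistake bound is smaller by roughly three orders of magnitude at this margin, with the gap widening as $\gamma$ shrinks.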
Noise and decoherence are two major obstacles to the implementation of large-scale quantum computing. Because of the no-cloning theorem, which says we cannot make an exact copy of an arbitrary quantum state, simple redundancy will not work in a quantum context, and unwanted interactions with the environment can destroy coherence and thus the quantum nature of the computation. Because of the parallel and distributed nature of classical neural networks, they have long been successfully used to deal with incomplete or damaged data. In this work, we show that our model of a quantum neural network (QNN) is similarly robust to noise, and that, in addition, it is robust to decoherence. Moreover, robustness to noise and decoherence is not only maintained but improved as the size of the system is increased. Noise and decoherence may even be of advantage in training, as it helps correct for overfitting. We demonstrate the robustness using entanglement as a means for pattern storage in a qubit array. Our results provide evidence that machine learning approaches can obviate otherwise recalcitrant problems in quantum computing.
We propose to use neural networks to estimate the rates of coherent and incoherent processes in quantum systems from continuous measurement records. In particular, we adapt an image recognition algorithm to recognize the patterns in experimental signals and link them to physical quantities. We demonstrate that the parameter estimation works unabatedly in the presence of detector imperfections which complicate or rule out Bayesian filter analyses.
Utilizing quantum computers to deploy artificial neural networks (ANNs) brings the potential of significant advancements in both speed and scale. In this paper, we propose a kind of quantum spiking neural network (SNN) and comprehensively evaluate it, giving a detailed mathematical proof of its success probability, calculation accuracy, and algorithm complexity. The proof shows that the computational complexity of the quantum SNNs is log-polynomial in the data dimension. Furthermore, we provide a method to improve the quantum SNNs' minimum success probability to nearly 100%. Finally, we demonstrate the good performance of quantum SNNs on real-world pattern recognition tasks.