
Data classification by quantum radial basis function networks

 Added by Changpeng Shao
 Publication date 2019
Language: English





A radial basis function (RBF) network is a three-layer neural network that is widely used in function approximation and data classification. Here we propose a quantum model of the RBF network. As in the classical case, we use radial basis functions as the activation functions, and quantum linear-algebra techniques and coherent states can be applied to implement them. Differently from the classical case, we define the state of the weights as a tensor product of single-qubit states, which gives a simple way to implement the quantum RBF network in quantum circuits. Theoretically, we prove that training is almost quadratically faster than in the classical case. Numerically, we demonstrate that the quantum RBF network solves binary classification problems as well as the classical RBF network, while the training time is much shorter.
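For reference, the classical RBF classifier that the quantum model mirrors can be sketched in a few lines. This is only an illustrative classical sketch, not the authors' quantum circuit; the random choice of centres, the Gaussian width gamma, and the least-squares fit of the output weights are assumptions made for the example.

```python
import numpy as np

# Illustrative classical RBF classifier (sketch only; the paper's quantum
# model encodes the weights as a tensor product of single-qubit states and
# uses quantum linear algebra to evaluate the activations).

rng = np.random.default_rng(0)

# Toy binary data: two Gaussian blobs labelled -1 and +1.
X = np.vstack([rng.normal(-1.0, 0.5, (50, 2)), rng.normal(1.0, 0.5, (50, 2))])
y = np.hstack([-np.ones(50), np.ones(50)])

# Hidden layer: Gaussian RBFs centred on a random subset of the data
# (the number of centres and the width gamma are hand-picked here).
centers = X[rng.choice(len(X), 10, replace=False)]
gamma = 1.0

def rbf_features(Z):
    d2 = ((Z[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

# Output layer: weights fitted by ordinary least squares.
Phi = rbf_features(X)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)

pred = np.sign(rbf_features(X) @ w)
print("training accuracy:", (pred == y).mean())
```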



Related research

We investigate the benefits of feature selection, nonlinear modelling and online learning when forecasting in financial time series. We consider the sequential and continual learning sub-genres of online learning. The experiments we conduct show that there is a benefit to online transfer learning, in the form of radial basis function networks, beyond the sequential updating of recursive least-squares models. We show that the radial basis function networks, which make use of clustering algorithms to construct a kernel Gram matrix, are more beneficial than treating each training vector as separate basis functions, as occurs with kernel Ridge regression. We demonstrate quantitative procedures to determine the very structure of the radial basis function networks. Finally, we conduct experiments on the log returns of financial time series and show that the online learning models, particularly the radial basis function networks, are able to outperform a random walk baseline, whereas the offline learning models struggle to do so.
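The structural contrast between an RBF network built on cluster centres and kernel ridge regression, which keeps one basis function per training vector, can be illustrated with a short sketch; the toy data, Gaussian width, ridge penalty, and number of k-means centres below are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np
from sklearn.cluster import KMeans

# Sketch: RBF network on k-means centres vs. kernel ridge regression,
# which uses every training vector as a basis function.

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 5))                 # stand-in for lagged log returns
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=500)

def gaussian_gram(A, B, gamma=0.5):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

# RBF network: 20 cluster centres -> a 500 x 20 design matrix.
centers = KMeans(n_clusters=20, n_init=10, random_state=0).fit(X).cluster_centers_
Phi = gaussian_gram(X, centers)
w = np.linalg.solve(Phi.T @ Phi + 1e-3 * np.eye(len(centers)), Phi.T @ y)

# Kernel ridge regression: the full 500 x 500 Gram matrix.
K = gaussian_gram(X, X)
alpha = np.linalg.solve(K + 1e-3 * np.eye(len(X)), y)

print("RBF-net design matrix:", Phi.shape, "| kernel Gram matrix:", K.shape)
```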
Exotic magnetic structures, such as magnetic skyrmions and domain walls, are becoming more important in nitrogen-vacancy center scanning magnetometry. However, a systematic imaging approach to mapping stray fields with fluctuation of several milliteslas generated by such structures is not yet available. Here we present a scheme to image a millitesla magnetic field by tracking the magnetic resonance frequency, which can record multiple contour lines for a magnetic field. The radial basis function algorithm is employed to reconstruct the magnetic field from the contour lines. Simulations with shot noise quantitatively confirm the high quality of the reconstruction algorithm. The method was validated by imaging the stray field of a frustrated magnet. Our scheme had a maximum detectable magnetic field gradient of 0.86 mT per pixel, which enables the efficient imaging of millitesla magnetic fields.
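A hedged sketch of the reconstruction step: fit a radial basis function interpolant to field values sampled at scattered points and evaluate it on a grid. The synthetic millitesla-scale field, the sample layout, and the choice of SciPy's RBFInterpolator with a thin-plate-spline kernel are assumptions for illustration, not the paper's actual pipeline.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Sketch of reconstructing a smooth field from scattered samples with a
# radial basis function interpolant; the samples stand in for points
# recorded along magnetic-resonance contour lines.

def true_field(xy):                                    # synthetic field in mT
    return 2.0 * np.exp(-((xy[:, 0] - 0.3) ** 2 + (xy[:, 1] - 0.4) ** 2) / 0.05)

rng = np.random.default_rng(2)
samples = rng.uniform(0.0, 1.0, size=(200, 2))         # scattered sample points
values = true_field(samples)

rbf = RBFInterpolator(samples, values, kernel='thin_plate_spline')

# Evaluate the reconstruction on a regular grid and report the error.
gx, gy = np.meshgrid(np.linspace(0, 1, 50), np.linspace(0, 1, 50))
grid = np.column_stack([gx.ravel(), gy.ravel()])
err = np.abs(rbf(grid) - true_field(grid)).max()
print(f"max reconstruction error: {err:.4f} mT")
```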
Emotion recognition (ER) from facial images is one of the landmark tasks in affective computing, with major developments in the last decade. Initial efforts on ER relied on handcrafted features that were used to characterize facial images and then fed to standard predictive models. Recent methodologies comprise end-to-end trainable deep learning methods that simultaneously learn both the features and the predictive model. Perhaps the most successful models are based on convolutional neural networks (CNNs). While these models have excelled at this task, they still fail to capture local patterns that could emerge in the learning process. We hypothesize that these patterns could be captured by variants based on locally weighted learning. Specifically, in this paper we propose a CNN-based architecture enhanced with multiple branches formed by radial basis function (RBF) units, which aims at exploiting local information at the final stage of the learning process. Intuitively, these RBF units capture local patterns shared by similar instances using an intermediate representation; the outputs of the RBFs are then fed to a softmax layer that exploits this information to improve the predictive performance of the model. This feature could be particularly advantageous in ER, as cultural / ethnicity differences may be identified by the local units. We evaluate the proposed method on several ER datasets and show that it achieves state-of-the-art results on some of them, even when a pre-trained VGG-Face model is adopted as the backbone. We show that it is the incorporation of local information that makes the proposed model competitive.
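The RBF-branch idea can be sketched as a small module with learnable centres feeding a softmax classifier; the MLP stand-in for the CNN backbone, the layer sizes, and the seven-class output are illustrative assumptions rather than the paper's architecture.

```python
import torch
import torch.nn as nn

# Minimal sketch of an RBF branch on top of a feature extractor
# (illustrative only; the paper attaches several such branches to a
# CNN / VGG-Face backbone, whereas here the backbone is a plain MLP).

class RBFUnits(nn.Module):
    """Gaussian RBF units with learnable centres and widths."""
    def __init__(self, in_dim, n_units):
        super().__init__()
        self.centers = nn.Parameter(torch.randn(n_units, in_dim))
        self.log_gamma = nn.Parameter(torch.zeros(n_units))

    def forward(self, x):
        d2 = torch.cdist(x, self.centers) ** 2          # (batch, n_units)
        return torch.exp(-torch.exp(self.log_gamma) * d2)

backbone = nn.Sequential(nn.Linear(64, 32), nn.ReLU())   # stand-in for the CNN
rbf_branch = RBFUnits(in_dim=32, n_units=16)
classifier = nn.Linear(16, 7)                            # e.g. 7 basic emotions

x = torch.randn(8, 64)                                    # a batch of features
logits = classifier(rbf_branch(backbone(x)))
probs = torch.softmax(logits, dim=-1)
print(probs.shape)                                        # torch.Size([8, 7])
```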
Random features are a central technique for scalable learning algorithms based on kernel methods. A recent work has shown that an algorithm for machine learning on a quantum computer, i.e. quantum machine learning (QML), can exponentially speed up the sampling of optimized random features, even without imposing the restrictive sparsity and low-rankness assumptions that had limited the applicability of conventional QML algorithms; this QML algorithm makes it possible to significantly reduce, and provably minimize, the number of features required for regression tasks. However, a major question in the field of QML is how widely the advantages of quantum computation can be exploited beyond regression. Here we construct a QML algorithm for a classification task accelerated by the optimized random features. We prove that the QML algorithm for sampling optimized random features, combined with stochastic gradient descent (SGD), can achieve state-of-the-art exponential convergence speed in reducing the classification error under a low-noise condition; at the same time, our algorithm with optimized random features exploits the significant reduction in the required number of features to accelerate each SGD iteration and the evaluation of the resulting classifier. These results reveal a promising application of QML: significantly accelerating a leading kernel-based classification algorithm without sacrificing its applicability to a practical class of data sets or its exponential error-convergence speed.
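As a classical point of comparison, the random-features-plus-SGD pipeline that the quantum algorithm accelerates can be sketched with plain (unoptimized) random Fourier features; the toy data, feature count, learning rate, and logistic loss are assumptions made for the example.

```python
import numpy as np

# Classical sketch: classification with random Fourier features and SGD
# (the paper's quantum algorithm samples *optimized* random features;
# here the features are drawn from a fixed Gaussian distribution).

rng = np.random.default_rng(3)

# Toy binary data with labels +1 / -1.
X = rng.normal(size=(1000, 10))
y = np.sign(X[:, 0] * X[:, 1] + 0.1 * rng.normal(size=1000))

# Random Fourier features approximating a Gaussian (RBF) kernel.
D = 200                                   # number of random features
W = rng.normal(scale=1.0, size=(10, D))
b = rng.uniform(0, 2 * np.pi, size=D)
def features(Z):
    return np.sqrt(2.0 / D) * np.cos(Z @ W + b)

# Plain SGD on the logistic loss over the random-feature map.
theta = np.zeros(D)
lr = 0.1
for epoch in range(20):
    for i in rng.permutation(len(X)):
        z = features(X[i:i + 1])[0]
        grad = -y[i] * z / (1.0 + np.exp(y[i] * (z @ theta)))
        theta -= lr * grad

acc = (np.sign(features(X) @ theta) == y).mean()
print("training accuracy:", round(acc, 3))
```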
Seung Ki Baek, Minjae Kim (2017)
We numerically solve two-dimensional heat diffusion problems by using a simple variant of the meshfree local radial-basis function (RBF) collocation method. The main idea is to include an additional set of sample nodes outside the problem domain, similarly to the method of images in electrostatics, to perform collocation on the domain boundaries. We can thereby take into account the temperature profile as well as its gradients specified by boundary conditions at the same time, which holds true even for a node where two or more boundaries meet with different boundary conditions. We argue that the image method is computationally efficient when combined with the local RBF collocation method, whereas the addition of image nodes becomes very costly in case of the global collocation. We apply our modified method to a benchmark test of a boundary value problem, and find that this simple modification reduces the maximum error from the analytic solution significantly. The reduction is small for an initial value problem with simpler boundary conditions. We observe increased numerical instability, which has to be compensated for by a sufficient number of sample nodes and/or more careful parameter choices for time integration.
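A minimal sketch of RBF collocation for a boundary value problem conveys the basic mechanics. Note this is a global 1-D Gaussian-RBF scheme with Dirichlet boundary rows, not the local 2-D scheme with image nodes described in the abstract, and the shape parameter and node count are hand-picked assumptions.

```python
import numpy as np

# Sketch of global Gaussian-RBF collocation for the 1-D Poisson problem
# u'' = f on [0, 1] with u(0) = u(1) = 0 (illustrative only; the paper
# uses a *local* 2-D collocation scheme with extra image nodes outside
# the domain to enforce boundary conditions).

eps = 4.0                                 # shape parameter (hand-picked)
x = np.linspace(0.0, 1.0, 15)             # collocation nodes = RBF centres
f = -np.pi ** 2 * np.sin(np.pi * x)       # right-hand side; exact u = sin(pi x)

def phi(x, c):                            # Gaussian RBF
    return np.exp(-(eps * (x - c)) ** 2)

def phi_xx(x, c):                         # its second derivative in x
    r = x - c
    return (4 * eps ** 4 * r ** 2 - 2 * eps ** 2) * np.exp(-(eps * r) ** 2)

# Collocation matrix: PDE rows in the interior, Dirichlet rows on the boundary.
A = phi_xx(x[:, None], x[None, :])
rhs = f.copy()
A[0, :], A[-1, :] = phi(x[0], x), phi(x[-1], x)
rhs[0], rhs[-1] = 0.0, 0.0

coeff = np.linalg.solve(A, rhs)
u = phi(x[:, None], x[None, :]) @ coeff
print("max error vs. sin(pi x):", np.abs(u - np.sin(np.pi * x)).max())
```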
