
Hybrid Quantum-Classical Graph Convolutional Network

Published by: Samuel Yen-Chi Chen
Publication date: 2021
Research field: Informatics Engineering
Paper language: English





The high energy physics (HEP) community has a long history of dealing with large-scale datasets. To manage such voluminous data, classical machine learning and deep learning techniques have been employed to accelerate physics discovery. Recent advances in quantum machine learning (QML) have indicated the potential of applying these techniques in HEP. However, only limited results on QML applications are currently available. In particular, the challenge of processing sparse data, common in HEP datasets, has not been extensively studied in QML models. This research provides a hybrid quantum-classical graph convolutional network (QGCNN) for learning HEP data. The proposed framework demonstrates an advantage over classical multilayer perceptrons and convolutional neural networks in terms of the number of parameters. Moreover, in terms of testing accuracy, the QGCNN shows comparable performance to a quantum convolutional neural network on the same HEP dataset while requiring less than $50\%$ of the parameters. Based on numerical simulation results, applying graph convolutional operations and other QML models may prove promising in advancing HEP research and other scientific fields.
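The abstract does not spell out the circuit layout, so the following is only a minimal sketch of the general idea, assuming a PennyLane/PyTorch stack: a classical graph convolution aggregates node features over a normalized adjacency matrix, and a small parameterized quantum circuit then acts as the trainable feature map whose Pauli-Z expectation values feed a classical readout. The four-qubit circuit, the entangler template, and all layer sizes are illustrative assumptions, not the authors' architecture.

import torch
import torch.nn as nn
import pennylane as qml

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev, interface="torch")
def node_circuit(inputs, weights):
    # Encode aggregated node features as rotation angles, then apply trainable entanglers.
    qml.AngleEmbedding(inputs, wires=range(n_qubits))
    qml.BasicEntanglerLayers(weights, wires=range(n_qubits))
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

class HybridGraphConvLayer(nn.Module):
    """Classical neighborhood aggregation followed by a quantum feature map."""
    def __init__(self, in_feats, n_layers=2):
        super().__init__()
        self.proj = nn.Linear(in_feats, n_qubits)   # match feature size to qubit count
        self.qlayer = qml.qnn.TorchLayer(node_circuit, {"weights": (n_layers, n_qubits)})

    def forward(self, adj_norm, x):
        h = adj_norm @ x               # graph convolution: average over neighbors
        h = torch.tanh(self.proj(h))   # keep rotation angles bounded
        return self.qlayer(h)          # per-node Pauli-Z expectation values

# Toy usage: 5 nodes with 8 features each, one graph-level score.
adj = torch.clamp(torch.eye(5) + (torch.rand(5, 5) > 0.5).float(), max=1.0)
adj_norm = adj / adj.sum(dim=1, keepdim=True)       # row normalization
x = torch.rand(5, 8)
layer = HybridGraphConvLayer(in_feats=8)
readout = nn.Linear(n_qubits, 1)
score = readout(layer(adj_norm, x).float().mean(dim=0))   # pool nodes, then classify

The quantum layer here plays the role the abstract attributes to the QGCNN's variational part: the classical aggregation handles the sparse graph structure, while the circuit supplies a compact, trainable nonlinearity.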




Read also

Graph Convolutional Networks (GCNs) have received increasing attention in the machine learning community for effectively leveraging both the content features of nodes and the linkage patterns across graphs in various applications. As real-world graphs are often incomplete and noisy, treating them as ground-truth information, which is a common practice in most GCNs, unavoidably leads to sub-optimal solutions. Existing efforts for addressing this problem either involve an over-parameterized model which is difficult to scale, or simply re-weight observed edges without dealing with the missing-edge issue. This paper proposes a novel framework called Graph-Revised Convolutional Network (GRCN), which avoids both extremes. Specifically, a GCN-based graph revision module is introduced for predicting missing edges and revising edge weights w.r.t. downstream tasks via joint optimization. A theoretical analysis reveals the connection between GRCN and previous work on multigraph belief propagation. Experiments on six benchmark datasets show that GRCN consistently outperforms strong baseline methods by a large margin, especially when the original graphs are severely incomplete or the labeled instances for model training are highly sparse.
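The paper's exact formulation is not reproduced in this summary; the snippet below is only a rough PyTorch sketch of the idea as described above, assuming a GCN-style encoder whose pairwise embedding similarities score candidate edges, with the revised adjacency passed to a downstream GCN and both modules trained jointly. All names and sizes are illustrative.

import torch
import torch.nn as nn

class GraphRevision(nn.Module):
    """Sketch of a graph-revision step: predict and reweight edges from node embeddings."""
    def __init__(self, in_feats, hid_feats):
        super().__init__()
        self.encoder = nn.Linear(in_feats, hid_feats)

    def forward(self, adj, x):
        z = torch.relu(self.encoder(adj @ x))    # GCN-style embedding of each node
        scores = torch.sigmoid(z @ z.t())        # dense edge scores from similarity
        return adj + scores                      # revised adjacency: observed + predicted edges

class DownstreamGCN(nn.Module):
    def __init__(self, in_feats, n_classes):
        super().__init__()
        self.out = nn.Linear(in_feats, n_classes)

    def forward(self, adj_rev, x):
        deg = adj_rev.sum(dim=1, keepdim=True)
        return self.out((adj_rev / deg) @ x)     # normalized aggregation + classifier

# Joint optimization over both modules on a toy graph.
adj = torch.eye(6)
x = torch.rand(6, 10)
labels = torch.randint(0, 3, (6,))
revise, gcn = GraphRevision(10, 16), DownstreamGCN(10, 3)
opt = torch.optim.Adam(list(revise.parameters()) + list(gcn.parameters()), lr=0.01)
loss = nn.functional.cross_entropy(gcn(revise(adj, x), x), labels)
loss.backward()
opt.step()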
Telecommunication networks play a critical role in modern society. With the arrival of 5G networks, these systems are becoming even more diversified, integrated, and intelligent. Traffic forecasting is one of the key components in such a system, however, it is particularly challenging due to the complex spatial-temporal dependency. In this work, we consider this problem from the aspect of a cellular network and the interactions among its base stations. We thoroughly investigate the characteristics of cellular network traffic and shed light on the dependency complexities based on data collected from a densely populated metropolis area. Specifically, we observe that the traffic shows both dynamic and static spatial dependencies as well as diverse cyclic temporal patterns. To address these complexities, we propose an effective deep-learning-based approach, namely, Spatio-Temporal Hybrid Graph Convolutional Network (STHGCN). It employs GRUs to model the temporal dependency, while capturing the complex spatial dependency through a hybrid-GCN from three perspectives: spatial proximity, functional similarity, and recent trend similarity. We conduct extensive experiments on real-world traffic datasets collected from telecommunication networks. Our experimental results demonstrate the superiority of the proposed model in that it consistently outperforms both classical methods and state-of-the-art deep learning models, while being more robust and stable.
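As a rough illustration of the "GRU for time, hybrid GCN over several graph views for space" structure described above, a minimal PyTorch sketch might look as follows. The three adjacency "views", the averaging fusion, and every dimension are assumptions made for the example, not the STHGCN authors' design.

import torch
import torch.nn as nn

class HybridGCN(nn.Module):
    """Aggregate node features over several adjacency views (e.g. proximity, function, trend)."""
    def __init__(self, feats):
        super().__init__()
        self.proj = nn.Linear(feats, feats)

    def forward(self, adjs, x):
        views = [a @ x for a in adjs]            # one aggregation per graph view
        return torch.relu(self.proj(torch.stack(views).mean(dim=0)))

class STHGCNSketch(nn.Module):
    def __init__(self, feats, hidden):
        super().__init__()
        self.gcn = HybridGCN(feats)
        self.gru = nn.GRU(feats, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)         # next-step traffic per base station

    def forward(self, adjs, seq):
        # seq: (stations, timesteps, feats) history of traffic features
        spatial = torch.stack([self.gcn(adjs, seq[:, t]) for t in range(seq.size(1))], dim=1)
        _, h = self.gru(spatial)                 # temporal dependency via GRU
        return self.head(h[-1])

# Toy usage: 8 base stations, 12 past timesteps, 4 features, 3 graph views.
adjs = [torch.softmax(torch.rand(8, 8), dim=1) for _ in range(3)]
seq = torch.rand(8, 12, 4)
pred = STHGCNSketch(feats=4, hidden=16)(adjs, seq)   # (8, 1) forecast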
Deep learning has been shown to be able to recognize data patterns better than humans in specific circumstances or contexts. In parallel, quantum computing has demonstrated to be able to output complex wave functions with a small number of gate operations, which could generate distributions that are hard for a classical computer to produce. Here we propose a hybrid quantum-classical convolutional neural network (QCCNN), inspired by convolutional neural networks (CNNs) but adapted to quantum computing to enhance the feature mapping process. QCCNN is friendly to currently noisy intermediate-scale quantum computers, in terms of both number of qubits as well as circuit depths, while retaining important features of classical CNN, such as nonlinearity and scalability. We also present a framework to automatically compute the gradients of hybrid quantum-classical loss functions which could be directly applied to other hybrid quantum-classical algorithms. We demonstrate the potential of this architecture by applying it to a Tetris dataset, and show that QCCNN can accomplish classification tasks with learning accuracy surpassing that of classical CNN.
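One common way to realize such a quantum-enhanced feature map, sketched below under the assumption of a PennyLane/PyTorch setup, is to slide a small variational circuit over image patches in place of a classical convolution kernel. The 2x2 patch size, the entangling template, and the tiny "Tetris-style" input are illustrative choices, not the QCCNN paper's specification.

import torch
import torch.nn as nn
import pennylane as qml

n_qubits = 4  # one qubit per pixel of a 2x2 patch
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev, interface="torch")
def patch_circuit(inputs, weights):
    qml.AngleEmbedding(inputs, wires=range(n_qubits))        # encode the patch
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

class QuantumConv(nn.Module):
    """Slide the circuit over 2x2 patches, analogous to a convolutional feature map."""
    def __init__(self, n_layers=1):
        super().__init__()
        shapes = {"weights": (n_layers, n_qubits, 3)}        # StronglyEntanglingLayers shape
        self.qfilter = qml.qnn.TorchLayer(patch_circuit, shapes)

    def forward(self, img):
        # img: (H, W) single-channel image; stride-2, non-overlapping patches
        H, W = img.shape
        patches = img.unfold(0, 2, 2).unfold(1, 2, 2).reshape(-1, 4)
        feats = self.qfilter(patches)                        # (n_patches, n_qubits)
        return feats.reshape(H // 2, W // 2, n_qubits)

img = torch.rand(4, 4)            # e.g. a tiny Tetris-style board
fmap = QuantumConv()(img)         # (2, 2, 4) quantum feature map
logits = nn.Linear(4, 2)(fmap.float().mean(dim=(0, 1)))     # classical readout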
Ian Walker, Ben Glocker, 2019
We propose a novel Bayesian nonparametric method to learn translation-invariant relationships on non-Euclidean domains. The resulting graph convolutional Gaussian processes can be applied to problems in machine learning for which the input observations are functions with domains on general graphs. The structure of these models allows for high dimensional inputs while retaining expressibility, as is the case with convolutional neural networks. We present applications of graph convolutional Gaussian processes to images and triangular meshes, demonstrating their versatility and effectiveness, comparing favorably to existing methods, despite being relatively simple models.
Inspired by the success of classical neural networks, there has been tremendous effort to extend effective classical neural networks into the quantum domain. In this paper, a novel hybrid quantum-classical neural network with deep residual learning (Res-HQCNN) is proposed. We first analyze how to connect the residual block structure with a quantum neural network, and give the corresponding training algorithm. At the same time, the advantages and disadvantages of transferring deep residual learning to the quantum setting are provided. As a result, the model can be trained in an end-to-end fashion, analogous to backpropagation in classical neural networks. To explore the effectiveness of Res-HQCNN, we perform extensive experiments on quantum data with and without noise on a classical computer. The experimental results show that Res-HQCNN learns an unknown unitary transformation better and is more robust to noisy data than the state of the art. Moreover, possible methods of combining residual learning with quantum neural networks are also discussed.
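The core structural idea, a skip connection wrapped around a quantum layer so the whole model trains end-to-end, can be sketched as below, again assuming PennyLane with the PyTorch interface. The specific circuit, input dimension, and loss are placeholders for illustration, not the Res-HQCNN construction.

import torch
import torch.nn as nn
import pennylane as qml

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev, interface="torch")
def qnn_block(inputs, weights):
    qml.AngleEmbedding(inputs, wires=range(n_qubits))
    qml.BasicEntanglerLayers(weights, wires=range(n_qubits))
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

class ResidualQuantumBlock(nn.Module):
    """x + f(x): a residual connection wrapped around a variational quantum layer."""
    def __init__(self, n_layers=2):
        super().__init__()
        self.qlayer = qml.qnn.TorchLayer(qnn_block, {"weights": (n_layers, n_qubits)})

    def forward(self, x):
        return x + self.qlayer(x)         # skip connection, trained end-to-end

x = torch.rand(3, n_qubits)               # batch of 3 four-dimensional inputs
block = ResidualQuantumBlock()
out = block(x)                             # same shape as x
loss = (out - torch.rand_like(out)).pow(2).mean()
loss.backward()                            # gradients flow through the quantum layer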

