Regularizers help deep neural networks prevent feature co-adaptation. Dropout, a commonly used regularization technique, stochastically disables neuron activations during network optimization. However, such complete feature disposal can affect the feature representation and network understanding. Toward better descriptions of latent representations, we present DropGraph, which learns a regularization function by constructing a stand-alone graph from the backbone features. DropGraph first samples stochastic spatial feature vectors and then applies graph reasoning to generate feature map distortions. This add-on graph regularizes the network during training and can be skipped entirely during inference. We provide intuitions on the link between graph reasoning and Dropout, with further discussion of how partial graph reasoning reduces feature correlations. To this end, we extensively study the modeling of graph vertex dependencies and the use of the graph for distorting backbone feature maps. DropGraph is validated on four tasks spanning seven datasets. The experimental results show that our method outperforms other state-of-the-art regularizers while leaving the base model structure unmodified during inference.
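The abstract does not spell out how the stand-alone graph is built or which graph reasoning operator is used. The sketch below illustrates the general idea only: the `DropGraph` module name, the random spatial sampling, the affinity-based message passing, and the additive distortion are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


class DropGraph(nn.Module):
    """Hypothetical sketch of a DropGraph-style regularizer.

    Samples a subset of spatial feature vectors from the backbone feature
    map, runs a simple graph-reasoning step (here: affinity-weighted message
    passing over the sampled vertices), and adds the result back as a
    distortion. The module is only active in training mode, so it can be
    skipped entirely at inference.
    """

    def __init__(self, channels: int, sample_ratio: float = 0.1):
        super().__init__()
        self.sample_ratio = sample_ratio
        self.proj = nn.Conv1d(channels, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if not self.training:
            return x  # regularizer is skipped at inference

        b, c, h, w = x.shape
        n = h * w
        k = max(1, int(n * self.sample_ratio))

        feats = x.flatten(2)                                # (B, C, N)
        idx = torch.randperm(n, device=x.device)[:k]        # stochastic spatial sampling
        nodes = feats[:, :, idx]                            # sampled graph vertices (B, C, K)

        # Simple graph reasoning: pairwise affinities between vertices,
        # followed by message passing and a learned projection.
        adj = torch.softmax(nodes.transpose(1, 2) @ nodes, dim=-1)  # (B, K, K)
        msg = self.proj(nodes @ adj)                                # (B, C, K)

        # Scatter the distortion back onto the sampled spatial positions.
        distortion = torch.zeros_like(feats)
        distortion[:, :, idx] = msg
        return x + distortion.view(b, c, h, w)
```

A backbone could insert such a module after an intermediate convolutional block; because `forward` returns `x` unchanged in evaluation mode, removing it at inference leaves the base model structure unmodified, as the abstract describes.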
The graph Laplacian regularization term is usually used in semi-supervised representation learning to provide graph structure information for a model $f(X)$. However, with the recent popularity of graph neural networks (GNNs), directly encoding graph
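For reference, the graph Laplacian regularization term mentioned here is commonly written (e.g., in semi-supervised node classification) as

\[
\mathcal{L} \;=\; \mathcal{L}_0 \;+\; \lambda\,\mathcal{L}_{\mathrm{reg}},
\qquad
\mathcal{L}_{\mathrm{reg}} \;=\; \sum_{i,j} A_{ij}\,\bigl\| f(X_i) - f(X_j) \bigr\|^{2}
\;=\; \operatorname{tr}\!\bigl( f(X)^{\top} \Delta\, f(X) \bigr),
\]

where $\mathcal{L}_0$ is the supervised loss on labeled nodes, $A$ is the adjacency matrix, $\lambda$ is a weighting factor, and $\Delta = D - A$ is the unnormalized graph Laplacian with degree matrix $D$. This notation is the standard form rather than one taken from the truncated abstract.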
The complexity and non-Euclidean structure of graph data hinder the development of data augmentation methods similar to those in computer vision. In this paper, we propose a feature augmentation method for graph nodes based on topological regularizat
Interpretable brain network models for disease prediction are of great value for the advancement of neuroscience. GNNs are promising to model complicated network data, but they are prone to overfitting and suffer from poor interpretability, which pre
Effectively combining logic reasoning and probabilistic inference has been a long-standing goal of machine learning: the former has the ability to generalize with small training data, while the latter provides a principled framework for dealing with
We describe an adversarial learning approach to constrain convolutional neural network training for image registration, replacing heuristic smoothness measures of displacement fields often used in these tasks. Using minimally-invasive prostate cancer