Existence, Stability and Scalability of Orthogonal Convolutional Neural Networks


Abstract

Imposing orthogonal transformations between the layers of a neural network has been considered for several years now. It facilitates learning by limiting the explosion/vanishing of gradients, decorrelates the features, and improves robustness. In this context, this paper studies theoretical properties of orthogonal convolutional layers. More precisely, we establish necessary and sufficient conditions on the layer architecture guaranteeing the existence of an orthogonal convolutional transform. These conditions show that orthogonal convolutional transforms exist for almost all architectures used in practice. Recently, a regularization term imposing the orthogonality of convolutional layers has been proposed. We make the link between this regularization term and orthogonality measures. In doing so, we show that this regularization strategy is stable with respect to numerical and optimization errors and remains accurate when the size of the signals/images is large. This holds for both row and column orthogonality. Finally, we confirm these theoretical results with experiments, and also empirically study the landscape of the regularization term.
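For concreteness, here is a minimal sketch of such an orthogonality penalty, assuming the regularizer in question is the kernel self-convolution term of Wang et al. (CVPR 2020); the function name `conv_orth_loss` and the stride-1 restriction are our own illustrative choices, not notation from the paper.

```python
import torch
import torch.nn.functional as F

def conv_orth_loss(kernel: torch.Tensor) -> torch.Tensor:
    """Deviation of a stride-1 conv layer from row orthogonality (sketch).

    kernel: conv weight of shape (out_channels, in_channels, k, k).
    Returns ||K * K - I_0||_F^2, where K * K is the self-convolution of
    the kernel over all relative shifts and I_0 is 1 only for identical
    filters at zero shift. The penalty is 0 iff the layer is orthogonal.
    """
    out_ch, _, k, _ = kernel.shape
    # Correlate every pair of filters at every shift; padding k-1
    # captures all (2k-1)^2 overlapping positions.
    self_conv = F.conv2d(kernel, kernel, padding=k - 1)
    # Target: identity across filter pairs, Dirac at the central shift.
    target = torch.zeros_like(self_conv)
    idx = torch.arange(out_ch)
    target[idx, idx, k - 1, k - 1] = 1.0
    return ((self_conv - target) ** 2).sum()

# Example use: add the penalty to a task loss with weight lam (hypothetical).
conv = torch.nn.Conv2d(16, 32, kernel_size=3, padding=1, bias=False)
lam = 0.1
penalty = lam * conv_orth_loss(conv.weight)
```

Swapping the roles of input and output channels in the self-convolution gives the corresponding column-orthogonality penalty; the stability result in the abstract says that driving this term to a small value keeps the layer close to an actual orthogonal transform, even for large signal/image sizes.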
