Min-Max-Plus Neural Networks


English Abstract

We present a new model of neural networks, called Min-Max-Plus Neural Networks (MMP-NNs), based on operations in tropical arithmetic. In general, an MMP-NN is composed of three types of alternately stacked layers: linear layers, min-plus layers, and max-plus layers. The latter two types of layers constitute the nonlinear part of the network, which is trainable and more expressive than the nonlinear part of conventional neural networks. We show that, owing to this greater capacity for expressing nonlinearity, MMP-NNs are universal approximators of continuous functions, even when the number of multiplication operations is drastically reduced (possibly to none in certain extreme cases). Furthermore, we formulate the backpropagation algorithm for training MMP-NNs and introduce a normalization algorithm that improves the rate of convergence during training.
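To make the layer types concrete, the following is a minimal NumPy sketch of a linear / min-plus / max-plus stack. The layer definitions (a min-plus layer computing y[j] = min_i(W[j, i] + x[i]) and its max-plus analogue), the weight shapes, and the stacking order are assumptions based on standard tropical arithmetic for illustration, not the paper's exact formulation.

```python
import numpy as np

def linear_layer(x, W, b):
    # Conventional affine map: y = W @ x + b
    return W @ x + b

def min_plus_layer(x, W):
    # Tropical (min-plus) matrix-vector product (assumed form):
    # y[j] = min_i (W[j, i] + x[i]) -- addition replaces multiplication
    # and min replaces summation, so the layer uses no multiplications.
    return np.min(W + x[None, :], axis=1)

def max_plus_layer(x, W):
    # Max-plus analogue (assumed form): y[j] = max_i (W[j, i] + x[i])
    return np.max(W + x[None, :], axis=1)

# Toy forward pass through one alternately stacked block
# (hypothetical layer sizes chosen only for illustration).
rng = np.random.default_rng(0)
x = rng.normal(size=4)
h = linear_layer(x, rng.normal(size=(8, 4)), rng.normal(size=8))
h = min_plus_layer(h, rng.normal(size=(6, 8)))
h = max_plus_layer(h, rng.normal(size=(3, 6)))
print(h.shape)  # (3,)
```

Note that in this sketch only the linear layer performs multiplications; the min-plus and max-plus weights enter purely through additions and comparisons, which is consistent with the abstract's claim that multiplication operations can be greatly reduced.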
