Neural Network for Weighted Signal Temporal Logic


Abstract

In this paper, we propose a neuro-symbolic framework called the weighted Signal Temporal Logic Neural Network (wSTL-NN) that combines the characteristics of neural networks and temporal logics. Weighted Signal Temporal Logic (wSTL) formulas are recursively composed of subformulas that are combined using logical and temporal operators. The quantitative semantics of wSTL is defined such that the quantitative satisfaction of subformulas with higher weights has more influence on the quantitative satisfaction of the overall wSTL formula. In the wSTL-NN, each neuron corresponds to a wSTL subformula, and its output corresponds to the quantitative satisfaction of that subformula. We use the wSTL-NN to represent wSTL formulas as features for classifying time-series data; STL features are more explainable than those used in classical methods. The wSTL-NN is end-to-end differentiable, which allows wSTL formulas to be learned using back-propagation. To reduce the number of weights, we introduce two techniques to sparsify the wSTL-NN. We apply our framework to an occupancy detection time-series dataset to learn a classifier that predicts the occupancy status of an office room.
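The abstract does not give the paper's exact quantitative semantics or network architecture, so the following is only a minimal illustrative sketch of the general idea: a single differentiable "weighted conjunction" neuron that aggregates the quantitative satisfaction (robustness) values of several subformulas, with learnable subformula weights and a temperature-scaled softmin standing in for the hard min of standard STL conjunction. All names and the specific softmin aggregation are assumptions for illustration, not the paper's definition.

```python
# Illustrative sketch only (assumed semantics, not the paper's exact wSTL-NN).
# One neuron aggregates k subformula robustness values into a single value.
import torch
import torch.nn as nn


class WeightedConjunction(nn.Module):
    def __init__(self, num_subformulas: int, temperature: float = 5.0):
        super().__init__()
        # Unconstrained parameters; the softmax below keeps the effective
        # subformula weights positive and normalized.
        self.raw_weights = nn.Parameter(torch.zeros(num_subformulas))
        self.temperature = temperature

    def forward(self, robustness: torch.Tensor) -> torch.Tensor:
        # robustness: (batch, num_subformulas) quantitative satisfaction values.
        w = torch.softmax(self.raw_weights, dim=0)

        # Weighted softmin: subformulas with low robustness (near violation)
        # and high weight dominate the aggregate, approximating a weighted
        # "and" while remaining differentiable for back-propagation.
        logits = -self.temperature * robustness
        logits = logits - logits.max(dim=1, keepdim=True).values  # stability
        weighted = w * torch.exp(logits)
        attn = weighted / weighted.sum(dim=1, keepdim=True)
        return torch.sum(attn * robustness, dim=1)


# Example: aggregate the robustness of 3 subformulas for a batch of 4 signals.
rho = torch.randn(4, 3, requires_grad=True)
neuron = WeightedConjunction(num_subformulas=3)
out = neuron(rho)          # shape (4,)
out.sum().backward()       # gradients flow to both rho and the weights
```

Because every operation above is differentiable, both the subformula weights and any upstream parameters producing the robustness values can be trained end to end, which is the property the abstract highlights for learning wSTL formulas by back-propagation.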
