Financial time-series analysis and forecasting have been studied extensively over the past decades, yet they remain very challenging research topics. Because financial markets are inherently noisy and stochastic, the majority of financial time-series of interest are non-stationary and are often obtained from different modalities. This property presents great challenges and can significantly degrade the performance of subsequent analysis and forecasting steps. Recently, the Temporal Attention augmented Bilinear Layer (TABL) has shown great performance in tackling financial forecasting problems. In this paper, taking into account the nature of the bilinear projections in TABL networks, we propose Bilinear Normalization (BiN), a simple yet efficient normalization layer that can be incorporated into TABL networks to tackle the potential problems posed by non-stationarity and multimodality in the input series. Our experiments on a large-scale Limit Order Book (LOB) dataset consisting of more than 4 million order events show that BiN-TABL outperforms TABL networks using other state-of-the-art normalization schemes by a large margin.
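The abstract describes BiN as a normalization layer matched to the bilinear (two-mode) structure of TABL inputs, where a sample is a 2D array of features over time steps. The following is a minimal NumPy sketch of that idea, not the paper's implementation: it z-scores each sample along the temporal axis and along the feature axis separately, then mixes the two views with coefficients `l1` and `l2` (which a real layer would learn, along with per-axis affine parameters).

```python
import numpy as np

def bilinear_normalization(X, l1=0.5, l2=0.5, eps=1e-8):
    """Hedged sketch of a bilinear normalization step.

    X: array of shape (D, T) -- D features observed over T time steps.
    Returns a mixture of a temporally normalized view (each feature's
    series z-scored across time) and a feature-normalized view (each
    time step's snapshot z-scored across features). Mixing weights
    l1, l2 stand in for learnable parameters.
    """
    # Temporal normalization: per-feature statistics across time steps.
    mu_t = X.mean(axis=1, keepdims=True)          # shape (D, 1)
    sd_t = X.std(axis=1, keepdims=True) + eps
    X_time = (X - mu_t) / sd_t

    # Feature normalization: per-time-step statistics across features.
    mu_d = X.mean(axis=0, keepdims=True)          # shape (1, T)
    sd_d = X.std(axis=0, keepdims=True) + eps
    X_feat = (X - mu_d) / sd_d

    return l1 * X_time + l2 * X_feat
```

Because the statistics are computed per sample rather than over the whole training set, each input window is re-centred on its own recent history, which is one simple way to cope with the non-stationarity the abstract highlights.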
Data normalization is one of the most important preprocessing steps when building a machine learning model, especially when the model of interest is a deep neural network. This is because deep neural networks optimized with stochastic gradient descent are sensitive to the scale and distribution of their inputs.
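As a concrete point of comparison for the adaptive scheme above, the standard preprocessing baseline is a global z-score: statistics are fitted once on the training split and frozen. The sketch below (illustrative names, not from the paper) shows this baseline; with non-stationary series, these fixed statistics can drift away from the distribution seen at test time, which is the failure mode per-sample schemes aim to avoid.

```python
import numpy as np

def fit_zscore(train):
    """Fit per-feature mean/std on the training split only.

    train: array of shape (N, D) -- N samples, D features.
    Returns the frozen statistics used for all later data.
    """
    mu = train.mean(axis=0)
    sd = train.std(axis=0) + 1e-8  # avoid division by zero
    return mu, sd

def apply_zscore(X, mu, sd):
    """Apply the frozen training statistics to any split."""
    return (X - mu) / sd
```

Applying `apply_zscore` to the training split itself yields zero-mean, unit-variance features; applying it to later, regime-shifted data generally does not, which is the practical motivation for normalizing adaptively.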