
State Heterogeneity Analysis of Financial Volatility Using High-Frequency Financial Data

Added by Dohyun Chun
Publication date: 2021
Language: English





Recently, several volatility models employing high-frequency financial data have been developed to account for low-frequency market dynamics. However, in financial markets we often observe that volatility processes depend on economic states, so they have a state-heterogeneous structure. In this paper, to study state-heterogeneous market dynamics based on high-frequency data, we introduce a novel volatility model based on a continuous Ito diffusion process whose intraday instantaneous volatility process evolves depending on an exogenous state variable as well as its integrated volatility. We call it the state heterogeneous GARCH-Ito (SG-Ito) model. We suggest a quasi-likelihood estimation procedure with a realized volatility proxy and establish its asymptotic behavior. Moreover, to test for low-frequency state heterogeneity, we develop a Wald-type hypothesis testing procedure. The results of empirical studies suggest the existence of leverage, investor attention, market illiquidity, stock market comovement, and post-holiday effects in S&P 500 index volatility.
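To make the idea concrete, the sketch below computes a daily realized-volatility proxy from intraday prices and runs a hypothetical state-dependent GARCH-type recursion in the spirit of the SG-Ito model. The function names, the linear-in-state specification, and the parameters (omega, alpha, beta, gamma) are illustrative assumptions, not the paper's exact model or quasi-likelihood estimator.

```python
import numpy as np

def realized_volatility(intraday_prices):
    """Realized volatility proxy: sum of squared intraday log-returns."""
    log_returns = np.diff(np.log(intraday_prices))
    return np.sum(log_returns ** 2)

def state_dependent_volatility(rv, state, omega, alpha, beta, gamma):
    """Hypothetical low-frequency recursion in the spirit of a state-heterogeneous
    GARCH-Ito model: today's conditional integrated volatility responds to
    yesterday's realized volatility, its own lag, and an exogenous state variable.
    rv, state: NumPy arrays of daily realized volatilities and daily state values."""
    h = np.empty(len(rv))
    h[0] = rv.mean()  # crude initialization for the sketch
    for t in range(1, len(rv)):
        h[t] = omega + alpha * rv[t - 1] + beta * h[t - 1] + gamma * state[t - 1]
    return h
```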

Related research


This work is devoted to the modeling of geophysical and financial time series. A class of volatility models with time-varying parameters is presented to forecast the volatility of time series in a stationary environment. Modeling stationary time series with consistent properties facilitates prediction with greater certainty. Using the GARCH and stochastic volatility models, we forecast one-step-ahead volatility with +/- 2 standard prediction errors, with estimation carried out via maximum likelihood. We compare the stochastic volatility model, which relies on a filtering technique for the conditional volatility, with the GARCH model. We conclude that stochastic volatility is a better forecasting tool than GARCH(1, 1), since it is less conditioned by autoregressive past information.
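As an illustration of the GARCH(1,1) side of that comparison, the following sketch fits the model by Gaussian maximum likelihood and produces a one-step-ahead variance forecast. The starting values, bounds, and initialization of the variance filter are illustrative choices, and the filtering-based stochastic volatility model is not reproduced here.

```python
import numpy as np
from scipy.optimize import minimize

def garch11_filter(params, r):
    """Conditional variance path of a GARCH(1,1) model for demeaned returns r."""
    omega, alpha, beta = params
    h = np.empty(len(r))
    h[0] = np.var(r)  # illustrative initialization
    for t in range(1, len(r)):
        h[t] = omega + alpha * r[t - 1] ** 2 + beta * h[t - 1]
    return h

def garch11_negloglik(params, r):
    """Gaussian negative log-likelihood (constant terms dropped)."""
    h = garch11_filter(params, r)
    return 0.5 * np.sum(np.log(h) + r ** 2 / h)

def forecast_one_step(r):
    """Fit GARCH(1,1) by maximum likelihood and return the one-step-ahead variance."""
    res = minimize(garch11_negloglik, x0=[0.1 * np.var(r), 0.05, 0.90], args=(r,),
                   bounds=[(1e-8, None), (0.0, 1.0), (0.0, 1.0)])
    omega, alpha, beta = res.x
    h = garch11_filter(res.x, r)
    return omega + alpha * r[-1] ** 2 + beta * h[-1]
```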
Financial time-series analysis and forecasting have been extensively studied over the past decades, yet they remain very challenging research topics. Since the financial market is inherently noisy and stochastic, the majority of financial time series of interest are non-stationary and are often obtained from different modalities. This property presents great challenges and can significantly affect the performance of subsequent analysis and forecasting steps. Recently, the Temporal Attention augmented Bilinear Layer (TABL) has shown strong performance in tackling financial forecasting problems. In this paper, by taking into account the nature of bilinear projections in TABL networks, we propose Bilinear Normalization (BiN), a simple yet efficient normalization layer to be incorporated into TABL networks to tackle potential problems posed by non-stationarity and multimodality in the input series. Our experiments on a large-scale Limit Order Book (LOB) dataset consisting of more than 4 million order events show that BiN-TABL outperforms TABL networks using other state-of-the-art normalization schemes by a large margin.
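A rough sketch of what a bilinear normalization layer can look like is given below: the input slice is z-scored separately along the time axis and the feature axis, and the two normalized views are mixed. The mixing weights are fixed here for illustration, whereas in a trainable layer they would be learned; this is not claimed to reproduce the exact BiN formulation.

```python
import numpy as np

def bilinear_normalization(X, w_time=0.5, w_feat=0.5, eps=1e-8):
    """Sketch of a BiN-style layer for an input of shape (features, time_steps):
    z-score along the time axis and along the feature axis, then mix the two
    normalized views with (here fixed, in practice learnable) weights."""
    time_norm = (X - X.mean(axis=1, keepdims=True)) / (X.std(axis=1, keepdims=True) + eps)
    feat_norm = (X - X.mean(axis=0, keepdims=True)) / (X.std(axis=0, keepdims=True) + eps)
    return w_time * time_norm + w_feat * feat_norm
```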
Several novel statistical methods have been developed to estimate large integrated volatility matrices based on high-frequency financial data. To investigate their asymptotic behavior, these methods require a sub-Gaussian or finite high-order moment assumption for observed log-returns, which cannot account for the heavy-tail phenomenon of stock returns. Recently, a robust estimator was developed to handle heavy-tailed distributions under a bounded fourth-moment assumption. However, we often observe that log-returns have heavier-tailed distributions than a finite fourth moment allows, and that the degree of tail heaviness is heterogeneous across assets and time periods. In this paper, to deal with heterogeneous heavy-tailed distributions, we develop an adaptive robust integrated volatility estimator that employs pre-averaging and truncation schemes based on jump-diffusion processes. We call this the adaptive robust pre-averaging realized volatility (ARP) estimator. We show that the ARP estimator has sub-Weibull tail concentration with only finite $2\alpha$-th moments for any $\alpha > 1$. In addition, we establish matching upper and lower bounds to show that the ARP estimation procedure is optimal. To estimate large integrated volatility matrices under the approximate factor model, the ARP estimator is further regularized using the principal orthogonal complement thresholding (POET) method. A numerical study is conducted to check the finite-sample performance of the ARP estimator.
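The sketch below illustrates the two ingredients the abstract combines, pre-averaging of noisy returns and truncation of unusually large increments. The triangular weights, the block length K, the truncation threshold, and the final rescaling are illustrative choices and omit the bias and scaling corrections of the actual ARP estimator.

```python
import numpy as np

def preaveraged_truncated_rv(log_prices, K=10, c=4.0):
    """Illustrative pre-averaging + truncation volatility estimate.
    Pre-averaging damps microstructure noise; truncation guards against
    jumps and heavy-tailed increments."""
    r = np.diff(log_prices)
    n = len(r)
    # Triangular weights over a block of K - 1 consecutive returns.
    w = np.minimum(np.arange(1, K), K - np.arange(1, K)) / K
    pre = np.array([w @ r[i:i + K - 1] for i in range(n - K + 2)])
    # Drop pre-averaged returns beyond c standard deviations (crude truncation).
    pre = pre[np.abs(pre) <= c * np.std(pre)]
    # Sum of squares, crudely rescaled for the block overlap; the proper
    # correction constants of the ARP estimator are omitted here.
    return np.sum(pre ** 2) / K
```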
Given a stationary point process, an intensity burst is defined as a short time period during which the number of counts is larger than the typical count rate. It might signal a local non-stationarity or the presence of an external perturbation to the system. In this paper we propose a novel procedure for the detection of intensity bursts within the Hawkes process framework. Using a model selection scheme, we show that our procedure can detect intensity bursts when both their occurrence times and their total number are unknown. Moreover, the initial time of a burst can be determined with a precision given by the typical inter-event time. We apply our methodology to mid-price changes in FX markets, showing that these bursts are frequent and that only a relatively small fraction is associated with news arrivals. We show lead-lag relations in intensity-burst occurrence across different FX rates and discuss their relation with price jumps.
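For reference, the conditional intensity of a univariate Hawkes process with an exponential kernel, the building block of such a detection procedure, can be evaluated as in the sketch below. The parameter values are illustrative, and the model-selection step used to locate bursts is not reproduced.

```python
import numpy as np

def hawkes_intensity(event_times, t, mu=0.5, alpha=0.8, beta=1.5):
    """lambda(t) = mu + alpha * sum_{t_i < t} exp(-beta * (t - t_i)).
    event_times: NumPy array of past event times; mu is the baseline rate,
    alpha and beta the excitation size and decay (illustrative values).
    An intensity burst is a period where the observed count rate exceeds
    what this self-exciting baseline would produce."""
    past = event_times[event_times < t]
    return mu + alpha * np.sum(np.exp(-beta * (t - past)))
```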
G. Rotundo (2006)
Technical analysis (TA) was used long before the availability of more sophisticated instruments for financial forecasting in order to suggest decisions on the basis of the occurrence of data patterns. Many mathematical and statistical tools for quantitative analysis of financial markets have since experienced fast and wide growth and have the power to overcome classical technical analysis methods. This paper aims to give a measure of the reliability of some information used in TA by exploring the probability of its occurrence within a particular microeconomic agent-based model of markets, i.e., the co-evolution Bak-Sneppen model originally invented for describing species population evolution. After demonstrating the practical interest of such a model in describing the so-called avalanches of a financial index during the pre-bursting rise of a bubble, attention focuses on the occurrence of trend lines crossing meaningful barriers, the events that give rise to some usual technical analysis strategies. The case of the NASDAQ crash of April 2000 serves as an illustration.
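For readers unfamiliar with the dynamics that abstract builds on, the elementary update rule of the Bak-Sneppen co-evolution model is sketched below; the ring topology and uniform redraws are the standard formulation, while the mapping to index avalanches and trend-line barriers follows the paper and is not reproduced here.

```python
import numpy as np

def bak_sneppen_step(fitness, rng):
    """One update of the Bak-Sneppen model on a ring: the least-fit site and its
    two neighbours receive fresh uniform random fitness values. Iterating this
    rule generates the avalanche dynamics the abstract refers to."""
    n = len(fitness)
    i = int(np.argmin(fitness))
    for j in ((i - 1) % n, i, (i + 1) % n):
        fitness[j] = rng.random()
    return fitness

# Example: evolve 100 sites for 10,000 steps.
rng = np.random.default_rng(0)
fitness = rng.random(100)
for _ in range(10_000):
    fitness = bak_sneppen_step(fitness, rng)
```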
