
Detection of intensity bursts using Hawkes processes: an application to high frequency financial data

Posted by Marcello Rambaldi
Publication date: 2016
Research field: Finance
Paper language: English

Given a stationary point process, an intensity burst is defined as a short time period during which the number of counts is larger than the typical count rate. It might signal a local non-stationarity or the presence of an external perturbation to the system. In this paper we propose a novel procedure for the detection of intensity bursts within the Hawkes process framework. By using a model selection scheme we show that our procedure can detect intensity bursts even when both their occurrence times and their total number are unknown. Moreover, the initial time of a burst can be determined with a precision given by the typical inter-event time. We apply our methodology to mid-price changes in FX markets, showing that these bursts are frequent and that only a relatively small fraction of them is associated with news arrivals. We document lead-lag relations in intensity-burst occurrence across different FX rates and discuss their relation to price jumps.
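
As a rough illustration of the detection step, the sketch below (in Python) fits an exponential Hawkes model with and without a single candidate burst term and keeps the burst only if it lowers a BIC-type penalized likelihood. The exponential kernel, the exponentially decaying burst shape, the function names and the BIC penalty are assumptions made for this sketch, not the paper's exact specification.

import numpy as np
from scipy.optimize import minimize

def hawkes_loglik(params, times, T, burst_time=None):
    # lambda(t) = mu + sum_{t_j < t} alpha*exp(-beta*(t - t_j))
    #             (+ b*exp(-gamma*(t - burst_time)) for t >= burst_time)
    if burst_time is None:
        mu, alpha, beta = params
    else:
        mu, alpha, beta, b, gamma = params
    lam = np.full(len(times), float(mu))
    for i, t in enumerate(times):
        past = times[:i]
        lam[i] += alpha * np.exp(-beta * (t - past)).sum()
        if burst_time is not None and t >= burst_time:
            lam[i] += b * np.exp(-gamma * (t - burst_time))
    # compensator: integral of lambda over [0, T]
    comp = mu * T + (alpha / beta) * (1.0 - np.exp(-beta * (T - times))).sum()
    if burst_time is not None:
        comp += (b / gamma) * (1.0 - np.exp(-gamma * (T - burst_time)))
    return np.log(lam).sum() - comp

def fit(times, T, burst_time=None):
    k = 3 if burst_time is None else 5
    res = minimize(lambda p: -hawkes_loglik(p, times, T, burst_time),
                   x0=np.full(k, 0.5), bounds=[(1e-6, None)] * k, method="L-BFGS-B")
    bic = k * np.log(len(times)) + 2.0 * res.fun   # penalizes the extra burst parameters
    return res, bic

# Usage: with event times in `times` (a sorted NumPy array in [0, T]) and a candidate
# burst time t0, the burst is retained only if the richer model has the lower BIC.
# _, bic_plain = fit(times, T)
# _, bic_burst = fit(times, T, burst_time=t0)
# burst_detected = bic_burst < bic_plain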




Read also

We consider a 2-dimensional marked Hawkes process with an increasing baseline intensity in order to model prices on electricity intraday markets. This model allows us to represent several empirical facts, such as increasing market activity and random jump sizes, but above all microstructure noise through the signature plot. This last feature is of particular importance for practitioners and had not yet been modeled on these particular markets. We provide analytic formulas for the first and second moments and for the signature plot, extending the classic results of Bacry et al. (2013) to Hawkes processes with random jump sizes and a time-dependent baseline intensity. The tractable model we propose is estimated on German data and appears to fit the data well. We also provide a result on the convergence of the price process to a Brownian motion with increasing volatility at macroscopic scales, highlighting the Samuelson effect.
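
For intuition, here is a minimal Python sketch, under assumptions made only for illustration (a linear baseline mu(t) = mu0 + mu1*t, a single exponential kernel, i.i.d. jump sizes and placeholder parameter values), of how such a process can be simulated by Ogata's thinning algorithm and how an empirical signature plot (realized variance as a function of the sampling step) can be read off the resulting price path.

import numpy as np

rng = np.random.default_rng(0)
T = 3600.0                                 # one hour of trading (illustrative)
mu0, mu1 = 0.05, 1e-4                      # linear baseline mu(t) = mu0 + mu1*t
alpha, beta = 0.4, 1.2                     # exponential kernel alpha*exp(-beta*u)

def excitation(t, times):
    times = np.asarray(times)
    return alpha * np.exp(-beta * (t - times)).sum() if len(times) else 0.0

def simulate(T):
    # Ogata thinning for lambda(t) = mu0 + mu1*t + sum_j alpha*exp(-beta*(t - t_j))
    times, t = [], 0.0
    while True:
        lam_bar = mu0 + mu1 * T + excitation(t, times)   # dominates lambda on [t, T]
        t += rng.exponential(1.0 / lam_bar)
        if t >= T:
            return np.array(times)
        if rng.uniform() < (mu0 + mu1 * t + excitation(t, times)) / lam_bar:
            times.append(t)

def signature_plot(event_times, marks, taus):
    # realized variance per unit time of the cumulative-mark price, vs sampling step tau
    out = []
    for tau in taus:
        grid = np.arange(0.0, T + tau, tau)
        price = np.array([marks[event_times <= g].sum() for g in grid])
        out.append(np.sum(np.diff(price) ** 2) / T)
    return np.array(out)

events = simulate(T)
marks = rng.choice([-1.0, 1.0], size=len(events)) * rng.exponential(0.01, size=len(events))
print(signature_plot(events, marks, taus=[1.0, 5.0, 30.0, 120.0]))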
The analysis of the intraday dynamics of correlations among high-frequency returns is challenging due to the presence of asynchronous trading and market microstructure noise. Both effects may lead to significant data reduction and may severely underestimate correlations if traditional methods for low-frequency data are employed. We propose to model intraday log-prices through a multivariate local-level model with score-driven covariance matrices and to treat asynchronicity as a missing-value problem. The main advantages of this approach are: (i) all available data are used when filtering correlations, (ii) market microstructure noise is taken into account, (iii) estimation is performed through standard maximum likelihood methods. Our empirical analysis, performed on 1-second NYSE data, shows that opening hours are dominated by idiosyncratic risk and that a market factor progressively emerges in the second part of the day. The method can be used as a nowcasting tool for high-frequency data, allowing one to study the real-time response of covariances to macro-news announcements and to build intraday portfolios with very short optimization horizons.
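
A simplified sketch of the missing-value treatment is given below in Python: asynchronous ticks are stored as NaN and a local-level (random walk plus noise) Kalman filter updates only the components observed in each second. The constant covariance matrices Q and R, the synthetic data and all parameter values are assumptions of this sketch; in the paper the covariance matrices are score-driven rather than constant.

import numpy as np

def local_level_filter(Y, Q, R):
    # Y: (T, n) array of log-prices with np.nan where an asset did not trade
    T_obs, n = Y.shape
    x = np.zeros(n)                 # filtered efficient log-price
    P = np.eye(n)                   # its covariance
    out = np.empty_like(Y)
    for t in range(T_obs):
        P = P + Q                   # prediction step (random-walk state)
        obs = ~np.isnan(Y[t])
        if obs.any():               # update only with the components seen this second
            H = np.eye(n)[obs]
            S = H @ P @ H.T + R[np.ix_(obs, obs)]
            K = P @ H.T @ np.linalg.inv(S)
            x = x + K @ (Y[t, obs] - H @ x)
            P = P - K @ H @ P
        out[t] = x
    return out

# Usage on synthetic 1-second data: two assets, each trading only part of the time.
rng = np.random.default_rng(1)
steps = 600
true_p = np.cumsum(rng.multivariate_normal([0.0, 0.0],
                                            1e-6 * np.array([[1.0, 0.6], [0.6, 1.0]]),
                                            steps), axis=0)
Y = true_p + rng.normal(0.0, 5e-4, size=true_p.shape)        # microstructure noise
Y[rng.uniform(size=Y.shape) < 0.7] = np.nan                   # ~70% of ticks missing
filtered = local_level_filter(Y, Q=1e-6 * np.eye(2), R=(5e-4) ** 2 * np.eye(2))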
We propose the Hawkes flocking model, which assesses systemic risk in high-frequency processes from two perspectives: endogeneity and interactivity. We examine the futures markets of WTI crude oil and gasoline over the past decade and perform a comparative analysis with conditional value-at-risk as a benchmark measure. In terms of high-frequency structure, we obtain the following empirical findings: the endogenous systemic risk in WTI was significantly higher than that in gasoline, and the degree to which gasoline affects WTI was consistently higher than in the opposite direction. Moreover, although the relative degree of influence was asymmetric, the difference has gradually narrowed.
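
As a hedged illustration of the two quantities, the short Python sketch below starts from a hypothetical fitted bivariate exponential Hawkes kernel and computes a branching matrix, whose spectral radius and diagonal entries are a common way to quantify endogeneity and whose off-diagonal entries capture interactivity. The numerical values are invented, and the paper's exact definitions may differ in detail.

import numpy as np

# Illustrative fitted parameters of phi_ij(u) = alpha[i, j] * exp(-beta[i, j] * u),
# with index 0 = WTI and index 1 = gasoline (hypothetical values, not estimates).
alpha = np.array([[1.2, 0.5],
                  [0.3, 0.9]])
beta = np.array([[2.0, 2.0],
                 [2.0, 2.0]])

K = alpha / beta   # K[i, j]: expected type-i events directly triggered by one type-j event
endogeneity = float(np.max(np.abs(np.linalg.eigvals(K))))   # spectral radius; < 1 is sub-critical
print("spectral radius (overall endogeneity):", round(endogeneity, 3))
print("self-excitation WTI, gasoline:", K[0, 0], K[1, 1])
print("cross-excitation gasoline->WTI vs WTI->gasoline:", K[0, 1], K[1, 0])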
Many fits of Hawkes processes to financial data look rather good, but most of them are not statistically significant. This raises the question of what part of market dynamics this model is able to account for exactly. We document the accuracy of such processes as one varies the time interval of calibration and compare the performance of various types of kernels made up of sums of exponentials. Because of their around-the-clock opening times, FX markets are ideally suited to our aim, as they allow us to avoid the complications of the long daily overnight closures of equity markets. Statistical significance according to three simultaneous tests can be achieved provided that one uses kernels with two exponentials when fitting an hour at a time, and two or three exponentials for full days, while longer periods could not be fitted to statistical satisfaction because of the non-stationarity of the endogenous process. Fitted timescales are relatively short, and the endogeneity factor is high but sub-critical, at about 0.8.
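
The sketch below (Python, with placeholder parameter values) shows the standard way to evaluate the log-likelihood of a univariate Hawkes process whose kernel is a sum of exponentials, using one O(n) recursion per exponential term, and how the endogeneity factor arises as the integral of the kernel, i.e. the sum of alpha_k / beta_k. It illustrates the general technique only and is not the authors' code.

import numpy as np

def loglik_sum_exp(times, T, mu, alphas, betas):
    # phi(u) = sum_k alphas[k] * exp(-betas[k] * u); A[k] carries the usual recursion
    # A_k(i) = exp(-beta_k * (t_i - t_{i-1})) * (A_k(i-1) + alpha_k)
    alphas, betas = np.asarray(alphas, float), np.asarray(betas, float)
    times = np.asarray(times, float)
    A, ll, prev = np.zeros_like(alphas), 0.0, None
    for t in times:
        if prev is not None:
            A = np.exp(-betas * (t - prev)) * (A + alphas)
        ll += np.log(mu + A.sum())
        prev = t
    comp = mu * T + sum((a / b) * (1.0 - np.exp(-b * (T - times))).sum()
                        for a, b in zip(alphas, betas))
    return ll - comp

def endogeneity_factor(alphas, betas):
    # integral of the kernel: the branching ratio of the process
    return float(np.sum(np.asarray(alphas, float) / np.asarray(betas, float)))

# Two-exponential kernel with placeholder values chosen so that the branching ratio is 0.8.
mu, alphas, betas = 0.1, [0.5, 0.015], [1.0, 0.05]
print("endogeneity factor:", endogeneity_factor(alphas, betas))
times = np.sort(np.random.default_rng(3).uniform(0.0, 3600.0, size=500))
print("log-likelihood on synthetic event times:", loglik_sum_exp(times, 3600.0, mu, alphas, betas))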
This work is devoted to the study of modeling geophysical and financial time series. A class of volatility models with time-varying parameters is presented to forecast the volatility of time series in a stationary environment. The modeling of stationary time series with consistent properties facilitates prediction with greater certainty. Using the GARCH and stochastic volatility models, we forecast one-step-ahead volatility with +/- 2 standard prediction errors, with estimation carried out via maximum likelihood. We compare the stochastic volatility model, which relies on a filtering technique for its conditional volatility, with the GARCH model. We conclude that stochastic volatility is a better forecasting tool than GARCH(1,1), since it is less conditioned by autoregressive past information.
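
As a minimal illustration of the GARCH side of the comparison, the Python sketch below fits a Gaussian GARCH(1,1) by maximum likelihood and produces a one-step-ahead volatility forecast with a +/- 2 standard deviation band. The synthetic data, starting values and bounds are assumptions of the sketch, and the paper's stochastic volatility alternative is not reproduced here.

import numpy as np
from scipy.optimize import minimize

def garch_neg_loglik(params, r):
    # sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}, Gaussian innovations
    omega, alpha, beta = params
    sigma2 = np.empty_like(r)
    sigma2[0] = r.var()
    for t in range(1, len(r)):
        sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
    return 0.5 * np.sum(np.log(2 * np.pi * sigma2) + r ** 2 / sigma2)

def fit_and_forecast(r):
    res = minimize(garch_neg_loglik, x0=[1e-6, 0.05, 0.9], args=(r,),
                   bounds=[(1e-12, None), (0.0, 1.0), (0.0, 1.0)], method="L-BFGS-B")
    omega, alpha, beta = res.x
    sigma2 = r.var()
    for t in range(1, len(r)):
        sigma2 = omega + alpha * r[t - 1] ** 2 + beta * sigma2
    sigma2_next = omega + alpha * r[-1] ** 2 + beta * sigma2
    return np.sqrt(sigma2_next)

# Usage on synthetic returns; the forecast band for the next return is +/- 2 * sigma.
rng = np.random.default_rng(2)
r = 0.01 * rng.standard_normal(2000)
sigma_next = fit_and_forecast(r)
print("one-step-ahead volatility:", sigma_next, "band:", (-2 * sigma_next, 2 * sigma_next))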