We propose an observation-driven time-varying SVAR model where, in agreement with the Lucas Critique, structural shocks drive both the evolution of the macro variables and the dynamics of the VAR parameters. Contrary to existing approaches, where parameters follow a stochastic process with random and exogenous shocks, our observation-driven specification allows the evolution of the parameters to be driven by realized past structural shocks, thus opening the possibility of gauging the impact of observed shocks and hypothetical policy interventions on the future evolution of the economic system.
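The mechanism can be illustrated with a minimal sketch: a univariate AR(1) whose autoregressive coefficient is updated by the realized past shock. The update rule and all parameter values below are illustrative assumptions, not the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(0)

omega, alpha, beta = 0.02, 0.05, 0.95   # hypothetical update parameters
T = 500
y = np.zeros(T)
phi = np.zeros(T)
phi[0] = 0.4

for t in range(T - 1):
    eps = rng.standard_normal()          # realized structural shock
    y[t + 1] = phi[t] * y[t] + eps
    # the same realized shock drives the evolution of the parameter,
    # so a hypothetical intervention on eps propagates to future phi
    phi[t + 1] = omega + beta * phi[t] + alpha * np.tanh(eps)
```

Because the parameter path is a deterministic function of observed shocks, one can replay the recursion under a counterfactual shock sequence to gauge the effect of an intervention.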
The goal of this paper is to investigate the method outlined by one of us (PR) in Cherubini et al. (2009) to compute option prices. We name it the SINC approach. While the COS method by Fang and Oosterlee (2009) leverages the Fourier-cosine expansion of truncated densities, the SINC approach builds on the Shannon Sampling Theorem revisited for functions with bounded support. We provide several results which were missing in the early derivation: i) a rigorous proof of the convergence of the SINC formula to the correct option price when the support grows and the number of Fourier frequencies increases; ii) ready-to-implement formulas for put, Cash-or-Nothing, and Asset-or-Nothing options; iii) a systematic comparison with the COS formula for several log-price models; iv) a numerical challenge against alternative Fast Fourier specifications, such as Carr and Madan (1999) and Lewis (2000); v) an extensive pricing exercise under the rough Heston model of Jaisson and Rosenbaum (2015); vi) formulas to evaluate numerically the moments of a truncated density. The advantages of the SINC approach are numerous. When compared to benchmark methodologies, SINC provides the fastest and most accurate pricing computation. The method naturally lends itself to pricing all options in a smile concurrently by means of Fast Fourier techniques, enabling fast calibration. Pricing requires only odd moments in the Fourier space. A previous version of this manuscript circulated with the title 'Rough Heston: The SINC way'.
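The common idea behind the COS and SINC families can be sketched on the simplest case: pricing a Cash-or-Nothing call by Fourier inversion of a truncated log-price density. The Black-Scholes characteristic function, the truncation range, and the number of frequencies below are illustrative assumptions, not the paper's exact formulas.

```python
import numpy as np
from math import erf, sqrt, log, exp, pi

S0, K, r, sigma, T = 100.0, 105.0, 0.01, 0.2, 1.0
mu = log(S0) + (r - 0.5 * sigma**2) * T      # log-price mean
std = sigma * sqrt(T)
a, b = mu - 10 * std, mu + 10 * std          # truncated support
N = 256                                       # number of Fourier frequencies
kappa = log(K)

def cf(u):
    # Black-Scholes log-price characteristic function E[exp(i u X)]
    return np.exp(1j * u * mu - 0.5 * std**2 * u**2)

# Fourier-series coefficients of the truncated density, integrated over
# [log K, b] to obtain P(log S_T > log K)
k = np.arange(1, N + 1)
u = 2 * pi * k / (b - a)
terms = cf(-u) * (np.exp(1j * u * b) - np.exp(1j * u * kappa)) / (1j * u)
prob = ((b - kappa) + 2 * terms.real.sum()) / (b - a)
price = exp(-r * T) * prob                    # Cash-or-Nothing call

# closed-form benchmark: exp(-rT) * Phi(d2)
d2 = (log(S0 / K) + (r - 0.5 * sigma**2) * T) / std
bench = exp(-r * T) * 0.5 * (1 + erf(d2 / sqrt(2)))
```

With a 10-standard-deviation truncation the series converges to the closed-form digital price well below basis-point accuracy, which is the behavior the convergence proof in the paper formalizes for general models.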
We propose a novel approach to sentiment data filtering for a portfolio of assets. In our framework, a dynamic factor model drives the evolution of the observed sentiment and allows the identification of two distinct components: a long-term component, modeled as a random walk, and a short-term component driven by a stationary VAR(1) process. Our model encompasses alternative approaches available in the literature and can be readily estimated by means of Kalman filtering and expectation maximization. This feature makes it convenient when the cross-sectional dimension of the portfolio increases. By applying the model to a portfolio of Dow Jones stocks, we find that the long-term component co-integrates with the market principal factor, while the short-term one captures transient swings of the market associated with the idiosyncratic components and reproduces the correlation structure of returns. Using quantile regressions, we assess the significance of the contemporaneous and lagged explanatory power of sentiment on returns, finding strong statistical evidence when extreme returns, especially negative ones, are considered. Finally, the lagged relation is exploited in a portfolio allocation exercise.
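The two-component structure can be made concrete with a small simulation: observed sentiment is the sum of a random-walk long-term level and a stationary VAR(1) short-term swing. Dimensions and parameter values are assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(1)
n, T = 3, 1000                     # 3 assets, 1000 observations
A = 0.5 * np.eye(n) + 0.1          # hypothetical stable VAR(1) matrix
mu = np.zeros(n)                   # long-term component (random walk)
x = np.zeros(n)                    # short-term component (VAR(1))
sentiment = np.empty((T, n))

for t in range(T):
    mu = mu + 0.01 * rng.standard_normal(n)   # random-walk innovation
    x = A @ x + rng.standard_normal(n)        # stationary short-term swing
    sentiment[t] = mu + x                     # observed sentiment
```

Cast in state-space form, this decomposition is exactly what the Kalman filter recovers from the observed series, with expectation maximization supplying the parameter estimates.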
Motivated by the evidence that real-world networks evolve in time and may exhibit non-stationary features, we propose an extension of the Exponential Random Graph Models (ERGMs) accommodating the time variation of network parameters. Within the ERGM framework, a network realization is sampled from a static probability distribution defined parametrically in terms of network statistics. Inspired by the fast-growing literature on Dynamic Conditional Score-driven models, in our approach, each parameter evolves according to an updating rule driven by the score of the conditional distribution. We demonstrate the flexibility of the score-driven ERGMs, both as data generating processes and as filters, and we prove the advantages of the dynamic version with respect to the static one. Our method captures dynamical network dependencies that emerge from the data and allows for a test discriminating between static and time-varying parameters. Finally, we corroborate our findings with the application to networks from real financial and political systems exhibiting non-stationary dynamics.
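A minimal sketch of the score-driven mechanism uses the simplest ERGM special case, where the only network statistic is the edge count: each snapshot is then an Erdős–Rényi graph and the score of the conditional (Bernoulli) likelihood is available in closed form. The recursion parameters below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20
m = n * (n - 1) // 2                    # number of dyads
omega, alpha, beta = -0.02, 0.05, 0.97  # hypothetical score-driven parameters
theta = -1.0                            # time-varying log-odds of an edge
thetas = []

for t in range(300):
    p = 1.0 / (1.0 + np.exp(-theta))
    edges = rng.binomial(m, p)          # sample a network snapshot
    score = edges - m * p               # d log-likelihood / d theta
    # score-driven (GAS/DCS) update, scaled by the Fisher information
    theta = omega + beta * theta + alpha * score / (m * p * (1 - p))
    thetas.append(theta)
```

Richer statistics (triangles, stars) follow the same recursion, except that the score and its scaling must be evaluated by simulation rather than in closed form.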
The analysis of the intraday dynamics of correlations among high-frequency returns is challenging due to the presence of asynchronous trading and market microstructure noise. Both effects may lead to significant data reduction and may severely underestimate correlations if traditional methods for low-frequency data are employed. We propose to model intraday log-prices through a multivariate local-level model with score-driven covariance matrices and to treat asynchronicity as a missing value problem. The main advantages of this approach are: (i) all available data are used when filtering correlations, (ii) market microstructure noise is taken into account, (iii) estimation is performed through standard maximum likelihood methods. Our empirical analysis, performed on 1-second NYSE data, shows that opening hours are dominated by idiosyncratic risk and that a market factor progressively emerges in the second part of the day. The method can be used as a nowcasting tool for high-frequency data, allowing one to study the real-time response of covariances to macro-news announcements and to build intraday portfolios with very short optimization horizons.
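The missing-value treatment can be sketched in the univariate case: when an asset does not trade at a given second, the Kalman update is skipped and only the prediction step is performed, so no observation is discarded. The noise variances and the 30% missing rate are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
T = 200
true_level = np.cumsum(0.1 * rng.standard_normal(T))   # efficient log-price
y = true_level + 0.5 * rng.standard_normal(T)          # microstructure noise
y[rng.random(T) < 0.3] = np.nan        # ~30% asynchronous (missing) prices

q, r = 0.01, 0.25                       # state and measurement noise variances
m, P = 0.0, 1.0                         # filtered mean and variance
filtered = np.empty(T)

for t in range(T):
    P = P + q                           # prediction step, always performed
    if not np.isnan(y[t]):              # update only when a price is observed
        K = P / (P + r)                 # Kalman gain
        m = m + K * (y[t] - m)
        P = (1 - K) * P
    filtered[t] = m

```

In the multivariate model of the paper the same logic applies asset by asset, with score-driven covariance matrices replacing the fixed variances of this sketch.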
We propose a methodology for filtering, smoothing and assessing parameter and filtering uncertainty in score-driven models. Our technique is based on a general representation of the Kalman filter and smoother recursions for linear Gaussian models in terms of the score of the conditional log-likelihood. We prove that, when data is generated by a nonlinear non-Gaussian state-space model, the proposed methodology results from a local expansion of the true filtering density. A formal characterization of the approximation error is provided. As shown in extensive Monte Carlo analyses, our methodology performs very similarly to exact simulation-based methods, while remaining computationally extremely simple. We illustrate empirically the advantages of employing score-driven models as approximate filters rather than purely predictive processes.
Modeling the impact of the order flow on asset prices is of primary importance to understand the behavior of financial markets. Part I of this paper reported the remarkable improvements in the description of the price dynamics which can be obtained when one incorporates the impact of past returns on the future order flow. However, impact models presented in Part I consider the order flow as an exogenous process, only characterized by its two-point correlations. This assumption seriously limits the forecasting ability of the model. Here we attempt to model directly the stream of discrete events with a so-called Mixture Transition Distribution (MTD) framework, introduced originally by Raftery (1985). We distinguish between price-changing and non-price-changing events and combine them with the order sign in order to reduce the order flow dynamics to the dynamics of a four-state discrete random variable. The MTD represents a parsimonious approximation of a full high-order Markov chain. The new approach captures with adequate realism the conditional correlation functions between signed events, as well as the signature plots, for both small- and large-tick stocks. From a methodological viewpoint, we discuss a novel and flexible way to calibrate a large class of MTD models with a very large number of parameters. In spite of this large number of parameters, an out-of-sample analysis confirms that the model does not overfit the data.
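The MTD idea itself is compact: the conditional law of the next event is a convex combination of single-lag transition matrices, one per lag, instead of a full high-order transition tensor. The four states, lag weights, and matrices below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
n_states, n_lags = 4, 3
lam = np.array([0.5, 0.3, 0.2])               # lag weights, summing to one

# one row-stochastic transition matrix per lag (random for the sketch)
Q = rng.random((n_lags, n_states, n_states))
Q /= Q.sum(axis=2, keepdims=True)

def mtd_probs(history):
    """Conditional distribution of the next state given the last n_lags
    states (history[-1] is the most recent event)."""
    p = np.zeros(n_states)
    for g in range(n_lags):
        p += lam[g] * Q[g, history[-1 - g]]
    return p

p = mtd_probs([2, 0, 1])                      # hypothetical recent events
```

A full Markov chain of order 3 on 4 states would require 4^3 conditional distributions; the MTD replaces them with 3 matrices and 3 weights, which is what makes calibration with many lags feasible.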
Market impact is a key concept in the study of financial markets and several models have been proposed in the literature so far. The Transient Impact Model (TIM) posits that the price at high frequency time scales is a linear combination of the signs of the past executed market orders, weighted by a so-called propagator function. An alternative description -- the History Dependent Impact Model (HDIM) -- assumes that the deviation between the realised order sign and its expected level impacts the price linearly and permanently. The two models, however, should be extended since prices are a priori influenced not only by the past order flow, but also by the past realisation of returns themselves. In this paper, we propose a two-event framework, where price-changing and non-price-changing events are considered separately. Two-event propagator models provide a remarkable improvement of the description of the market impact, especially for large-tick stocks, where the events of price changes are very rare and very informative. Specifically, the extended approach captures the excess anti-correlation between past returns and subsequent order flow which is missing in one-event models. Our results document the superior performance of HDIMs, although only by a small margin relative to TIMs. This is somewhat surprising, because HDIMs are well grounded theoretically, while TIMs are, strictly speaking, inconsistent.
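The one-event TIM baseline that the paper extends can be sketched directly: the price is a propagator-weighted sum of past order signs. The power-law propagator shape and its parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
T = 500
signs = rng.choice([-1, 1], size=T)           # market-order signs

def propagator(lag, g0=1.0, beta=0.5):
    # hypothetical slowly decaying kernel G(lag)
    return g0 / (1.0 + lag) ** beta

G = propagator(np.arange(T))

# TIM price: p_t = sum_{s < t} G(t - s) * sign_s
price = np.array([np.sum(G[t - np.arange(t)] * signs[:t]) for t in range(T)])
```

The two-event extension discussed above replaces the single sign series with separate series for price-changing and non-price-changing events, each with its own propagator.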
We propose a novel algorithm which allows one to sample paths from an underlying price process in a local volatility model and to achieve a substantial variance reduction when pricing exotic options. The new algorithm relies on the construction of a discrete multinomial tree. The crucial feature of our approach is that -- in a similar spirit to the Brownian Bridge -- each random path runs backward from a terminal fixed point to the initial spot price. We characterize the tree in two alternative ways: in terms of the optimal grids originating from the Recursive Marginal Quantization algorithm and following an approach inspired by the finite difference approximation of the diffusion's infinitesimal generator. We assess the reliability of the new methodology comparing the performance of both approaches and benchmarking them against competitor Monte Carlo methods.
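The fix-the-endpoint-first idea can be illustrated in its simplest form: sample the terminal value of a plain Brownian motion, then fill in the path conditionally on that endpoint with bridge sampling. The local-volatility multinomial tree of the paper is replaced here by this elementary Gaussian analogue.

```python
import numpy as np

rng = np.random.default_rng(6)
T, n_steps = 1.0, 100
dt = T / n_steps
times = np.linspace(0.0, T, n_steps + 1)

W_T = np.sqrt(T) * rng.standard_normal()      # terminal value sampled first
W = np.empty(n_steps + 1)
W[0], W[-1] = 0.0, W_T

# fill the path, each step conditioned on the fixed endpoint (bridge law)
for i in range(1, n_steps):
    remaining = T - times[i - 1]
    mean = W[i - 1] + (W_T - W[i - 1]) * dt / remaining
    var = dt * (remaining - dt) / remaining
    W[i] = mean + np.sqrt(var) * rng.standard_normal()
```

Fixing the terminal point first is what enables variance reduction, e.g. by stratifying or importance-sampling the terminal distribution before generating the intermediate path.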
We present a detailed analysis of interest rate derivatives valuation under credit risk and collateral modeling. We show how the credit and collateral extended valuation framework in Pallavicini et al. (2011), and the related collateralized valuation measure, can be helpful in defining the key market rates underlying the multiple interest rate curves that characterize current interest rate markets. A key point is that spot Libor rates are to be treated as market primitives rather than being defined by no-arbitrage relationships. We formulate a consistent realistic dynamics for the different rates emerging from our analysis and compare the resulting model performances to simpler models used in the industry. We include the often neglected margin period of risk, showing how this feature may increase the impact of different rates dynamics on valuation. We point out limitations of multiple curve models with deterministic basis considering valuation of particularly sensitive products such as basis swaps. We stress that a proper wrong way risk analysis for such products requires a model with a stochastic basis and we show numerical results confirming this fact.