The DebtRank algorithm has been increasingly investigated as a method to estimate the impact of shocks in financial networks, as it overcomes the limitations of the traditional default-cascade approaches. Here we formulate a dynamical microscopic theory of instability for financial networks by iterating the balance-sheet identities of individual banks and by assuming a simple rule for the transfer of shocks from borrowers to lenders. In doing so, we generalise the DebtRank formulation, both providing an interpretation of the effective dynamics in terms of basic accounting principles and preventing the underestimation of losses on certain network topologies. Depending on the structure of the interbank leverage matrix, the dynamics is either stable, in which case the asymptotic state can be computed analytically, or unstable, meaning that at least one bank will default. We apply this framework to a dataset of the top listed European banks over the period 2008-2013. We find that network effects can amplify exogenous shocks by a factor ranging between three (in normal periods) and six (during the crisis) when we stress the system with a 0.5% shock on the external (i.e. non-interbank) assets of all banks.
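A minimal sketch of the linear propagation dynamics described above, assuming the standard DebtRank-style update in which each lender's relative equity loss grows with the fresh losses of its borrowers, weighted by the interbank leverage matrix; all variable names and the uniform initial shock are illustrative and not taken from the paper's dataset:

```python
import numpy as np

def debtrank_dynamics(Lambda, h0, n_steps=100, tol=1e-12):
    """Iterate h(t+1) = min(1, h(t) + Lambda @ (h(t) - h(t-1))).

    Lambda : (n, n) interbank leverage matrix; Lambda[i, j] is the
             exposure of lender i towards borrower j over the equity of i.
    h0     : initial relative equity losses in [0, 1] (the exogenous shock).
    """
    h_prev = np.zeros_like(h0)
    h = h0.copy()
    for _ in range(n_steps):
        # Lenders absorb only the *new* losses of their borrowers.
        h_next = np.clip(h + Lambda @ (h - h_prev), 0.0, 1.0)
        if np.max(np.abs(h_next - h)) < tol:
            break
        h_prev, h = h, h_next
    return h

# Stability hinges on the spectral radius of Lambda: below 1 the map
# converges and the asymptotic losses can be computed; above 1 shocks
# are amplified until at least one bank defaults (h_i = 1).
rng = np.random.default_rng(0)
Lambda = 0.05 * rng.random((10, 10))
print(debtrank_dynamics(Lambda, h0=np.full(10, 0.005)))  # toy 0.5% shock
```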
We consider a dynamical model of distress propagation on complex networks, which we apply to the study of financial contagion in networks of banks connected to each other by direct exposures. The model that we consider is an extension of the DebtRank algorithm, recently introduced in the literature. The mechanism of distress propagation is simple: when a bank suffers a loss, distress propagates to its creditors, who in turn suffer losses, and so on. The original DebtRank assumes that losses are propagated linearly between connected banks. Here we relax this assumption and introduce a one-parameter family of non-linear propagation functions. As a case study, we apply this algorithm to a dataset of 183 European banks, and we study how the stability of the system depends on the non-linearity parameter under different stress-test scenarios. We find that the system is characterized by a transition between a regime where small shocks can be amplified and a regime where shocks do not propagate, and that the overall stability of the system increases between 2008 and 2013.
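As an illustration of such a one-parameter family, a sketch assuming the transmitted distress is damped as f_alpha(h) = h * exp(alpha * (h - 1)), which recovers the linear DebtRank at alpha = 0 and approaches a pure default cascade (only fully defaulted banks transmit) for large alpha; this functional form is an illustrative choice, and the paper's exact family may differ:

```python
import numpy as np

def f(h, alpha):
    # alpha = 0: linear DebtRank; alpha -> infinity: only defaulted
    # banks (h = 1) transmit distress, i.e. a classical default cascade.
    return h * np.exp(alpha * (h - 1.0))

def nonlinear_debtrank(Lambda, h0, alpha, n_steps=200):
    h_prev, h = np.zeros_like(h0), h0.copy()
    for _ in range(n_steps):
        dh = f(h, alpha) - f(h_prev, alpha)  # freshly transmitted distress
        h_prev, h = h, np.clip(h + Lambda @ dh, 0.0, 1.0)
    return h
```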
We develop a novel stress-test framework to monitor systemic risk in financial systems. The modular structure of the framework accommodates a variety of shock scenarios, methods to estimate interbank exposures, and mechanisms of distress propagation. The main features are as follows. First, the framework allows one to estimate and disentangle not only first-round effects (i.e. shocks on external assets) and second-round effects (i.e. distress induced in the interbank network), but also third-round effects induced by possible fire sales. Second, it allows one to monitor at the same time the impact of shocks on individual or groups of financial institutions as well as their vulnerability to shocks on counterparties or certain asset classes. Third, it includes estimates of loss distributions, thus combining network effects with familiar risk measures such as VaR and CVaR. Fourth, in order to perform robustness analyses and cope with incomplete data, the framework features a module for generating sets of interbank exposure networks that are consistent with the total lending and borrowing of each bank. As an illustration, we carry out a stress-test exercise on a dataset of listed European banks over the years 2008-2013. We find that second-round and third-round effects dominate first-round effects, suggesting that most current stress-test frameworks may severely underestimate systemic risk.
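The fourth module can be sketched with a standard reconstruction recipe: draw a random support and rescale it with iterative proportional fitting (the RAS algorithm) so that row and column sums match each bank's total lending and borrowing. This is one of several possible generation methods and is not necessarily the one used in the framework:

```python
import numpy as np

def sample_exposure_network(total_lending, total_borrowing,
                            density=0.3, n_iter=500, seed=0):
    """Random interbank exposure matrix with prescribed marginals.
    Assumes total_lending.sum() == total_borrowing.sum(), otherwise
    the RAS rescaling cannot converge."""
    rng = np.random.default_rng(seed)
    n = len(total_lending)
    # Random sparse support with random positive weights.
    A = (rng.random((n, n)) < density) * rng.random((n, n))
    np.fill_diagonal(A, 0.0)  # banks do not lend to themselves
    for _ in range(n_iter):
        # Alternately match row sums (lending) and column sums (borrowing).
        A *= (total_lending / np.maximum(A.sum(axis=1), 1e-12))[:, None]
        A *= (total_borrowing / np.maximum(A.sum(axis=0), 1e-12))[None, :]
    return A
```

Repeating the draw with different seeds yields the sets of plausible networks over which the stress test can be rerun for robustness.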
The study deals with the assessment of risk measures for Health Plans in order to evaluate the Solvency Capital Requirement. For the estimation of individual health care expenditure across several episode types, we suggest an original approach based on a three-part regression model. We propose three Generalized Linear Models (GLMs) to model claim counts, the allocation of each claim to a specific episode type, and the average severity of expenditures, respectively. One of the main practical advantages of our proposal is the reduction in the number of regression models compared to a traditional approach, in which a separate two-part model is required for each episode type. Since most health plans require co-payments or co-insurance, the reimbursement function is non-linear, and we therefore adopt a Monte Carlo simulation to assess the health plan costs. The simulation approach provides the probability distribution of the Net Asset Value of the Health Plan and the estimate of several risk measures.
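A schematic Monte Carlo pass under the three-part structure, with Poisson claim counts, a categorical allocation over episode types, and Gamma severities; all parameters below are invented placeholders (in practice each part would be predicted by its fitted GLM for every insured), and the deductible/co-insurance/cap values serve only to illustrate the non-linearity of the reimbursement function:

```python
import numpy as np

rng = np.random.default_rng(42)

lam = 1.2                                  # expected claims per insured
episode_probs = np.array([0.5, 0.3, 0.2])  # e.g. outpatient, dental, hospital
sev_shape = np.array([2.0, 1.5, 3.0])      # Gamma severity per episode type
sev_scale = np.array([100.0, 80.0, 400.0])
deductible, coinsurance, cap = 50.0, 0.8, 5000.0

def reimbursement(x):
    """Non-linear reimbursement: deductible, then co-insurance, then a cap."""
    return np.minimum(coinsurance * np.maximum(x - deductible, 0.0), cap)

def simulate_plan_cost(n_insured=10_000, n_sims=1_000):
    costs = np.empty(n_sims)
    for s in range(n_sims):
        n_claims = rng.poisson(lam, size=n_insured).sum()
        ep = rng.choice(len(episode_probs), size=n_claims, p=episode_probs)
        sev = rng.gamma(sev_shape[ep], sev_scale[ep])
        costs[s] = reimbursement(sev).sum()
    return costs

costs = simulate_plan_cost()
var_995 = np.quantile(costs, 0.995)        # VaR at the 99.5% level
cvar_995 = costs[costs >= var_995].mean()  # CVaR / expected shortfall
```

Subtracting the simulated costs from premiums and assets would yield the Net Asset Value distribution from which the capital requirement is read off.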
The presence of non-linear instruments is responsible for the emergence of non-Gaussian features in the price-change distribution of realistic portfolios, even for normally distributed risk factors. This is especially true for the benchmark Delta Gamma Normal model, which in general exhibits exponentially damped power-law tails. We show how knowledge of the model's characteristic function leads to Fourier representations for two standard risk measures, the Value at Risk and the Expected Shortfall, and for their sensitivities with respect to the model parameters. We detail the numerical implementation of our formulae and we emphasize the reliability and efficiency of our results in comparison with Monte Carlo simulation.
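A compact sketch of the Fourier route for the Delta Gamma Normal model: the characteristic function of the quadratic P&L Y = delta'Z + (1/2) Z'Gamma Z, Z ~ N(0, Sigma), follows from diagonalising the quadratic form (a standard result for Gaussian quadratics), the CDF is recovered with a Gil-Pelaez inversion, and the VaR is then a root search. The truncation and quadrature below are deliberately simplistic compared to the formulae developed in the paper:

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq

def delta_gamma_cf(delta, Gamma, Sigma):
    """Characteristic function of Y = delta'Z + 0.5 Z'Gamma Z, Z ~ N(0, Sigma)."""
    C = np.linalg.cholesky(Sigma)
    lam, O = np.linalg.eigh(C.T @ Gamma @ C)  # eigenvalues of the quadratic form
    b = O.T @ (C.T @ delta)                   # linear term in the rotated frame
    def phi(t):
        d = 1.0 - 1j * t * lam
        return np.prod(d ** -0.5) * np.exp(-0.5 * t**2 * np.sum(b**2 / d))
    return phi

def cdf(phi, x):
    """Gil-Pelaez: P(Y <= x) = 1/2 - (1/pi) * int_0^inf Im[phi(t) e^{-itx}]/t dt.
    Crude truncation of the integral; fine for a sketch only."""
    integrand = lambda t: (phi(t) * np.exp(-1j * t * x)).imag / t
    val, _ = quad(integrand, 1e-10, 200.0, limit=800)
    return 0.5 - val / np.pi

# Toy portfolio: VaR at level alpha is minus the alpha-quantile of the P&L.
delta = np.array([1.0, -0.5])
Gamma = np.array([[0.2, 0.0], [0.0, -0.4]])
Sigma = np.array([[1.0, 0.3], [0.3, 1.0]])
phi, alpha = delta_gamma_cf(delta, Gamma, Sigma), 0.01
var = -brentq(lambda x: cdf(phi, x) - alpha, -10.0, 10.0)
```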
Current noisy intermediate-scale quantum (NISQ) devices are far from being fault-tolerant; however, they permit some limited applications. In this study, we use a hybrid quantum-classical reservoir computing model to forecast the CBOE volatility index (VIX) using the S&P500 (SPX) time series. The NISQ component of our model is executed on IBM's 53-qubit Rochester chip. We encode the SPX values in the rotation angles and linearly combine the average spin of the six-qubit register to predict the value of the VIX at the next time step. Our results demonstrate a potential approach towards utilizing noisy quantum devices for non-linear time-series forecasting tasks.
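A toy, classically simulated version of this pipeline (the paper runs the circuit on quantum hardware): each rescaled SPX value is encoded as RY rotation angles on six qubits, a fixed random unitary plays the role of the reservoir, the per-qubit average spin <Z_k> forms the feature vector, and a linear least-squares readout predicts the next VIX value. The random series at the bottom are dummy placeholders for the rescaled historical data:

```python
import numpy as np

rng = np.random.default_rng(7)
n_qubits, dim = 6, 2**6

# Fixed random unitary standing in for the quantum reservoir
# (simulated state vector, not the IBM Rochester run).
A = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
Q, _ = np.linalg.qr(A)

def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def encode(x):
    """Encode a scalar in [0, 1] as RY rotation angles on every qubit."""
    state = np.ones(1)
    for _ in range(n_qubits):
        state = np.kron(state, ry(np.pi * x) @ np.array([1.0, 0.0]))
    return state

def z_expectations(state):
    """Average spin <Z_k> of each qubit from the full state vector."""
    probs = np.abs(state) ** 2
    idx = np.arange(dim)
    return np.array([
        np.sum(probs * (1.0 - 2.0 * ((idx >> (n_qubits - 1 - k)) & 1)))
        for k in range(n_qubits)
    ])

def features(x):
    return z_expectations(Q @ encode(x))

# Linear readout: regress next-step VIX on the reservoir features.
spx, vix = rng.random(200), rng.random(200)   # dummy rescaled series
X = np.array([features(s) for s in spx[:-1]])
w, *_ = np.linalg.lstsq(np.c_[X, np.ones(len(X))], vix[1:], rcond=None)
vix_pred = np.c_[X, np.ones(len(X))] @ w
```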