The distribution of recurrence times, or return intervals, between extreme events is important for characterizing and understanding the behavior of physical systems and phenomena in many disciplines. It is well known that many physical processes in nature and society display long-range correlations. Hence, in the last few years, considerable research effort has been directed towards studying the distribution of return intervals for long-range correlated time series. Based on numerical simulations, it was shown that the return interval distributions are of stretched exponential type. In this paper, we obtain an analytical expression for the distribution of return intervals in long-range correlated time series which holds when the average return interval is large. We show that the distribution is actually a product of a power law and a stretched exponential. We also discuss the regimes of validity and study in detail how the return interval distribution depends on the threshold used to define extreme events.
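The central quantity of this abstract, the return interval above a threshold, is straightforward to extract from data. The following minimal sketch (not from the paper) uses NumPy, with plain uncorrelated Gaussian noise as a stand-in for the long-range correlated series the authors actually study:

```python
import numpy as np

def return_intervals(series, q):
    """Waiting times between successive exceedances of threshold q.

    An "extreme event" is any sample with series[i] > q; a return
    interval is the gap (in samples) between consecutive events.
    """
    events = np.flatnonzero(np.asarray(series) > q)  # indices of extremes
    return np.diff(events)                           # gaps between them

# Illustrative use. For uncorrelated noise the intervals are (close to)
# geometrically distributed; the paper's stretched-exponential / power-law
# form arises only for long-range correlated series.
rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)
tau = return_intervals(x, q=2.0)
mean_interval = tau.mean()  # grows as q is raised, since events get rarer
```

The mean interval $\bar{\tau}$ computed this way is the natural scale in which the distributions for different thresholds are compared.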
We investigate the probability distribution of the volatility return intervals $\tau$ for the Chinese stock market. We rescale both the probability distribution $P_{q}(\tau)$ and the volatility return intervals $\tau$ as $P_{q}(\tau)=(1/\bar{\tau})f(\tau/\bar{\tau})$ to obtain a single scaling curve for different threshold values $q$. The scaling curve is well fitted by the stretched exponential function $f(x) \sim e^{-\alpha x^{\gamma}}$, which suggests that memory exists in $\tau$. To demonstrate the memory effect, we investigate the conditional probability distribution $P_{q}(\tau|\tau_{0})$, the mean conditional interval $\langle\tau|\tau_{0}\rangle$, and the cumulative probability distribution of the cluster size of $\tau$. The results show a clear clustering effect. We further investigate the persistence probability distribution $P_{\pm}(t)$ and find that $P_{-}(t)$ decays as a power law with an exponent far from the value 0.5 of a random walk, which further confirms that long memory exists in $\tau$. The scaling and long-memory properties of $\tau$ for the Chinese stock market are similar to those obtained for the United States and Japanese financial markets.
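The scaling collapse described above can be checked empirically: rescale the intervals by their mean and compare the resulting densities across thresholds. A minimal sketch (not the paper's code; uncorrelated Gaussian noise replaces the volatility series, so here the collapse is onto an exponential rather than a stretched exponential):

```python
import numpy as np

def rescaled_interval_pdf(series, q, bins):
    """Empirical density of tau/bar_tau, i.e. bar_tau * P_q(tau).

    If the ansatz P_q(tau) = (1/bar_tau) f(tau/bar_tau) holds, the curves
    returned for different thresholds q collapse onto one function f.
    """
    tau = np.diff(np.flatnonzero(series > q))  # return intervals
    bar = tau.mean()                           # mean interval bar_tau
    hist, edges = np.histogram(tau / bar, bins=bins, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, hist                       # x = tau/bar_tau, y ~ f(x)

rng = np.random.default_rng(1)
x = rng.standard_normal(200_000)
bins = np.linspace(0.0, 5.0, 21)
collapse = {q: rescaled_interval_pdf(x, q, bins) for q in (1.5, 2.0, 2.5)}
```

Plotting the curves in `collapse` on common axes is the standard visual test; a stretched exponential $f(x) \sim e^{-\alpha x^{\gamma}}$ would then be fitted to the collapsed data.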
Being able to predict the occurrence of extreme returns is important in financial risk management. Using the distribution of recurrence intervals---the waiting time between consecutive extremes---we show that these extreme returns are predictable in the short term. Examining a range of different types of returns and thresholds, we find that recurrence intervals follow a $q$-exponential distribution, which we then use to theoretically derive the hazard probability $W(\Delta t|t)$. Maximizing the usefulness of extreme forecasts to define an optimized hazard threshold, we signal a financial extreme occurring within the next day whenever the hazard probability exceeds the optimized threshold. Both in-sample tests and out-of-sample predictions indicate that these forecasts are more accurate than a benchmark that ignores the predictive signals. This recurrence-interval finding deepens our understanding of recurring extreme returns and can be applied to forecast extremes in risk management.
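The hazard probability has a closed form once the interval density is fixed. The sketch below is an illustration, not the paper's exact convention: it assumes a $q$-exponential density $p(\tau) \propto [1+(q-1)\lambda\tau]^{1/(1-q)}$ with $1 < q < 2$, under which integrating the tail gives $W(\Delta t|t) = 1 - \left[\frac{1+(q-1)\lambda(t+\Delta t)}{1+(q-1)\lambda t}\right]^{(2-q)/(1-q)}$:

```python
import numpy as np

def hazard_W(t, dt, q, lam):
    """Hazard probability W(dt | t): chance the next extreme arrives within
    dt, given t has elapsed since the last one, when recurrence intervals
    follow the q-exponential density p(tau) ~ [1+(q-1)*lam*tau]^(1/(1-q)).

    Obtained by integrating the density over [t, t+dt] and [t, inf);
    valid for 1 < q < 2 so that the tail integral converges.
    """
    ratio = (1.0 + (q - 1.0) * lam * (t + dt)) / (1.0 + (q - 1.0) * lam * t)
    return 1.0 - ratio ** ((2.0 - q) / (1.0 - q))

# In the q -> 1 limit the density becomes exponential and the hazard
# reduces to the memoryless form 1 - exp(-lam * dt).
w_now = hazard_W(t=0.0, dt=1.0, q=1.5, lam=1.0)   # right after an extreme
w_late = hazard_W(t=5.0, dt=1.0, q=1.5, lam=1.0)  # long after an extreme
```

For $q > 1$ the hazard decreases with elapsed time $t$ (extremes cluster), which is exactly the memory the forecasting scheme exploits: a signal is issued when $W$ exceeds the optimized threshold.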
We construct a theoretical model for the equilibrium distribution of workers across sectors with different labor productivity, assuming that a sector can accommodate a limited number of workers that depends only on its productivity. A general formula for this productivity distribution is obtained using the detailed-balance condition necessary for equilibrium in the Ehrenfest-Brillouin model. We also carry out an empirical analysis of the average number of workers in sectors of given productivity on the basis of an exhaustive dataset in Japan. The theoretical formula succeeds in explaining two distinctive observational facts in a unified way: a Boltzmann distribution with negative temperature on the low-to-medium-productivity side and a decreasing power-law part on the high-productivity side.
Many natural and physical processes display long memory and extreme events. In these systems, the measured time series is invariably contaminated by noise. As the extreme events deviate strongly from the mean behavior, the noise does not affect them as much as it affects the typical values. Since the extreme events also carry information about correlations in the full time series, they can be used to infer the correlation properties of the latter. In this work, from a given time series, we construct three modified time series using only the extreme events. It is shown that the correlations in the original time series and in the modified time series, as measured by the exponent obtained from the detrended fluctuation analysis technique, are related to each other. Hence, the correlation exponents of a long-memory time series can be inferred from its extreme events alone. This approach is demonstrated for several empirical time series.
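The exponent referred to above is the DFA scaling exponent $\alpha$, with $\alpha = 0.5$ for uncorrelated noise and $\alpha > 0.5$ for long-range positive correlations. A compact first-order DFA sketch (illustrative, not the paper's implementation):

```python
import numpy as np

def dfa_exponent(x, scales):
    """Estimate the DFA-1 scaling exponent alpha of series x.

    The fluctuation function obeys F(s) ~ s^alpha; alpha is read off as
    the slope of log F(s) versus log s.
    """
    y = np.cumsum(x - np.mean(x))              # integrated profile
    F = []
    for s in scales:
        n = len(y) // s
        segments = y[:n * s].reshape(n, s)     # non-overlapping windows
        t = np.arange(s)
        resid = []
        for seg in segments:                   # linear detrend per window
            coef = np.polyfit(t, seg, 1)
            resid.append(seg - np.polyval(coef, t))
        F.append(np.sqrt(np.mean(np.square(resid))))
    return np.polyfit(np.log(scales), np.log(F), 1)[0]

rng = np.random.default_rng(2)
scales = [16, 32, 64, 128, 256]
white = rng.standard_normal(20_000)
alpha_white = dfa_exponent(white, scales)            # expect ~ 0.5
alpha_walk = dfa_exponent(np.cumsum(white), scales)  # expect ~ 1.5
```

Applying the same estimator to the original series and to the extreme-event-derived series is the comparison the abstract describes.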
Extreme events taking place on networks are not uncommon. We show that it is possible to manipulate the occurrence probabilities of extreme events and their distribution over the nodes of scale-free networks by tuning the nodal capacity. This can be used to reduce the number of extreme event occurrences on a network. However, monotonic nodal capacity enhancements, beyond a point, do not lead to any substantial further reduction in the number of extreme events. We point out the practical implications of this result for network design aimed at reducing extreme event occurrences.