For environmental problems such as global warming, future costs must be balanced against present costs. This is traditionally done using an exponential discount function with a constant discount rate, which reduces the present value of future costs. The result is highly sensitive to the choice of discount rate and has generated a major controversy over the urgency of immediate action. We study analytically several standard interest rate models from finance and compare their properties to empirical data. From historical time series of nominal interest rates and inflation covering 14 countries over hundreds of years, we find that extended periods of negative real interest rates are common, occurring in many epochs in all countries. This leads us to choose the Ornstein-Uhlenbeck model, in which real short-run interest rates fluctuate stochastically and can become negative, even though they revert to a positive mean value. We solve the model in closed form and prove that the long-run discount rate is always less than the mean; indeed, it can be zero or even negative, despite the fact that the mean short-term interest rate is positive. We fit the parameters of the model to the data and find that nine of the countries have positive long-run discount rates while five have negative long-run discount rates. Even if one excludes the countries where hyperinflation has occurred, our results support the low discount rate used in the Stern report over the higher rates advocated by others.
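As a minimal numerical sketch of this mechanism, the Monte Carlo below simulates an Ornstein-Uhlenbeck short rate and estimates the effective discount rate $-\frac{1}{t}\ln E[e^{-\int_0^t r\,ds}]$, comparing it with the mean rate and with the Vasicek asymptotic value $m-\sigma^2/(2\alpha^2)$. The parameters $m$, $\alpha$, $\sigma$ are illustrative placeholders, not the values fitted to the 14-country data.

```python
import numpy as np

# Illustrative parameters (hypothetical; not the values fitted to the 14-country data)
m, alpha, sigma = 0.03, 0.15, 0.02    # mean real rate, reversion speed, volatility
r0, T, dt, n_paths = 0.03, 200.0, 0.1, 20000

rng = np.random.default_rng(0)
r = np.full(n_paths, r0)
integral = np.zeros(n_paths)          # running integral of r(s) along each path

for _ in range(int(T / dt)):
    integral += r * dt
    # Euler-Maruyama step of the Ornstein-Uhlenbeck process; r may go negative,
    # mirroring the extended negative-real-rate episodes seen in the data
    r += -alpha * (r - m) * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)

D = np.exp(-integral).mean()          # certainty-equivalent discount factor E[e^{-int r ds}]
print(f"mean short rate                       : {m:.4f}")
print(f"effective long-run discount rate      : {-np.log(D) / T:.4f}")
print(f"Vasicek asymptote m - sigma^2/(2a^2)  : {m - sigma**2 / (2 * alpha**2):.4f}")
```

The effective rate comes out below the mean short rate, and making $\sigma$ larger or $\alpha$ smaller pushes it toward zero or below, which is the qualitative point of the closed-form result.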
The relationship between the size and the variance of firm growth rates is known to follow an approximate power-law behavior $\sigma(S) \sim S^{-\beta(S)}$, where $S$ is the firm size and $\beta(S)\approx 0.2$ is an exponent that depends only weakly on $S$. Here we show how a model of proportional growth, which treats firms as classes composed of a variable number of units of variable size, can explain this size-variance dependence. In general, the model predicts that $\beta(S)$ must exhibit a crossover from $\beta(0)=0$ to $\beta(\infty)=1/2$. For a realistic set of parameters, $\beta(S)$ is approximately constant and can vary in the range from 0.14 to 0.2, depending on the average number of units in the firm. We test the model with a unique industry-specific database in which firm sales are given as the sum of the sales of all their products. We find that the model is consistent with the empirically observed size-variance relationship.
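A minimal simulation sketch of a proportional-growth mechanism of this kind is given below: firms are treated as collections of units with lognormal sizes and independent lognormal growth shocks, and the slope of $\log\sigma(S)$ against $\log S$ is measured across size deciles. All parameter choices are hypothetical and only illustrate how an intermediate exponent between 0 and 1/2 can arise; they are not the paper's fitted values.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical parameters (not the paper's fit): geometric number of units per firm,
# lognormal unit sizes and independent lognormal growth shocks
n_firms = 50000
K = rng.geometric(p=0.02, size=n_firms)              # units per firm, mean ~ 50
sigma_xi, sigma_eta = 1.0, 0.3                       # unit-size and unit-growth dispersion

sizes, growth = np.empty(n_firms), np.empty(n_firms)
for i, k in enumerate(K):
    xi = rng.lognormal(sigma=sigma_xi, size=k)       # unit sizes
    eta = rng.lognormal(sigma=sigma_eta, size=k)     # multiplicative growth shocks
    sizes[i] = xi.sum()                              # firm size S
    growth[i] = np.log((xi * eta).sum() / sizes[i])  # log growth rate of the firm

# Dispersion of growth rates in deciles of log firm size, and the implied exponent
edges = np.quantile(np.log(sizes), np.linspace(0, 1, 11))
idx = np.digitize(np.log(sizes), edges[1:-1])
logS = np.array([np.log(sizes[idx == b]).mean() for b in range(10)])
logsig = np.array([np.log(growth[idx == b].std()) for b in range(10)])
beta = -np.polyfit(logS, logsig, 1)[0]
print(f"estimated size-variance exponent beta ~ {beta:.2f}")
```

With equal-sized units the estimate moves toward the central-limit value 1/2, while broad unit-size dispersion pulls it toward 0, which is the crossover the model predicts.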
Energy markets and the associated energy futures markets play a crucial role in global economies. We investigate the statistical properties of the recurrence intervals of the daily volatility time series of four NYMEX energy futures, defined as the waiting times $\tau$ between consecutive volatilities exceeding a given threshold $q$. We find that the recurrence intervals are distributed as a stretched exponential $P_q(\tau)\sim e^{-(a\tau)^{\gamma}}$, where the exponent $\gamma$ decreases with increasing $q$, and that there is no scaling behavior in the distributions for different thresholds $q$ after the recurrence intervals are scaled by the mean recurrence interval $\bar\tau$. These findings are significant under the Kolmogorov-Smirnov test and the Cramér-von Mises test. We show that the empirical estimates are in good agreement with numerical integration results for the occurrence probability $W_q(\Delta t|t)$ of a next event above the threshold $q$ within a (short) time interval $\Delta t$ after an elapsed time $t$ since the last event above $q$. We also investigate the memory effects of the recurrence intervals. We find that the conditional distributions of large and small recurrence intervals differ from each other and that the conditional mean of the recurrence intervals scales as a power law of the preceding interval, $\bar\tau(\tau_0)/\bar\tau \sim (\tau_0/\bar\tau)^\beta$, indicating that the recurrence intervals have short-term correlations. Detrended fluctuation analysis and detrending moving average analysis further uncover that the recurrence intervals possess long-term correlations. We confirm that the clustering of the volatility recurrence intervals is caused by the long-term correlations well known to be present in the volatility.
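The recurrence-interval construction can be illustrated on a toy volatility series (not the NYMEX data): the sketch below extracts the waiting times between exceedances of a quantile threshold $q$ and checks the short-term memory by conditioning the mean interval on whether the preceding interval was small or large. The toy series and its parameters are assumptions made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy volatility proxy with clustering (not the NYMEX data): |returns| of a
# stochastic-volatility series whose log-volatility follows a slow AR(1)
n = 100000
logvol = np.zeros(n)
for t in range(1, n):
    logvol[t] = 0.99 * logvol[t - 1] + 0.1 * rng.standard_normal()
vol = np.abs(np.exp(logvol) * rng.standard_normal(n))

q = np.quantile(vol, 0.95)              # threshold: top 5% of volatilities
events = np.flatnonzero(vol > q)
tau = np.diff(events)                   # recurrence intervals between exceedances
tau_bar = tau.mean()

# Short-term memory: mean interval conditioned on the size of the preceding interval
small = tau[:-1] <= np.median(tau)
print(f"mean recurrence interval             : {tau_bar:.1f}")
print(f"mean interval after a SMALL interval : {tau[1:][small].mean():.1f}")
print(f"mean interval after a LARGE interval : {tau[1:][~small].mean():.1f}")
```

Because the toy volatility clusters, small intervals tend to follow small ones and large follow large, the same qualitative memory effect reported for the futures data.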
The LIGO/Virgo gravitational-wave observatories have detected 50 BH-BH coalescences. This sample is large enough to have allowed several recent studies to draw conclusions about the relative contributions of isolated binaries and dense stellar clusters as the origin of double BHs. It has also led to the exciting suggestion that the population is highly likely to contain primordial black holes. Here we demonstrate that such conclusions cannot yet be robust, because of the large current uncertainties in several key aspects of binary stellar evolution. These include the development and survival of a common envelope, the mass and angular momentum loss during binary interactions, mixing in stellar interiors, pair-instability mass loss and supernova outbursts. Using standard tools such as the population synthesis codes StarTrack and COMPAS and the detailed stellar evolution code MESA, we examine as a case study the possible future evolution of Melnick 34, the most massive known binary star system. We show that, despite its well-known orbital architecture, various assumptions regarding stellar and binary physics predict a wide variety of outcomes: from a close BH-BH binary (which would lead to a potentially detectable coalescence), through a wide BH-BH binary (which might be seen in microlensing observations) or a Thorne-Zytkow object, to a complete disruption of both objects by pair-instability supernovae. Thus, since the future of massive binaries is inherently uncertain, sound predictions about the properties of BH-BH systems are highly challenging at this time. Consequently, drawing conclusions about the formation channels for the LIGO/Virgo BH-BH merger population is premature.
This paper analyzes correlations in the patterns of trading of different members of the London Stock Exchange. The collection of strategies associated with a member institution is defined by the sequence of signs of the net volume traded by that institution in hourly intervals. Using several methods, we show that there are significant and persistent correlations between institutions. In addition, the correlations are structured into correlated and anti-correlated groups. Clustering techniques using the correlations as a distance metric reveal a meaningful clustering structure with two groups of institutions trading in opposite directions.
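A sketch of the correlation-and-clustering step on synthetic sign sequences (a stand-in for the member data, with two opposing groups built in by construction): correlations between hourly net-volume signs are converted to the distance $\sqrt{2(1-\rho)}$ and passed to average-linkage hierarchical clustering. The group structure and noise level are assumptions chosen only so that the procedure has something to recover.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(3)

# Synthetic stand-in for the member data: hourly net-volume signs for n_inst
# institutions, built from two groups that tend to trade in opposite directions
n_inst, n_hours = 20, 2000
common = np.sign(rng.standard_normal(n_hours))
group = np.repeat([1.0, -1.0], n_inst // 2)
signs = np.sign(group[:, None] * common[None, :]
                + 1.5 * rng.standard_normal((n_inst, n_hours)))

C = np.corrcoef(signs)                                  # institution-by-institution correlations
D = np.sqrt(np.clip(2.0 * (1.0 - C), 0.0, None))        # correlation-based distance metric

# Average-linkage clustering on the condensed distance matrix; cut into two groups
Z = linkage(D[np.triu_indices(n_inst, k=1)], method="average")
print("cluster labels:", fcluster(Z, t=2, criterion="maxclust"))
```

The distance $\sqrt{2(1-\rho)}$ is a standard way to turn a correlation matrix into a metric: perfectly correlated institutions sit at distance 0 and perfectly anti-correlated ones at distance 2, so the two opposing groups separate cleanly.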
The efficient market hypothesis has far-reaching implications for financial trading and market stability. Whether or not cryptocurrencies are informationally efficient has therefore been the subject of intense recent investigation. Here, we use permutation entropy and statistical complexity over sliding time-windows of price log returns to quantify the dynamic efficiency of more than four hundred cryptocurrencies. We consider a cryptocurrency to be efficient within a time-window when these two complexity measures are statistically indistinguishable from their values obtained on randomly shuffled data. We find that 37% of the cryptocurrencies in our study stay efficient over 80% of the time, whereas 20% are informationally efficient less than 20% of the time. Our results also show that efficiency is not correlated with the market capitalization of the cryptocurrencies. A dynamic analysis of informational efficiency over time reveals clustering patterns in which different cryptocurrencies with similar temporal patterns form four clusters; moreover, younger currencies in each group appear poised to follow the trend of their elders. The cryptocurrency market thus already shows notable adherence to the efficient market hypothesis, although the data also reveal that the coming-of-age of digital currencies is, in this regard, still very much underway.
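A minimal sketch of the window-level efficiency test is given below, restricted to permutation entropy (the statistical-complexity half is omitted): it compares the entropy of a synthetic return series with that of its shuffled surrogate, using white noise as an "efficient" stand-in and a persistent AR(1) series as an "inefficient" one. No real cryptocurrency data are used, and the embedding order is an arbitrary illustrative choice.

```python
import numpy as np
from math import factorial

def permutation_entropy(x, order=4):
    """Normalized Bandt-Pompe permutation entropy of a 1-D series."""
    counts = {}
    for i in range(len(x) - order + 1):
        pattern = tuple(np.argsort(x[i:i + order]))   # ordinal pattern of the window
        counts[pattern] = counts.get(pattern, 0) + 1
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    return float(-(p * np.log(p)).sum() / np.log(factorial(order)))

rng = np.random.default_rng(4)
n = 5000
white = rng.standard_normal(n)                 # stands in for an "efficient" return series
ar1 = np.zeros(n)                              # persistent AR(1): an "inefficient" series
for t in range(1, n):
    ar1[t] = 0.6 * ar1[t - 1] + rng.standard_normal()

for name, series in [("white noise", white), ("AR(1) returns", ar1)]:
    h, h_shuf = permutation_entropy(series), permutation_entropy(rng.permutation(series))
    print(f"{name:13s}  H = {h:.3f}   H(shuffled) = {h_shuf:.3f}")
```

A series is flagged as efficient when its entropy is statistically indistinguishable from the shuffled benchmark; the white-noise series passes while the persistent series falls visibly below it.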