The mutual fund industry manages about a quarter of the assets in the U.S. stock market and thus plays an important role in the U.S. economy. The question of how much control is concentrated in the hands of the largest players is best quantitatively discussed in terms of the tail behavior of the mutual fund size distribution. We study the distribution empirically and show that the tail is much better described by a log-normal than a power law, indicating less concentration than, for example, personal income. The results are highly statistically significant and are consistent across fifteen years. This contradicts a recent theory concerning the origin of the power law tails of the trading volume distribution. Based on the analysis in a companion paper, the log-normality is to be expected, and indicates that the distribution of mutual funds remains perpetually out of equilibrium.
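A minimal sketch of the kind of tail comparison described above, applied to a synthetic log-normal sample; the fitting details (in particular the treatment of the truncation point) are simplifications and not the paper's exact statistical procedure:

```python
# Compare a power-law and a log-normal fit to the upper tail of a fund-size
# sample via a log-likelihood ratio. Illustrative sketch only.
import numpy as np
from scipy import stats

def compare_tail_fits(sizes, xmin):
    """Log-likelihood ratio (> 0 favours the log-normal) for sizes >= xmin."""
    tail = np.asarray(sizes, dtype=float)
    tail = tail[tail >= xmin]

    # Power law (Pareto tail): MLE of the exponent alpha for x >= xmin.
    alpha = 1.0 + len(tail) / np.sum(np.log(tail / xmin))
    ll_pl = np.sum(np.log((alpha - 1) / xmin) - alpha * np.log(tail / xmin))

    # Log-normal fitted to the same tail sample. (A full comparison would use
    # a log-normal likelihood properly truncated at xmin; omitted for brevity.)
    shape, loc, scale = stats.lognorm.fit(tail, floc=0)
    ll_ln = np.sum(stats.lognorm.logpdf(tail, shape, loc=loc, scale=scale))

    return ll_ln - ll_pl

# Example with synthetic log-normally distributed "fund sizes".
rng = np.random.default_rng(0)
sizes = rng.lognormal(mean=3.0, sigma=2.0, size=50_000)
print(compare_tail_fits(sizes, xmin=np.quantile(sizes, 0.9)))
```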
Is the large influence that mutual funds exert on the U.S. financial system spread across many funds, or is it concentrated in only a few? We argue that the dominant economic factor that determines this is market efficiency, which dictates that fund performance is size independent and fund growth is essentially random. The random process is characterized by entry, exit and growth. We present a new time-dependent solution for the standard equations used in the industrial organization literature and show that relaxation to the steady-state solution is extremely slow. Thus, even if these processes were stationary (which they are not), the steady-state solution, which is a very heavy-tailed power law, is not relevant. The distribution is instead well-approximated by a less heavy-tailed log-normal. We perform an empirical analysis of the growth of mutual funds, propose a new, more accurate size-dependent model, and show that it makes a good prediction of the empirically observed size distribution. While mutual funds are in many respects like other firms, market efficiency introduces effects that make their growth process distinctly different. Our work shows that a simple model based on market efficiency provides a good explanation of the concentration of assets, suggesting that other effects, such as transaction costs or the behavioral aspects of investor choice, play a smaller role.
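A minimal sketch of the entry/exit/random-growth process referred to above (a Gibrat-style simulation over a finite horizon); the parameter values and the size-independent exit rule are illustrative assumptions:

```python
# Simulate random fund entry, size-independent multiplicative growth, and
# random exit, then inspect the cross-sectional size distribution.
import numpy as np

rng = np.random.default_rng(1)

def simulate_fund_sizes(n_steps=600, n_entries=50, exit_prob=0.01,
                        mu=0.0, sigma=0.25):
    """Size-independent multiplicative growth with random entry and exit."""
    log_sizes = np.empty(0)
    for _ in range(n_steps):
        # New funds enter at a roughly constant rate, at unit size.
        log_sizes = np.concatenate([log_sizes, np.zeros(n_entries)])
        # Gibrat-style growth: size-independent random multiplicative shocks.
        log_sizes += rng.normal(mu, sigma, size=log_sizes.size)
        # Random exit, independent of size (a simplifying assumption).
        log_sizes = log_sizes[rng.random(log_sizes.size) > exit_prob]
    return np.exp(log_sizes)

sizes = simulate_fund_sizes()
# After a finite time the cross-sectional distribution is still close to
# log-normal; the power-law steady state is approached only very slowly.
print(sizes.size, np.mean(np.log(sizes)), np.std(np.log(sizes)))
```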
Phenomena as diverse as breeding bird populations, the size of U.S. firms, money invested in mutual funds, the GDP of individual countries and the scientific output of universities all show unusual but remarkably similar growth fluctuations. The fluctuations display characteristic features, including double exponential scaling in the body of the distribution and power law scaling of the standard deviation as a function of size. To explain this we propose a remarkably simple additive replication model: At each step each individual is replaced by a new number of individuals drawn from the same replication distribution. If the replication distribution is sufficiently heavy tailed then the growth fluctuations are Lévy distributed. We analyze the data from bird populations, firms, and mutual funds and show that our predictions match the data well in several respects: Our theory results in a much better collapse of the individual distributions onto a single curve and also correctly predicts the scaling of the standard deviation with size. To illustrate how this can emerge from collective microscopic dynamics we propose a model based on stochastic influence dynamics over a scale-free contact network and show that it produces results similar to those observed. We also extend the model to deal with correlations between individual elements. Our main conclusion is that the universality of growth fluctuations is driven by the additivity of growth processes and the action of the generalized central limit theorem.
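A minimal sketch of the additive replication step described above: each entity of size N is replaced by the sum of N independent draws from a single replication distribution. The particular replication distribution and parameters are illustrative assumptions:

```python
# One replication step for many entities of equal size, with a heavy-tailed
# replication (offspring) distribution. Illustrative sketch only.
import numpy as np

rng = np.random.default_rng(2)

def one_step(sizes, replication_draw):
    """Replace each entity of size N by the sum of N iid replication draws."""
    return np.array([replication_draw(int(n)).sum() for n in sizes])

# A heavy-tailed replication distribution (Pareto-like offspring numbers).
def heavy_tailed_offspring(n):
    return np.floor(rng.pareto(1.5, size=n) + 1).astype(int)

sizes = np.full(1000, 10_000)           # many entities of equal initial size
new_sizes = one_step(sizes, heavy_tailed_offspring)
growth = np.log(new_sizes / sizes)      # logarithmic growth rates

# With a sufficiently heavy-tailed replication distribution (infinite
# variance), the growth fluctuations are Levy-like rather than Gaussian.
print(np.percentile(growth, [1, 50, 99]))
```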
We study the costs of coal-fired electricity in the United States between 1882 and 2006 by decomposing them into the price of coal, transportation costs, energy density, thermal efficiency, plant construction cost, interest rate, capacity factor, and operations and maintenance cost. The dominant determinants of costs have been the price of coal and plant construction cost. The price of coal appears to fluctuate more or less randomly while the construction cost follows long-term trends, decreasing from 1902 to 1970, increasing from 1970 to 1990, and leveling off since then. Our analysis emphasizes the importance of using long time series and comparing electricity generation technologies using decomposed total costs, rather than costs of single components like capital. By taking this approach we find that the history of coal-fired electricity costs suggests there is a fluctuating floor to its future costs, which is determined by coal prices. Even if construction costs resumed a decreasing trend, the cost of coal-based electricity would drop for a while but eventually be determined by the price of coal, which fluctuates while showing no long-term trend.
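A minimal sketch of how the listed components combine into a levelized cost per kWh; the formula layout and all numbers below are illustrative, not the paper's data:

```python
# Levelized cost of coal-fired electricity built from the decomposed
# components named above. Illustrative sketch with round numbers.

def levelized_cost(coal_price,        # $/tonne delivered (incl. transport)
                   energy_density,    # kWh(thermal) per tonne of coal
                   thermal_eff,       # kWh(electric) per kWh(thermal)
                   construction_cost, # $ per kW of capacity
                   interest_rate,     # per year
                   lifetime_years,
                   capacity_factor,   # fraction of hours at full output
                   om_cost):          # $ per kWh, operations & maintenance
    # Capital recovery factor spreads construction cost over the lifetime.
    r, n = interest_rate, lifetime_years
    crf = r * (1 + r) ** n / ((1 + r) ** n - 1)
    capital_per_kwh = construction_cost * crf / (capacity_factor * 8760)
    fuel_per_kwh = coal_price / (energy_density * thermal_eff)
    return capital_per_kwh + fuel_per_kwh + om_cost

# Example with illustrative numbers (result in US$ per kWh).
print(levelized_cost(coal_price=50, energy_density=6500, thermal_eff=0.35,
                     construction_cost=1500, interest_rate=0.07,
                     lifetime_years=30, capacity_factor=0.7, om_cost=0.005))
```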
We build a simple model of leveraged asset purchases with margin calls. Investment funds use what is perhaps the most basic financial strategy, called value investing, i.e. systematically attempting to buy underpriced assets. When funds do not borrow, the price fluctuations of the asset are normally distributed and uncorrelated across time. All this changes when the funds are allowed to leverage, i.e. borrow from a bank, to purchase more assets than their wealth would otherwise permit. During good times competition drives investors to funds that use more leverage, because they have higher profits. As leverage increases price fluctuations become heavy tailed and display clustered volatility, similar to what is observed in real markets. Previous explanations of fat tails and clustered volatility depended on irrational behavior, such as trend following. Here instead this comes from the fact that leverage limits cause funds to sell into a falling market: A prudent bank makes itself locally safer by putting a limit to leverage, so when a fund exceeds its leverage limit, it must partially repay its loan by selling the asset. Unfortunately this sometimes happens to all the funds simultaneously when the price is already falling. The resulting nonlinear feedback amplifies large downward price movements. At the extreme this causes crashes, but the effect is seen at every time scale, producing a power law of price disturbances. A standard (supposedly more sophisticated) risk control policy in which individual banks base leverage limits on volatility causes leverage to rise during periods of low volatility, and to contract more quickly when volatility gets high, making these extreme fluctuations even worse.
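A minimal sketch of the forced-selling mechanism described above, with a single leveraged value investor, mean-reverting noise traders, and a linear price-impact rule; all parameters and functional forms are illustrative assumptions, not the calibrated model of the paper:

```python
# A leveraged value investor with a hard leverage cap trading against a
# mean-reverting noise process. Forced deleveraging after losses amplifies
# downward moves. Illustrative sketch only.
import numpy as np

rng = np.random.default_rng(3)

V = 1.0                       # perceived fundamental value
beta, lev_cap = 100.0, 10.0   # value-investing aggressiveness, leverage cap
impact = 0.02                 # linear price impact per unit traded

p, cash, shares = 1.0, 0.1, 0.0
log_prices = [0.0]

for t in range(100_000):
    # Noise traders: the price mean-reverts toward V with Gaussian shocks.
    p = np.exp(0.99 * np.log(p) + 0.01 * np.log(V) + rng.normal(0, 0.01))
    wealth = cash + shares * p
    if wealth <= 0:                          # fund defaults and is replaced
        cash, shares, wealth = 0.1, 0.0, 0.1
    # Value investing: position value grows with mispricing, capped by the
    # leverage limit lev_cap * wealth.
    frac = min(beta * max(V - p, 0.0), lev_cap)
    target = frac * wealth / p
    trade = target - shares                  # negative = (possibly forced) sale
    p = max(p * (1 + impact * trade), 1e-8)  # the trade moves the price
    cash -= trade * p                        # execute at the post-impact price
    shares = target
    # Investor flows keep the fund's scale roughly stationary (an assumption).
    cash -= 0.05 * (cash + shares * p - 0.1)
    log_prices.append(np.log(p))

r = np.diff(log_prices)
print("excess kurtosis of log-returns:",
      ((r - r.mean())**4).mean() / r.var()**2 - 3)
```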
We study a simple model for the evolution of the cost (or more generally the performance) of a technology or production process. The technology can be decomposed into $n$ components, each of which interacts with a cluster of $d-1$ other, dependent components. Innovation occurs through a series of trial-and-error events, each of which consists of randomly changing the cost of each component in a cluster, and accepting the changes only if the total cost of the entire cluster is lowered. We show that the relationship between the cost of the whole technology and the number of innovation attempts is asymptotically a power law, matching the functional form often observed for empirical data. The exponent $\alpha$ of the power law depends on the intrinsic difficulty of finding better components, and on what we term the {\it design complexity}: The more complex the design, the slower the rate of improvement. Letting $d$ as defined above be the connectivity, in the special case in which the connectivity is constant, the design complexity is simply the connectivity. When the connectivity varies, bottlenecks can arise in which a few components limit progress. In this case the design complexity is more complicated, depending on the details of the design. The number of bottlenecks also determines whether progress is steady, or whether there are periods of stasis punctuated by occasional large changes. Our model connects the engineering properties of a design to historical studies of technology improvement.
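A minimal sketch of the trial-and-error innovation dynamics described above; the cost distribution, cluster construction, and parameter values are illustrative assumptions:

```python
# n components, each in a cluster of size d; an innovation attempt redraws
# the costs of one cluster and is accepted only if the cluster total falls.
import numpy as np

rng = np.random.default_rng(4)

def simulate(n=100, d=5, attempts=500_000, record_every=5_000):
    costs = rng.random(n)                       # initial component costs
    # Fixed random clusters: component i plus d-1 other components.
    clusters = [np.concatenate(([i], rng.choice(np.delete(np.arange(n), i),
                                                d - 1, replace=False)))
                for i in range(n)]
    history = []
    for t in range(attempts):
        cl = clusters[rng.integers(n)]
        proposal = rng.random(d)                # candidate costs for the cluster
        if proposal.sum() < costs[cl].sum():    # accept only if the cluster improves
            costs[cl] = proposal
        if t % record_every == 0:
            history.append((t + 1, costs.sum()))
    return np.array(history)

hist = simulate()
# Asymptotically, total cost ~ (number of attempts)^(-alpha); the slope of
# log(cost) vs log(attempts) estimates the improvement exponent.
t, c = hist[:, 0], hist[:, 1]
alpha = -np.polyfit(np.log(t[len(t)//2:]), np.log(c[len(t)//2:]), 1)[0]
print("estimated improvement exponent alpha:", round(alpha, 3))
```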
In this article we revisit the classic problem of tatonnement in price formation from a microstructure point of view, reviewing a recent body of theoretical and empirical work explaining how fluctuations in supply and demand are slowly incorporated into prices. Because revealed market liquidity is extremely low, large orders to buy or sell can only be traded incrementally, over periods of time as long as months. As a result order flow is a highly persistent long-memory process. Maintaining compatibility with market efficiency has profound consequences on price formation, on the dynamics of liquidity, and on the nature of impact. We review a body of theory that makes detailed quantitative predictions about the volume and time dependence of market impact, the bid-ask spread, order book dynamics, and volatility. Comparisons to data yield some encouraging successes. This framework suggests a novel interpretation of financial information, in which agents are at best only weakly informed and all have a similar and extremely noisy impact on prices. Most of the processed information appears to come from supply and demand itself, rather than from external news. The ideas reviewed here are relevant to market microstructure regulation, agent-based models, cost-optimal execution strategies, and understanding market ecologies.
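A minimal sketch of one mechanism reviewed above, in which incremental execution of heavy-tailed large orders produces long-memory order flow; the size distribution and parameters are illustrative assumptions:

```python
# Build an order-sign series from metaorders executed one child order at a
# time, then measure the autocorrelation of signs. Illustrative sketch only.
import numpy as np

rng = np.random.default_rng(5)

def order_sign_series(n_meta=20_000, tail_exponent=1.5):
    signs = []
    for _ in range(n_meta):
        length = int(np.ceil(rng.pareto(tail_exponent) + 1))  # heavy-tailed size
        signs.extend([rng.choice([-1, 1])] * length)          # buy or sell run
    return np.array(signs)

def autocorr(x, lags):
    x = x - x.mean()
    return np.array([np.dot(x[:-k], x[k:]) / np.dot(x, x) for k in lags])

s = order_sign_series()
lags = np.array([1, 2, 4, 8, 16, 32, 64, 128])
# With an infinite-variance metaorder size distribution the sign
# autocorrelation decays roughly as a power law (long memory), even though
# each metaorder's direction is chosen independently.
print(dict(zip(lags.tolist(), np.round(autocorr(s, lags), 3))))
```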
The use of equilibrium models in economics springs from the desire for parsimonious models of economic phenomena that take human reasoning into account. This approach has been the cornerstone of modern economic theory. We explain why this is so, extolling the virtues of equilibrium theory; then we present a critique and describe why this approach is inherently limited, and why economics needs to move in new directions if it is to continue to make progress. We stress that this shouldn't be a question of dogma, but should be resolved empirically. There are situations where equilibrium models provide useful predictions and there are situations where they can never provide useful predictions. There are also many situations where the jury is still out, i.e., where so far they fail to provide a good description of the world, but where proper extensions might change this. Our goal is to convince the skeptics that equilibrium models can be useful, but also to make traditional economists more aware of the limitations of equilibrium models. We sketch some alternative approaches and discuss why they should play an important role in future research in economics.
We review the interface between (theoretical) physics and information for non-experts. The origin of information as related to the notion of entropy is described, first in the context of thermodynamics and then in the context of statistical mechanics. A close examination of the foundations of statistical mechanics and the need to reconcile the probabilistic and deterministic views of the world leads us to a discussion of chaotic dynamics, where information plays a crucial role in quantifying predictability. We then discuss a variety of fundamental issues that emerge in defining information and how one must exercise care in discussing concepts such as order, disorder, and incomplete knowledge. We also discuss an alternative form of entropy and its possible relevance for nonequilibrium thermodynamics. In the final part of the paper we discuss how quantum mechanics gives rise to the very different concept of quantum information. Entirely new possibilities for information storage and computation arise due to the massive parallel processing inherent in quantum mechanics. We also point out how entropy can be extended to apply to quantum mechanics to provide a useful measure of quantum entanglement. Finally we make a small excursion to the interface between quantum theory and general relativity, where one is confronted with an ultimate information paradox posed by the physics of black holes. In this review we have limited ourselves; not all relevant topics that touch on physics and information could be covered.
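As a point of reference for the entropy-information connection discussed above, the Gibbs entropy of statistical mechanics and the Shannon information of the same probability distribution differ only by a constant factor (a standard identity, not a result of this review):

```latex
% Gibbs entropy of a distribution p_i over microstates, Shannon information
% of the same distribution in bits, and the constant relating them:
S = -k_B \sum_i p_i \ln p_i , \qquad
H = -\sum_i p_i \log_2 p_i , \qquad
S = (k_B \ln 2)\, H .
```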
This paper analyzes correlations in patterns of trading of different members of the London Stock Exchange. The collection of strategies associated with a member institution is defined by the sequence of signs of net volume traded by that institution in hour intervals. Using several methods we show that there are significant and persistent correlations between institutions. In addition, the correlations are structured into correlated and anti-correlated groups. Clustering techniques using the correlations as a distance metric reveal a meaningful clustering structure with two groups of institutions trading in opposite directions.
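A minimal sketch of the kind of analysis described above, on synthetic data: correlate the hourly net-trade signs of different institutions and cluster them with a correlation-based distance. The specific clustering choices here are illustrative, not necessarily those used in the paper:

```python
# Correlate sign-of-net-volume series across institutions and apply
# hierarchical clustering with a correlation-based distance.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(6)

# Synthetic example: two groups of "institutions" trading in opposite
# directions around a common factor, plus idiosyncratic noise.
n_hours, n_inst = 2000, 20
factor = rng.choice([-1, 1], size=n_hours)
signs = np.sign(np.outer(factor, np.r_[np.ones(10), -np.ones(10)])
                + rng.normal(0, 1.5, size=(n_hours, n_inst)))

corr = np.corrcoef(signs, rowvar=False)
# A standard correlation-based distance: d_ij = sqrt(2 * (1 - rho_ij)).
dist = np.sqrt(2 * (1 - corr))
condensed = dist[np.triu_indices(n_inst, k=1)]   # condensed form for scipy
labels = fcluster(linkage(condensed, method="average"),
                  t=2, criterion="maxclust")
print(labels)   # ideally recovers the two opposite-trading groups
```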