
Firm competition in a probabilistic framework of consumer choice

Published by: Matus Medo
Publication date: 2013
Research field: Finance; Physics
Paper language: English





We develop a probabilistic consumer choice framework based on information asymmetry between consumers and firms. This framework makes it possible to study market competition among several firms on both the quality and the price of their products. We find Nash market equilibria and other optimal strategies in various situations, ranging from competition between two identical firms to firms of different sizes and firms which improve their efficiency.
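To make the setting concrete, here is a minimal toy sketch (not the paper's actual model) of two firms competing on price for consumers who choose probabilistically: choice follows a logit rule in perceived surplus, with a noise parameter standing in for the consumers' imperfect information, and a candidate Nash equilibrium is found by iterated best response on a price grid. The logit form, all names, and all parameter values are illustrative assumptions.

```python
import numpy as np

beta = 2.0                         # how well informed consumers are (assumed)
quality = np.array([1.0, 1.0])     # product qualities (two identical firms)
cost = np.array([0.2, 0.2])        # unit production costs
grid = np.linspace(0.0, 2.0, 201)  # admissible prices

def market_shares(prices):
    """Logit choice probabilities over the two firms."""
    utility = beta * (quality - prices)
    weights = np.exp(utility - utility.max())
    return weights / weights.sum()

def best_response(rival_price, firm):
    """Price on the grid maximizing this firm's expected profit."""
    best_p, best_profit = grid[0], -np.inf
    for p in grid:
        prices = np.array([p, rival_price]) if firm == 0 else np.array([rival_price, p])
        profit = market_shares(prices)[firm] * (p - cost[firm])
        if profit > best_profit:
            best_p, best_profit = p, profit
    return best_p

# Iterate best responses until prices stop changing (a candidate Nash equilibrium).
prices = np.array([1.0, 1.0])
for _ in range(100):
    new = np.array([best_response(prices[1], 0), best_response(prices[0], 1)])
    if np.allclose(new, prices):
        break
    prices = new
print("candidate equilibrium prices:", prices)
```

For identical firms the iteration settles on a symmetric price pair; raising beta (better-informed consumers) pushes the equilibrium prices towards the production cost.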




Read also

Alan Roncoroni, Matus Medo (2016)
Models of spatial firm competition assume that customers are distributed in space and that transportation costs are associated with their purchases of products from a small number of firms that are also placed at definite locations. It has long been known that the competition equilibrium is not guaranteed to exist if the most straightforward linear transportation costs are assumed. We show by simulations, and also analytically, that if periodic boundary conditions in two dimensions are assumed, the equilibrium exists for a pair of firms at any distance. When a larger number of firms is considered, we find that their total equilibrium profit is inversely proportional to the square root of the number of firms. We end with a numerical investigation of the system's behavior for a general transportation cost exponent.
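A rough numerical sketch in the spirit of this setup (my own simplification, not the authors' code): consumers sit on a regular grid over a two-dimensional torus, each buys from the firm with the lowest price plus linear transportation cost, and two firms adjust prices by best response. The firm positions, grid resolution, and cost coefficient below are illustrative assumptions.

```python
import numpy as np

L = 1.0                      # side of the periodic square
res = 100                    # consumer grid resolution
t = 1.0                      # linear transportation cost per unit distance
firms = np.array([[0.25, 0.5], [0.75, 0.5]])   # two firms, half a period apart

xs, ys = np.meshgrid(np.linspace(0, L, res, endpoint=False),
                     np.linspace(0, L, res, endpoint=False))
consumers = np.stack([xs.ravel(), ys.ravel()], axis=1)

def torus_distance(points, firm_pos):
    d = np.abs(points - firm_pos)
    d = np.minimum(d, L - d)          # wrap around the periodic boundary
    return np.sqrt((d ** 2).sum(axis=1))

dist = np.stack([torus_distance(consumers, f) for f in firms], axis=1)

def profits(prices):
    delivered = prices + t * dist             # price plus transport cost seen by consumers
    choice = delivered.argmin(axis=1)         # each consumer picks the cheaper option
    share = np.bincount(choice, minlength=len(firms)) / len(consumers)
    return share * prices                     # zero production cost assumed

# Iterated best response on a price grid.
grid = np.linspace(0.01, 2.0, 200)
prices = np.array([1.0, 1.0])
for _ in range(50):
    new = prices.copy()
    for i in range(len(firms)):
        candidates = [profits(np.where(np.arange(len(firms)) == i, p, prices))[i] for p in grid]
        new[i] = grid[int(np.argmax(candidates))]
    if np.allclose(new, prices):
        break
    prices = new
print("candidate equilibrium prices:", prices)
```

Because the consumer grid is discrete, ties at cell boundaries are broken arbitrarily by argmin; that is harmless for a qualitative check of whether the price iteration settles down.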
We introduce a fully probabilistic framework of consumer product choice based on quality assessment. It allows us to capture many aspects of marketing such as partial information asymmetry, quality differentiation, and product placement in a supermarket.
We present a simple agent-based model to study the development of a bubble and the subsequent crash, and investigate how their proximate triggering factor might relate to their fundamental mechanism, and vice versa. Our agents invest according to their opinion on future price movements, which is based on three sources of information: (i) public information, i.e. news, (ii) information from their friendship network, and (iii) private information. Our boundedly rational agents continuously adapt their trading strategy to the current market regime by weighting each of these sources of information in their trading decision according to its recent predictive performance. We find that bubbles originate from a random lucky streak of positive news which, through a feedback mechanism of this news on the agents' strategies, develops into a transient collective herding regime. After this self-amplified exuberance, the price reaches an unsustainably high value and is corrected by a crash, which brings the price even below its fundamental value. These ingredients provide a simple mechanism for the excess volatility documented in financial markets. Paradoxically, it is the investors' attempt to adapt to the current market regime that leads to a dramatic amplification of the price volatility. A positive feedback loop is created by the two dominating mechanisms (adaptation and imitation) which, by reinforcing each other, result in bubbles and crashes. The model offers a simple reconciliation of the two opposite proposals for the origin of crashes (herding versus fundamental) within a single framework, and justifies the existence of two populations in the distribution of returns, exemplifying the concept that crashes are qualitatively different from the rest of the price moves.
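A stripped-down illustration (my own simplification, not the authors' code) of the adaptive-weighting ingredient described above: each agent weights public news, the average opinion of its neighbours, and a private signal, and the weights are nudged towards whichever source recently predicted the sign of the price move correctly. The network, the update rule, and all constants are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_agents, n_steps = 200, 500
neighbors = [rng.choice(n_agents, size=5, replace=False) for _ in range(n_agents)]
weights = np.full((n_agents, 3), 1 / 3)          # weights on [news, friends, private]
opinions = np.zeros(n_agents)
price, prices = 0.0, []

for t in range(n_steps):
    news = rng.normal()                          # public information
    private = rng.normal(size=n_agents)          # private information
    friends = np.array([opinions[nb].mean() for nb in neighbors])
    signals = np.stack([np.full(n_agents, news), friends, private], axis=1)
    opinions = (weights * signals).sum(axis=1)   # weighted opinion on the next move
    ret = opinions.mean() + 0.1 * rng.normal()   # aggregate demand moves the price
    price += ret
    prices.append(price)
    # Reinforce the sources that predicted the sign of the move correctly.
    hit = np.sign(signals) == np.sign(ret)
    weights = weights + 0.05 * hit
    weights /= weights.sum(axis=1, keepdims=True)

print("final price:", round(price, 3))
```

When a streak of same-signed news raises the weight on the public and social channels across many agents at once, opinions align and the price trend self-amplifies, which is the herding effect the abstract describes in qualitative terms.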
Today's consumer goods markets are rapidly evolving, with significant growth in the number of information media as well as in the number of competing products. In this environment, obtaining a quantitative grasp of the heterogeneous interactions of firms and customers, which have attracted the interest of management scientists and economists, requires the analysis of extremely high-dimensional data. Existing approaches in quantitative research cannot handle such data without reliable prior knowledge or strong assumptions. Alternatively, we propose a novel method called complex Hilbert principal component analysis (CHPCA) and construct a synchronization network using Hodge decomposition. CHPCA enables us to extract significant comovements with a time lead/delay in the data, and Hodge decomposition is useful for identifying the time structure of correlations. We apply this method to Japanese beer market data and reveal comovement of variables related to the consumer choice process across multiple products. Furthermore, we find remarkable customer heterogeneity by calculating the coordinates of each customer in the space derived from the results of CHPCA. Lastly, we discuss the policy and managerial implications, limitations, and further development of the proposed method.
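For readers unfamiliar with CHPCA, the following compact sketch shows its core step on synthetic data: each standardized series is made analytic with the Hilbert transform, a complex correlation matrix is formed, and the phases of the leading eigenvector's components indicate lead/lag relations. The synthetic series and the lag below are illustrative, and the Hodge-decomposition step of the paper is not reproduced here.

```python
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(1)
T = 400
base = np.sin(np.linspace(0, 20 * np.pi, T)) + 0.3 * rng.normal(size=T)
# Three series: one leading, one lagging the first by a few steps, one pure noise.
data = np.stack([base,
                 np.roll(base, 5),
                 rng.normal(size=T)], axis=1)

# Standardize, then form analytic signals (real series + i * Hilbert transform).
z = (data - data.mean(axis=0)) / data.std(axis=0)
analytic = hilbert(z, axis=0)

# Complex correlation matrix and its eigendecomposition.
C = analytic.conj().T @ analytic / T
eigval, eigvec = np.linalg.eigh(C)
lead = eigvec[:, -1]                      # eigenvector of the largest eigenvalue

print("largest eigenvalue:", round(eigval[-1].real, 3))
print("component phases (rad):", np.round(np.angle(lead), 3))
```

The phase difference between the first two components reflects the imposed lag, while the noise series contributes little to the leading mode; this is the "comovement with a time lead/delay" that the abstract refers to.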
Reginald D. Smith (2012)
In this paper we analyze Gresham's Law, in particular how the rate of inflow or outflow of currencies is affected by the demand elasticity of arbitrage and by the difference in face-value ratios inside and outside of a country under a bimetallic system. We find that these equations are very similar to those used to describe drift in systems of free charged particles. In addition, we look at how Gresham's Law would play out with multiple currencies and multiple countries under a variety of connecting topologies.
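As a toy analogy only (my own illustration, not the paper's actual equations), one can picture the stock of the undervalued coin in a country draining at a rate proportional to the gap between the domestic and foreign face-value ratios, with the demand elasticity of arbitrage acting like the mobility coefficient in the drift of free charges.

```python
# Toy drift-like outflow of the undervalued coin; all numbers are illustrative.
elasticity = 0.8                         # "mobility" of arbitrage (assumed)
ratio_home, ratio_abroad = 15.5, 16.0    # silver-to-gold face-value ratios
stock = 100.0                            # initial stock of the undervalued coin at home
dt, T = 0.1, 50.0

steps = int(T / dt)
for _ in range(steps):
    outflow = elasticity * max(ratio_abroad - ratio_home, 0.0) * stock
    stock -= outflow * dt

print("remaining stock after %.0f periods: %.2f" % (T, stock))
```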