
The unreasonable effectiveness of optimal transport in economics

Added by Alfred Galichon
Publication date: 2021
Fields: Economics, Finance
Language: English





Optimal transport has become part of the standard quantitative economics toolbox. It is the framework of choice for describing models of matching with transfers, but beyond that, it allows one to: extend quantile regression; identify discrete choice models; provide new algorithms for computing the random coefficient logit model; and generalize the gravity model in trade. This paper offers a brief review of the basics of the theory, its applications to economics, and some extensions.
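For readers new to the framework, the basic Kantorovich formulation of optimal transport behind matching with transfers can be written (in generic notation, not taken from the paper) as

    \max_{\pi \in \Pi(\mu,\nu)} \int_{X \times Y} \Phi(x,y) \, d\pi(x,y),

where \mu and \nu are the distributions of characteristics on the two sides of the market (say, workers and firms), \Phi(x,y) is the joint surplus from matching x with y, and \Pi(\mu,\nu) is the set of couplings (matchings) with marginals \mu and \nu. The cost-minimization version replaces \Phi by a cost c(x,y) and the maximum by a minimum.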



Related research

In this paper, we first revisit the Koenker and Bassett variational approach to (univariate) quantile regression, emphasizing its link with latent factor representations and correlation maximization problems. We then review the multivariate extension due to Carlier et al. (2016, 2017) which relates vector quantile regression to an optimal transport problem with mean independence constraints. We introduce an entropic regularization of this problem, implement a gradient descent numerical method and illustrate its feasibility on univariate and bivariate examples.
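As a concrete illustration of the entropic-regularization idea mentioned above, here is a minimal sketch (in Python) of entropy-regularized optimal transport between two discrete measures, solved by standard Sinkhorn iterations. It is only a generic example, not the authors' mean-independence-constrained algorithm; the function name and the parameters eps and n_iter are illustrative choices.

# Entropy-regularized optimal transport via Sinkhorn iterations (sketch).
import numpy as np

def sinkhorn(mu, nu, cost, eps=0.05, n_iter=500):
    """Approximate optimal coupling between discrete measures mu and nu."""
    K = np.exp(-cost / eps)              # Gibbs kernel from the cost matrix
    u = np.ones_like(mu)
    v = np.ones_like(nu)
    for _ in range(n_iter):              # alternating dual (scaling) updates
        u = mu / (K @ v)
        v = nu / (K.T @ u)
    return u[:, None] * K * v[None, :]   # coupling pi = diag(u) K diag(v)

# Toy univariate example: two uniform discrete measures, squared-distance cost.
x = np.linspace(0.0, 1.0, 50)
y = np.linspace(0.0, 1.0, 60)
mu = np.full(50, 1.0 / 50)
nu = np.full(60, 1.0 / 60)
cost = (x[:, None] - y[None, :]) ** 2
pi = sinkhorn(mu, nu, cost)
print("marginal errors:", abs(pi.sum(axis=1) - mu).max(), abs(pi.sum(axis=0) - nu).max())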
Tianyong Zhou (2021)
The existing theory of development economics and transition economics is probably inadequate, and perhaps even flawed, for accurately explaining and analyzing a dual economic system such as China's. China is a country in transition in both structure and system, and the reform of its economic system has gone through a long period of transformation. Because of this dual system, the allocation of factors is subject to dual regulation by planning (or administration) and by the market, so signal distortion is common. From the perspective of balanced and safe growth, institutional distortions in population birth, population flow, land transactions and housing supply, together with changes in exports, may strongly influence production and demand, including the iterative contraction of consumption, higher export competitive costs, a widening urban-rural income gap, the transfer of residents' income and the crowding out of consumption. In view of the worldwide shift from a conservative model (income exceeding expenditure) to a debt-based model (expenditure exceeding income) and the need for loose monetary policy, we must explore a basic model that includes debt and land-asset variables affecting money supply and price changes, especially in China, where the current debt ratio is high and likely to keep rising. Based on this logical framework of dual-system economics and its analysis method, a preliminary calculation system is formed through the establishment of models.
Despite the widespread consensus on the brain's complexity, sprouts of the single-neuron revolution emerged in neuroscience in the 1970s. They brought many unexpected discoveries, including grandmother (or concept) cells and sparse coding of information in the brain. In machine learning, the famous curse of dimensionality long seemed an unsolvable problem. Nevertheless, the idea of the blessing of dimensionality has gradually become more popular. Ensembles of non-interacting or weakly interacting simple units prove to be an effective tool for solving essentially multidimensional problems. This approach is especially useful for one-shot (non-iterative) correction of errors in large legacy artificial intelligence systems. These simplicity revolutions in the era of complexity have deep fundamental reasons grounded in the geometry of multidimensional data spaces. To explore and understand these reasons, we revisit the background ideas of statistical physics, which were developed over the course of the 20th century into the theory of concentration of measure. New stochastic separation theorems reveal the fine structure of the data clouds. We review and analyse biological, physical, and mathematical problems at the core of the fundamental question: how can a high-dimensional brain organise reliable and fast learning in a high-dimensional world of data using simple tools? Two critical applications are reviewed to exemplify the approach: one-shot correction of errors in intellectual systems and the emergence of static and associative memories in ensembles of single neurons.
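To make the blessing-of-dimensionality point concrete, here is a small numerical sketch (my own illustration, not taken from the paper): for standard Gaussian data, a randomly chosen sample point x becomes, with probability approaching one as the dimension grows, separable from every other sample point by the simple linear test <x, y> < <x, x>. This is the geometric fact exploited by one-shot linear correctors; the function name, sample sizes and dimensions below are arbitrary.

# Fraction of trials in which point 0 is linearly separable from the rest
# of a random Gaussian sample, as a function of dimension (sketch).
import numpy as np

rng = np.random.default_rng(0)

def fraction_separable(dim, n_points=500, n_trials=100):
    ok = 0
    for _ in range(n_trials):
        X = rng.standard_normal((n_points, dim))
        x, rest = X[0], X[1:]
        if np.all(rest @ x < x @ x):     # separability test <x, y> < <x, x>
            ok += 1
    return ok / n_trials

for d in (2, 10, 100, 1000):
    print(d, fraction_separable(d))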
Working with data in table form is usually considered a preparatory and tedious step in the sensemaking pipeline; a way of getting the data ready for more sophisticated visualization and analytical tools. But for many people, spreadsheets -- the quintessential table tool -- remain a critical part of their information ecosystem, allowing them to interact with their data in ways that are hidden or abstracted in more complex tools. This is particularly true for data workers: people who work with data as part of their job but do not identify as professional analysts or data scientists. We report on a qualitative study of how these workers interact with and reason about their data. Our findings show that data tables serve a broader purpose beyond data cleanup at the initial stage of a linear analytic flow: users want to see and get their hands on the underlying data throughout the analytics process, reshaping and augmenting it to support sensemaking. They reorganize, mark up, layer on levels of detail, and spawn alternatives within the context of the base data. These direct interactions and human-readable table representations form a rich and cognitively important part of building understanding of what the data mean and what can be done with them. We argue that interactive tables are an important visualization idiom in their own right; that the direct data interaction they afford offers a fertile design space for visual analytics; and that sensemaking can be enriched by more flexible human-data interaction than is currently supported in visual analytics tools.
Roman Jackiw (1996)
Quantum field theory offers physicists a tremendously wide range of applications: it is both a language with which a vast variety of physical processes can be discussed and a model for fundamental physics, the so-called "standard model", which thus far has passed every experimental test. No other framework exists in which one can calculate so many phenomena with such ease and accuracy. Nevertheless, today some physicists have doubts about quantum field theory, and here I want to examine these reservations.
