
Breaking the curse of dimension in multi-marginal Kantorovich optimal transport on finite state spaces

Added by Gero Friesecke
Publication date: 2017
Field: Physics
Language: English





We present a new ansatz space for the general symmetric multi-marginal Kantorovich optimal transport problem on finite state spaces which reduces the number of unknowns from $\tbinom{N+\ell-1}{\ell-1}$ to $\ell\cdot(N+1)$, where $\ell$ is the number of marginal states and $N$ the number of marginals. The new ansatz space is a careful low-dimensional enlargement of the Monge class, which corresponds to $\ell\cdot(N-1)$ unknowns, and cures the insufficiency of the Monge ansatz, i.e. we show that the Kantorovich problem always admits a minimizer in the enlarged class, for arbitrary cost functions. Our results apply, in particular, to the discretization of multi-marginal optimal transport with Coulomb cost in three dimensions, which has received much recent interest due to its emergence as the strongly correlated limit of Hohenberg-Kohn density functional theory. In this context $N$ corresponds to the number of particles, motivating the interest in large $N$.
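For orientation, the three dimension counts quoted in the abstract (full symmetric Kantorovich plans, Monge ansatz, enlarged ansatz) can be compared numerically; the minimal Python sketch below uses only the formulas stated above, and the specific values of $N$ and $\ell$ printed are illustrative.

# Compare the number of unknowns in the three parametrizations quoted in the abstract:
# full symmetric Kantorovich plans, the Monge ansatz, and the enlarged ansatz.
from math import comb

def unknown_counts(N, ell):
    return {
        "Kantorovich": comb(N + ell - 1, ell - 1),  # symmetric plans on ell states, N marginals
        "Monge":       ell * (N - 1),
        "enlarged":    ell * (N + 1),
    }

for N in (3, 10, 30):
    print(N, unknown_counts(N, ell=10))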




Related research

Gero Friesecke, 2018
It is known from clever mathematical examples \cite{Ca10} that the Monge ansatz may fail in continuous two-marginal optimal transport (alias optimal coupling alias optimal assignment) problems. Here we show that this effect already occurs for finite assignment problems with $N=3$ marginals, $\ell=3$ sites, and symmetric pairwise costs, with the values for $N$ and $\ell$ both being optimal. Our counterexample is a transparent consequence of the convex geometry of the set of symmetric Kantorovich plans for $N=\ell=3$, which -- as we show -- possesses 22 extreme points, only 7 of which are Monge. These extreme points have a simple physical meaning as irreducible molecular packings, and the example corresponds to finding the minimum energy packing for Frenkel-Kontorova interactions. Our finite example naturally gives rise, by superposition, to a continuous one, where failure of the Monge ansatz manifests itself as nonattainment and formation of microstructure.
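The convex geometry invoked here can be reproduced by brute force: the sketch below enumerates the basic feasible solutions (vertices) of the polytope of symmetric plans for $N=\ell=3$, assuming the uniform marginal $(1/3,1/3,1/3)$; the abstract reports 22 extreme points, 7 of them Monge. Illustrative code, not the author's.

# Enumerate the vertices of { x >= 0 : A x = b }, where columns of A are the
# one-body marginals of symmetrized Dirac plans on 3-particle configurations.
import itertools
import numpy as np

ell, N = 3, 3
configs = list(itertools.combinations_with_replacement(range(ell), N))  # 10 multisets

A = np.zeros((ell, len(configs)))
for m, cfg in enumerate(configs):
    for i in cfg:
        A[i, m] += 1.0 / N        # fraction of particles that configuration m puts on site i
b = np.full(ell, 1.0 / ell)       # assumed uniform one-body marginal

vertices = []
for k in range(1, ell + 1):
    for support in itertools.combinations(range(len(configs)), k):
        A_S = A[:, support]
        if np.linalg.matrix_rank(A_S) < k:
            continue              # dependent columns cannot form a vertex
        x_S, *_ = np.linalg.lstsq(A_S, b, rcond=None)
        if np.allclose(A_S @ x_S, b) and np.all(x_S >= -1e-12):
            x = np.zeros(len(configs))
            x[list(support)] = x_S
            if not any(np.allclose(x, v) for v in vertices):
                vertices.append(x)

print(len(vertices), "extreme points; supports:")
for v in vertices:
    print(sorted(configs[m] for m in np.flatnonzero(np.abs(v) > 1e-12)))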
We study multi-marginal optimal transport, a generalization of optimal transport that allows us to define discrepancies between multiple measures. It provides a framework to solve multi-task learning problems and to perform barycentric averaging. However, multi-marginal distances between multiple measures are typically challenging to compute because they require estimating a transport plan with $N^P$ variables. In this paper, we address this issue in the following way: 1) we efficiently solve the one-dimensional multi-marginal Monge-Wasserstein problem for a classical cost function in closed form, and 2) we propose a higher-dimensional multi-marginal discrepancy via slicing and study its generalized metric properties. We show that computing the sliced multi-marginal discrepancy is massively scalable for a large number of probability measures with support as large as $10^7$ samples. Our approach can be applied to solving problems such as barycentric averaging, multi-task density estimation and multi-task reinforcement learning.
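As a rough illustration of the slicing idea, the sketch below assumes equal sample sizes and the pairwise squared cost $\sum_{i<j}|x_i-x_j|^2$, for which the optimal one-dimensional multi-marginal coupling is the comonotone (sorted) one; the function name and the Monte Carlo projection scheme are illustrative, not the paper's implementation.

import numpy as np

def sliced_multimarginal(samples, n_projections=50, rng=None):
    """samples: list of P arrays of shape (n, d), all with the same n."""
    rng = np.random.default_rng(rng)
    d = samples[0].shape[1]
    total = 0.0
    for _ in range(n_projections):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)
        # project each measure onto the line spanned by theta and sort:
        # in 1D the comonotone coupling is optimal for this pairwise cost
        q = np.stack([np.sort(x @ theta) for x in samples])    # (P, n)
        diff = q[:, None, :] - q[None, :, :]                   # (P, P, n)
        total += 0.5 * np.mean(np.sum(diff ** 2, axis=(0, 1))) # average cost of coupled tuples
    return total / n_projections

# e.g. three point clouds in R^2 with shifted means
rng = np.random.default_rng(0)
clouds = [rng.normal(loc=m, size=(500, 2)) for m in (0.0, 1.0, 2.0)]
print(sliced_multimarginal(clouds))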
We formulate and solve a regression problem with time-stamped distributional data. Distributions are considered as points in the Wasserstein space of probability measures, metrized by the 2-Wasserstein metric, and may represent images, power spectra, point clouds of particles, and so on. The regression seeks a curve in the Wasserstein space that passes closest to the dataset. Our regression problem allows utilizing general curves in a Euclidean setting (linear, quadratic, sinusoidal, and so on), lifted to corresponding measure-valued curves in the Wasserstein space. It can be cast as a multi-marginal optimal transport problem that allows efficient computation. Illustrative academic examples are presented.
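As a toy special case (not the paper's multi-marginal OT formulation): in one dimension the 2-Wasserstein distance equals the $L^2$ distance between quantile functions, so a linear measure-valued regression can be sketched by least squares on quantiles. The helper below is hypothetical and is only exact when the fitted quantiles remain nondecreasing in the quantile variable.

import numpy as np

def fit_linear_wasserstein_curve(times, samples, n_quantiles=100):
    """times: array of T time stamps; samples: list of T one-dimensional sample arrays."""
    u = (np.arange(n_quantiles) + 0.5) / n_quantiles
    Q = np.stack([np.quantile(s, u) for s in samples])    # (T, n_quantiles) empirical quantiles
    X = np.column_stack([np.ones_like(times), times])     # design matrix [1, t]
    coef, *_ = np.linalg.lstsq(X, Q, rcond=None)          # least squares in quantile coordinates
    a, b = coef                                           # intercept and slope quantile curves
    return a, b    # the fitted measure at time t has quantile function a + b * t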
We introduce a simple, accurate, and extremely efficient method for numerically solving the multi-marginal optimal transport (MMOT) problems arising in density functional theory. The method relies on (i) the sparsity of optimal plans [for $N$ marginals discretized by $\ell$ gridpoints each, general Kantorovich plans require $\ell^N$ gridpoints but the support of optimizers is of size $O(\ell\cdot N)$ [FV18]], (ii) the method of column generation (CG) from discrete optimization which to our knowledge has not hitherto been used in MMOT, and (iii) ideas from machine learning. The well-known bottleneck in CG consists in generating new candidate columns efficiently; we prove that in our context, finding the best new column is an NP-complete problem. To overcome this bottleneck we use a genetic learning method tailormade for MMOT in which the dual state within CG plays the role of an adversary, in loose similarity to Wasserstein GANs. On a sequence of benchmark problems with up to 120 gridpoints and up to 30 marginals, our method always found the exact optimizers. Moreover, empirically the number of computational steps needed to find them appears to scale only polynomially when both $N$ and $\ell$ are simultaneously increased (while keeping their ratio fixed to mimic a thermodynamic limit of the particle system).
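To make the column-generation structure concrete, here is a minimal sketch for a symmetric pairwise-cost MMOT on $\ell$ sites; it replaces the paper's genetic/adversarial pricing by brute-force search over all $\tbinom{N+\ell-1}{\ell-1}$ configurations, so it only runs for small $N$ and $\ell$, and all names and signatures are illustrative, not the authors' implementation.

import itertools
import numpy as np
from scipy.optimize import linprog

def mmot_column_generation(cost, marginal, N, max_iter=100):
    """cost: (ell, ell) symmetric pairwise cost matrix; marginal: (ell,) target one-body marginal."""
    ell = len(marginal)

    def col_marginal(cfg):
        # one-body marginal of the symmetrized plan concentrated on configuration cfg
        a = np.zeros(ell)
        for i in cfg:
            a[i] += 1.0 / N
        return a

    def col_cost(cfg):
        # total pairwise cost of one configuration (i_1, ..., i_N)
        return sum(cost[i, j] for i, j in itertools.combinations(cfg, 2))

    cols = [(i,) * N for i in range(ell)]    # feasible start: "all particles on site i"
    for _ in range(max_iter):
        A = np.array([col_marginal(c) for c in cols]).T      # ell x (#columns)
        res = linprog(np.array([col_cost(c) for c in cols]),
                      A_eq=A, b_eq=marginal, bounds=(0, None), method="highs")
        y = res.eqlin.marginals   # duals of the marginal constraints (d objective / d b_eq)
        # pricing step: brute force over all symmetric configurations (tiny instances only)
        best = min(itertools.combinations_with_replacement(range(ell), N),
                   key=lambda cfg: col_cost(cfg) - col_marginal(cfg) @ y)
        if col_cost(best) - col_marginal(best) @ y >= -1e-9 or best in cols:
            break                 # no column with negative reduced cost: restricted LP is optimal
        cols.append(best)
    support = {c: w for c, w in zip(cols, res.x) if w > 1e-12}
    return res.fun, support

# Illustrative use: N = 3 marginals on ell = 6 sites with a regularized Coulomb-type cost
ell, N = 6, 3
grid = np.linspace(0.0, 1.0, ell)
coulomb = 1.0 / (1e-1 + np.abs(grid[:, None] - grid[None, :]))
value, plan = mmot_column_generation(coulomb, np.full(ell, 1.0 / ell), N)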
We discuss a new notion of distance on the space of finite and nonnegative measures which can be seen as a generalization of the well-known Kantorovich-Wasserstein distance. The new distance is based on a dynamical formulation given by an Onsager operator that is the sum of a Wasserstein diffusion part and an additional reaction part describing the generation and absorption of mass. We present a full characterization of the distance and its properties. In fact the distance can be equivalently described by an optimal transport problem on the cone space over the underlying metric space. We give a construction of geodesic curves and discuss their properties.
