
Approximation by finitely supported measures

Added by Benoit Kloeckner
Publication date: 2010
Language: English





Given a compactly supported probability measure on a Riemannian manifold, we study the asymptotic speed at which it can be approximated (in Wasserstein distance of any exponent $p$) by finitely supported measures. This question has been studied under the names of "quantization of distributions" and, when $p=1$, the "location problem". When $p=2$, it is linked with Centroidal Voronoi Tessellations.
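The $p=2$ case mentioned above can be illustrated with Lloyd's algorithm, whose fixed points are Centroidal Voronoi Tessellations. The sketch below quantizes an empirical measure by an $n$-point measure; the sample distribution, function names, and parameters are illustrative, not taken from the paper.

```python
import numpy as np

def lloyd_quantize(samples, n_points, n_iters=50, seed=0):
    """Return support points and weights of an n-point approximation
    of the empirical measure of `samples` (W_2 quantization sketch)."""
    rng = np.random.default_rng(seed)
    # initialise the support from randomly chosen samples
    centers = samples[rng.choice(len(samples), n_points, replace=False)]
    for _ in range(n_iters):
        # assign each sample to its nearest support point (its Voronoi cell)
        d2 = ((samples[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d2.argmin(axis=1)
        # move each support point to the centroid of its cell
        for j in range(n_points):
            cell = samples[labels == j]
            if len(cell):
                centers[j] = cell.mean(axis=0)
    # weight each support point by the mass of its cell
    weights = np.bincount(labels, minlength=n_points) / len(samples)
    return centers, weights

rng = np.random.default_rng(1)
samples = rng.standard_normal((2000, 2))   # toy measure: 2000 Gaussian samples
centers, weights = lloyd_quantize(samples, 16)
```

The resulting pair (centers, weights) defines a 16-point measure whose $W_2$ distance to the empirical measure decreases as the number of support points grows, which is the asymptotic regime the paper studies.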



Related research

The theory of finitely supported algebraic structures represents a reformulation of Zermelo-Fraenkel set theory in which every construction is finitely supported according to the action of a group of permutations of some basic elements named atoms. In this paper we study the properties of finitely supported sets that contain infinite uniformly supported subsets, as well as the properties of finitely supported sets that do not contain infinite uniformly supported subsets. For classical atomic sets, we study whether or not they contain infinite uniformly supported subsets.
In this paper it is proved that if a minimal system has the property that its sequence entropy is uniformly bounded for all sequences, then it has only finitely many ergodic measures and is an almost finite-to-one extension of its maximal equicontinuous factor. This result is obtained as an application of a general criterion which states that if a minimal system is an almost finite-to-one extension of its maximal equicontinuous factor and has no infinite independent sets of length $k$ for some $k\ge 2$, then it has only finitely many ergodic measures.
Huaxin Lin (2007)
Let $X$ be a compact metric space and let $\Lambda$ be a $Z^k$ ($k\ge 1$) action on $X$. We give a solution to a version of Voiculescu's problem of AF-embedding: the crossed product $C(X)\rtimes_{\Lambda}Z^k$ can be embedded into a unital simple AF-algebra if and only if $X$ admits a strictly positive $\Lambda$-invariant Borel probability measure. Let $C$ be a unital AH-algebra, let $G$ be a finitely generated abelian group and let $\Lambda\colon G\to \mathrm{Aut}(C)$ be a monomorphism. We show that $C\rtimes_{\Lambda} G$ can be embedded into a unital simple AF-algebra if and only if $C$ admits a faithful $\Lambda$-invariant tracial state.
We show that every finitely generated conical refinement monoid can be represented as the monoid $\mathcal{V}(R)$ of isomorphism classes of finitely generated projective modules over a von Neumann regular ring $R$. To this end, we use the representation of these monoids provided by adaptable separated graphs. Given an adaptable separated graph $(E, C)$ and a field $K$, we build a von Neumann regular $K$-algebra $Q_K(E, C)$ and show that there is a natural isomorphism between the separated graph monoid $M(E, C)$ and the monoid $\mathcal{V}(Q_K(E, C))$.
The study of universal approximation of arbitrary functions $f\colon \mathcal{X} \to \mathcal{Y}$ by neural networks has a rich and thorough history dating back to Kolmogorov (1957). In the case of learning finite-dimensional maps, many authors have shown various forms of the universality of both fixed-depth and fixed-width neural networks. However, in many cases, these classical results fail to extend to the recent use of approximations of neural networks with infinitely many units for functional data analysis, dynamical systems identification, and other applications where either $\mathcal{X}$ or $\mathcal{Y}$ becomes infinite dimensional. Two questions naturally arise: which infinite-dimensional analogues of neural networks are sufficient to approximate any map $f\colon \mathcal{X} \to \mathcal{Y}$, and when do the finite approximations to these analogues used in practice approximate $f$ uniformly over its infinite-dimensional domain $\mathcal{X}$? In this paper, we answer the open question of universal approximation of nonlinear operators when $\mathcal{X}$ and $\mathcal{Y}$ are both infinite dimensional. We show that for a large class of different infinite analogues of neural networks, any continuous map can be approximated arbitrarily closely with some mild topological conditions on $\mathcal{X}$. Additionally, we provide the first lower bound on the minimal number of input and output units required by a finite approximation to an infinite neural network to guarantee that it can uniformly approximate any nonlinear operator using samples from its inputs and outputs.
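The idea of approximating an operator between function spaces from finitely many input and output samples can be illustrated with a toy sketch. The operator below (the antiderivative) is linear and the fit is plain least squares, not the neural-network construction of the paper; the grid sizes and training distribution are illustrative assumptions.

```python
import numpy as np

m, n = 32, 32                       # number of input / output sample points
x_in = np.linspace(0, 1, m)
x_out = np.linspace(0, 1, n)

def make_pair(a, b):
    """A random input function and its image under the target operator
    F(f)(x) = integral of f from 0 to x, both sampled on finite grids."""
    f = a * np.sin(2 * np.pi * x_in) + b * np.cos(2 * np.pi * x_in)
    F = (a * (1 - np.cos(2 * np.pi * x_out))
         + b * np.sin(2 * np.pi * x_out)) / (2 * np.pi)
    return f, F

rng = np.random.default_rng(0)
coeffs = rng.standard_normal((200, 2))
X = np.array([make_pair(a, b)[0] for a, b in coeffs])  # sampled inputs
Y = np.array([make_pair(a, b)[1] for a, b in coeffs])  # sampled outputs

# least-squares fit of a finite map between the sampled function spaces
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# evaluate on an unseen input drawn from the same family
f_test, F_test = make_pair(0.7, -0.3)
err = np.abs(f_test @ W - F_test).max()
```

Because the test input lies in the span of the training family, the fitted finite map recovers the operator on that family essentially exactly; the paper's question is how many such input and output units are needed to achieve uniform approximation for genuinely nonlinear operators.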
