
Randomness on Computable Probability Spaces - A Dynamical Point of View

Published by: Publications Loria
Publication date: 2009
Language: English





We extend the notion of randomness (in the version introduced by Schnorr) to computable probability spaces and compare it to a dynamical notion of randomness: typicality. Roughly, a point is typical for some dynamics if it follows the statistical behavior of the system (Birkhoff's pointwise ergodic theorem). We prove that a point is Schnorr random if and only if it is typical for every mixing computable dynamics. To prove the result we develop some tools for the theory of computable probability spaces (for example, morphisms) that are expected to have other applications.
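For orientation, the notion of typicality referred to above is the one coming from Birkhoff's pointwise ergodic theorem; in its standard form (stated here only for context, with the exact class of admissible observables $f$ being an assumption of this sketch rather than the paper's definition), a point $x$ is typical for a measure-preserving dynamics $T$ on a probability space $(X,\mu)$ when

$$\lim_{n\to\infty} \frac{1}{n}\sum_{i=0}^{n-1} f(T^i x) = \int_X f \, d\mu$$

for every observable $f$ in the relevant class, i.e. the time averages along the orbit of $x$ agree with the space averages.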




Read also

Zhiqi Bu, Shiyun Xu, Kan Chen (2020)
When equipped with efficient optimization algorithms, over-parameterized neural networks have demonstrated a high level of performance even though the loss function is non-convex and non-smooth. While many works have focused on understanding the loss dynamics of neural networks trained with gradient descent (GD), in this work we consider a broad class of optimization algorithms that are commonly used in practice. For example, we show from a dynamical-systems perspective that the Heavy Ball (HB) method can converge to the global minimum of the mean squared error (MSE) at a linear rate (similar to GD), whereas Nesterov accelerated gradient descent (NAG) may only converge to the global minimum sublinearly. Our results rely on the connection between the neural tangent kernel (NTK) and finite over-parameterized neural networks with ReLU activation, which leads to analyzing the limiting ordinary differential equations (ODEs) of the optimization algorithms. We show that optimizing the non-convex loss over the weights corresponds to optimizing a strongly convex loss over the prediction error. As a consequence, we can leverage classical convex optimization theory to understand the convergence behavior of neural networks. We believe our approach can also be extended to other optimization algorithms and network architectures.
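For reference, the two methods compared above are usually written in their standard textbook discrete form (the parameterization actually analyzed in the paper may differ), with step size $\eta$ and momentum coefficients $\beta$, $\beta_k$:

$$\text{HB:}\quad w_{k+1} = w_k - \eta \nabla L(w_k) + \beta\,(w_k - w_{k-1}),$$
$$\text{NAG:}\quad v_k = w_k + \beta_k\,(w_k - w_{k-1}),\qquad w_{k+1} = v_k - \eta \nabla L(v_k),$$

whose continuous-time limits give the limiting ODEs mentioned in the abstract.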
We investigate computable metrizability of Polish spaces up to homeomorphism. In this paper we focus on Stone spaces. We use Stone duality to construct the first known example of a computable topological Polish space not homeomorphic to any computably metrized space. In fact, in our proof we construct a right-c.e. metrized Stone space which is not homeomorphic to any computably metrized space. Then we introduce a new notion of effective categoricity for effectively compact spaces and prove that effectively categorical Stone spaces are exactly the duals of computably categorical Boolean algebras. Finally, we prove that, for a Stone space $X$, the Banach space $C(X;\mathbb{R})$ has a computable presentation if, and only if, $X$ is homeomorphic to a computably metrized space. This gives an unexpected positive partial answer to a question recently posed by McNicholl.
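As a reminder of the duality invoked above (standard Stone duality, not a construction specific to this paper): a Boolean algebra $B$ corresponds to its Stone space $\mathrm{St}(B)$ of ultrafilters, topologized by the basic clopen sets $[b] = \{U \in \mathrm{St}(B) : b \in U\}$, and $B$ is recovered, up to isomorphism, as the algebra of clopen subsets of $\mathrm{St}(B)$; $C(X;\mathbb{R})$ denotes the Banach space of continuous real-valued functions on $X$ under the supremum norm.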
In this paper we investigate algorithmic randomness on more general spaces than the Cantor space, namely computable metric spaces. To do this, we first develop a unified framework allowing computations with probability measures. We show that any computable metric space with a computable probability measure is isomorphic to the Cantor space in a computable and measure-theoretic sense. We show that any computable metric space admits a universal uniform randomness test (without further assumption).
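One common way such tests are formulated (stated here only as background; the paper's precise uniform notion may differ) is as integral tests: given a computable probability measure $\mu$ on the space, a test is a lower semicomputable function $t : X \to [0,+\infty]$ with $\int t \, d\mu \le 1$, a point $x$ is declared random when $t(x) < \infty$ for a universal such $t$, and uniformity means that $t$ additionally takes the measure $\mu$ itself as an input in a computable way.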
We define and study a probability monad on the category of complete metric spaces and short maps. It assigns to each space the space of Radon probability measures on it with finite first moment, equipped with the Kantorovich-Wasserstein distance. This monad is analogous to the Giry monad on the category of Polish spaces, and it extends a construction due to van Breugel for compact and for 1-bounded complete metric spaces. We prove that this Kantorovich monad arises from a colimit construction on finite power-like constructions, which formalizes the intuition that probability measures are limits of finite samples. The proof relies on a criterion for when an ordinary left Kan extension of lax monoidal functors is a monoidal Kan extension. The colimit characterization allows the development of integration theory and the treatment of measures on spaces of measures, without measure theory. We also show that the category of algebras of the Kantorovich monad is equivalent to the category of closed convex subsets of Banach spaces with short affine maps as morphisms.
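For concreteness, on probability measures with finite first moment the Kantorovich-Wasserstein distance mentioned above takes the standard form

$$W_1(\mu,\nu) = \inf_{\gamma \in \Gamma(\mu,\nu)} \int_{X \times X} d(x,y)\, d\gamma(x,y),$$

where $\Gamma(\mu,\nu)$ is the set of couplings of $\mu$ and $\nu$; by Kantorovich-Rubinstein duality this equals $\sup_f \big( \int f\, d\mu - \int f\, d\nu \big)$ over all short (1-Lipschitz) maps $f : X \to \mathbb{R}$ (a standard fact, recalled here for context rather than taken from the paper).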
We present here a new and universal approach for the study of random and/or trees, unifying in one framework many different models, including some novel ones not yet understood in the literature. An and/or tree is a Boolean expression represented in (one of) its tree shapes. Fix an integer $k$, take a sequence of random (rooted) trees of increasing size, say $(t_n)_{n\ge 1}$, and label each of these random trees uniformly at random in order to get a random Boolean expression on $k$ variables. We prove that, under rather weak local conditions on the sequence of random trees $(t_n)_{n\ge 1}$, the distribution induced on Boolean functions by this procedure converges as $n$ tends to infinity. In particular, we characterise two different behaviours of this limit distribution depending on the shape of the local limit of $(t_n)_{n\ge 1}$: a degenerate case when the local limit has no leaves, and a non-degenerate case, which we are able to describe in more detail under stronger conditions. In this latter case, we provide a relationship between the probability of a given Boolean function and its complexity. The examples covered by this unified framework include trees that interpolate between models with logarithmic typical distances (such as random binary search trees) and others with square-root typical distances (such as conditioned Galton--Watson trees).
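As a purely illustrative (hypothetical) instance of the construction described above, take $k = 2$ and a rooted binary tree with three leaves; labelling the internal nodes with connectives from $\{\wedge, \vee\}$ and the leaves with literals might yield the expression

$$(x_1 \wedge \bar{x}_2) \vee x_2,$$

which represents one Boolean function of the two variables. The results above describe the limiting distribution of the function obtained when both the tree $t_n$ and its labels are drawn at random.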