The family of pairwise independently determined (PID) systems, i.e. those for which the independent joining is the only self-joining with independent 2-marginals, is a class of systems for which the long-standing open question of Rokhlin, whether mixing implies mixing of all orders, has a positive answer. We show that within the class of weakly mixing PID systems one also finds a positive answer to another long-standing open problem, namely whether the multiple ergodic averages \begin{equation*} \frac{1}{N}\sum_{n=0}^{N-1}f_1(T^n x)\cdots f_d(T^{dn} x), \quad N\to \infty, \end{equation*} converge almost surely.
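As a concrete, hedged illustration of the averages above, the following Python sketch estimates $\frac{1}{N}\sum_{n=0}^{N-1} f_1(T^n x)\cdots f_d(T^{dn} x)$ for a circle rotation, chosen only because $T^{jn}x$ has a closed form; a rotation is not weakly mixing and is not one of the PID systems of the abstract, and the function name `multiple_ergodic_average` is ours.

```python
import numpy as np

def multiple_ergodic_average(x0, alpha, fs, N):
    """Estimate (1/N) * sum_{n=0}^{N-1} f_1(T^n x) * ... * f_d(T^{dn} x)
    for the circle rotation T x = x + alpha (mod 1).
    fs is a list [f_1, ..., f_d] of observables on [0, 1)."""
    total = 0.0
    for n in range(N):
        prod = 1.0
        for j, f in enumerate(fs, start=1):
            # For the rotation, T^{jn} x = x + j*n*alpha (mod 1)
            prod *= f((x0 + j * n * alpha) % 1.0)
        total += prod
    return total / N

# Toy example with two observables along an irrational rotation
alpha = np.sqrt(2) - 1
fs = [lambda x: np.cos(2 * np.pi * x), lambda x: np.sin(2 * np.pi * x)]
for N in (10**3, 10**4, 10**5):
    print(N, multiple_ergodic_average(0.1, alpha, fs, N))
```

Increasing N in the loop shows the empirical averages settling toward a limit, which is the kind of almost sure convergence the abstract addresses (here in a much simpler, non weakly mixing setting).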
We introduce random towers to study almost sure rates of correlation decay for random partially hyperbolic attractors. Using this framework, we obtain abstract results on almost sure exponential, stretched exponential, and polynomial correlation decay.
In this paper it is shown that every non-periodic ergodic system has two topologically weakly mixing, fully supported models: one is non-minimal but has a dense set of minimal points, and the other is proximal. Also, of independent interest, for
Let $(X, T)$ be a weakly mixing minimal system, $p_1, \cdots, p_d$ be integer-valued generalized polynomials and $(p_1, p_2, \cdots, p_d)$ be non-degenerate. Then there exists a residual subset $X_0$ of $X$ such that for all $x\in X_0$ $$\{ (T^{p_1(n)}x, \cd
We obtain estimates on the uniform convergence rate of the Birkhoff average of a continuous observable over torus translations and affine skew product toral transformations. The convergence rate depends explicitly on the modulus of continuity of the
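As a hedged numerical companion to this abstract, the sketch below approximates the uniform error of the Birkhoff average of a Lipschitz observable over a torus translation; the golden-ratio frequency, the sampled grid of starting points, and the name `birkhoff_average` are our illustrative choices, not the paper's setup.

```python
import numpy as np

def birkhoff_average(x0, alpha, f, N):
    """(1/N) * sum_{n=0}^{N-1} f(x0 + n*alpha mod 1) for the torus translation."""
    orbit = (x0 + alpha * np.arange(N)) % 1.0
    return np.mean(f(orbit))

# Zero-mean Lipschitz observable; the error should shrink with N at a rate
# governed by the modulus of continuity of f and the arithmetic of alpha.
alpha = (np.sqrt(5) - 1) / 2          # golden-ratio translation (badly approximable)
f = lambda x: np.cos(2 * np.pi * x)   # space average is 0
for N in (10**2, 10**3, 10**4, 10**5):
    errs = [abs(birkhoff_average(x0, alpha, f, N))
            for x0 in np.linspace(0, 1, 50, endpoint=False)]
    print(N, max(errs))  # sup over sampled starting points approximates the uniform error
```

Printing the maximal error over a grid of starting points gives a crude empirical proxy for the uniform convergence rate the abstract quantifies.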
We investigate the convergence and convergence rate of stochastic training algorithms for Neural Networks (NNs) that, over the years, have spawned from Dropout (Hinton et al., 2012). Modeling that neurons in the brain may not fire, dropout algorithms
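For readers unfamiliar with the mechanism being generalized, here is a minimal, hedged NumPy sketch of inverted dropout in the spirit of Hinton et al. (2012): each unit is randomly silenced during training and the surviving activations are rescaled so expectations are unchanged at evaluation time. This is only a toy forward pass, not any of the specific algorithms analyzed in the paper, and `dropout_forward` is a name we chose.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_forward(activations, p_drop=0.5, train=True):
    """Inverted dropout: keep each unit with probability 1 - p_drop and
    rescale so the expected activation matches the deterministic test pass."""
    if not train or p_drop == 0.0:
        return activations
    mask = rng.random(activations.shape) >= p_drop  # Bernoulli "does this neuron fire?"
    return activations * mask / (1.0 - p_drop)

h = np.array([0.2, -1.3, 0.7, 2.1])
print(dropout_forward(h, p_drop=0.5))   # stochastic forward pass during training
print(dropout_forward(h, train=False))  # deterministic pass at evaluation time
```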