
Almost sure convergence of the multiple ergodic average for certain weakly mixing systems

Posted by: Song Shao
Publication date: 2016
Research field:
Language: English





The family of pairwise independently determined (PID) systems, i.e. those for which the independent joining is the only self-joining with independent 2-marginals, is a class of systems for which the long-standing open question of Rokhlin, whether mixing implies mixing of all orders, has a positive answer. We show that within the class of weakly mixing PID systems one also finds a positive answer to another long-standing open problem: whether the multiple ergodic averages
\begin{equation*}
\frac{1}{N}\sum_{n=0}^{N-1} f_1(T^n x)\cdots f_d(T^{dn} x), \quad N \to \infty,
\end{equation*}
converge almost surely.
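To make the averages concrete, here is a minimal numerical sketch in Python. It evaluates a length-$d=2$ multiple ergodic average along an orbit of Arnold's cat map on the 2-torus; the map, the observables `f1` and `f2`, the starting point, and the helper `multiple_ergodic_average` are illustrative choices for this sketch only (the cat map is a standard mixing example, not claimed to be PID, and the floating-point orbit only approximates the true one).

```python
# Illustrative sketch only: numerically evaluates the multiple ergodic average
# (1/N) * sum_{n<N} f_1(T^n x) * f_2(T^{2n} x) for a toy system.  It says nothing
# about the PID class of the paper; it just shows the quantity being averaged.
import numpy as np

def cat_map(p):
    """One step of Arnold's cat map on the 2-torus: (x, y) -> (2x + y, x + y) mod 1."""
    x, y = p
    return np.array([(2.0 * x + y) % 1.0, (x + y) % 1.0])

def multiple_ergodic_average(x0, fs, N, step=cat_map):
    """(1/N) * sum_{n=0}^{N-1} f_1(T^n x) * f_2(T^{2n} x) * ... * f_d(T^{dn} x)."""
    d = len(fs)
    # Precompute the orbit x, Tx, ..., T^{d(N-1)}x once and reuse it below.
    orbit = [np.asarray(x0, dtype=float)]
    for _ in range(d * (N - 1)):
        orbit.append(step(orbit[-1]))
    total = 0.0
    for n in range(N):
        prod = 1.0
        for j, f in enumerate(fs, start=1):
            prod *= f(orbit[j * n])
        total += prod
    return total / N

# Two zero-mean continuous observables on the torus.
f1 = lambda p: np.cos(2.0 * np.pi * p[0])
f2 = lambda p: np.sin(2.0 * np.pi * (p[0] + p[1]))

x0 = (np.sqrt(2.0) - 1.0, np.sqrt(3.0) - 1.0)   # an arbitrary starting point
for N in (100, 1000, 10000):
    print(N, multiple_ergodic_average(x0, [f1, f2], N))
```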


Read also

We introduce random towers to study almost sure rates of correlation decay for random partially hyperbolic attractors. Using this framework, we obtain abstract results on almost sure exponential, stretched exponential and polynomial correlation decay rates. We then apply our results to small random perturbations of Axiom A attractors, to small perturbations of derived-from-Anosov partially hyperbolic systems, and to solenoidal attractors with random intermittency.
In this paper it is shown that every non-periodic ergodic system has two topologically weakly mixing, fully supported models: one is non-minimal but has a dense set of minimal points, and the other is proximal. Also, of independent interest, for a given Kakutani-Rokhlin tower with relatively prime column heights, it is demonstrated how to obtain a new, taller Kakutani-Rokhlin tower with the same property, which can be used in Weiss's proof of the Jewett-Krieger theorem and in the proofs of our theorems. Applications of the results are given.
Let $(X, T)$ be a weakly mixing minimal system, $p_1, \cdots, p_d$ be integer-valued generalized polynomials and $(p_1, p_2, \cdots, p_d)$ be non-degenerate. Then there exists a residual subset $X_0$ of $X$ such that for all $x \in X_0$,
$$\{ (T^{p_1(n)}x, \cdots, T^{p_d(n)}x) : n \in \mathbb{Z} \}$$
is dense in $X^d$.
We obtain estimates on the uniform convergence rate of the Birkhoff average of a continuous observable over torus translations and affine skew product toral transformations. The convergence rate depends explicitly on the modulus of continuity of the observable and on the arithmetic properties of the frequency defining the transformation. Furthermore, we show that for the one-dimensional torus translation, these estimates are nearly optimal.
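As an illustration of the averaged quantity (not of the estimates of that paper), the sketch below computes the Birkhoff average of a continuous observable along a one-dimensional torus translation and compares it with the space average; the observable `f`, the frequency `alpha`, and the starting point are arbitrary choices made for this example.

```python
# Minimal sketch: Birkhoff average (1/N) * sum_{n<N} f(x + n*alpha mod 1) for a
# one-dimensional torus translation, compared with the space average of f.
import numpy as np

def birkhoff_average(f, x, alpha, N):
    """Time average of f along the orbit of the translation x -> x + alpha (mod 1)."""
    n = np.arange(N)
    return np.mean(f((x + n * alpha) % 1.0))

f = lambda t: np.cos(2.0 * np.pi * t) ** 2      # a continuous observable on the circle
space_average = 0.5                             # integral of f over [0, 1)

alpha = (np.sqrt(5.0) - 1.0) / 2.0              # golden-mean frequency (badly approximable)
for N in (10**2, 10**4, 10**6):
    err = abs(birkhoff_average(f, 0.3, alpha, N) - space_average)
    print(f"N = {N:>8}: |time average - space average| = {err:.2e}")
```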
We investigate the convergence and convergence rate of stochastic training algorithms for Neural Networks (NNs) that, over the years, have spawned from Dropout (Hinton et al., 2012). Modeling the fact that neurons in the brain may not fire, dropout algorithms consist in practice of multiplying the weight matrices of a NN component-wise by independently drawn random matrices with $\{0,1\}$-valued entries during each iteration of the Feedforward-Backpropagation algorithm. This paper presents a probability-theoretical proof that for any NN topology and differentiable polynomially bounded activation functions, if we project the NN's weights onto a compact set and use a dropout algorithm, then the weights converge to a unique stationary set of a projected system of Ordinary Differential Equations (ODEs). We also establish an upper bound on the rate of convergence of Gradient Descent (GD) on the limiting ODEs of dropout algorithms for arborescences (a class of trees) of arbitrary depth and with linear activation functions.
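The component-wise masking described in the last abstract can be illustrated with a toy example. The sketch below trains a single linear layer by gradient descent while multiplying its weight matrix entry-wise by a fresh $\{0,1\}$-valued Bernoulli mask at every iteration; the synthetic data, layer shape, `keep_prob`, and learning rate are arbitrary, and this is only a sketch of the masking step, not the paper's projected-ODE analysis.

```python
# Toy dropout sketch: one linear layer, squared loss, fresh {0,1} weight mask per step.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear-regression data and a single weight matrix W (linear activation).
X = rng.normal(size=(256, 8))
Y = X @ rng.normal(size=(8, 3))

W = np.zeros((8, 3))
keep_prob, lr = 0.8, 0.05

for step in range(3000):
    # Dropout step: multiply the weights component-wise by a {0,1}-valued random mask.
    mask = rng.binomial(1, keep_prob, size=W.shape).astype(float)
    W_masked = W * mask
    # Feedforward and backpropagation of the squared loss through the mask.
    residual = X @ W_masked - Y
    W -= lr * (X.T @ residual / len(X)) * mask
    if step % 1000 == 999:
        mse = np.mean((X @ (keep_prob * W) - Y) ** 2)   # loss of the rescaled weights
        print(f"step {step + 1}: mse with keep_prob-scaled weights = {mse:.4f}")
```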