In this work we investigate a variant of the online kernelized ridge regression algorithm in the setting of $d$-dimensional adversarial nonparametric regression. We derive regret upper bounds over the classes of Sobolev spaces $W_{p}^{\beta}(\mathcal{X})$, $p \geq 2$, $\beta > \frac{d}{p}$. The upper bounds are supported by a minimax regret analysis, which reveals that in the cases $\beta > \frac{d}{2}$ or $p = \infty$ these rates are (essentially) optimal. Finally, we compare the performance of the kernelized ridge regression forecaster to known nonparametric forecasters, both in terms of regret rates and computational complexity, as well as to the excess risk rates achievable in the setting of statistical (i.i.d.) nonparametric regression.
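For concreteness, at each round $t$ the kernelized ridge regression forecaster predicts with the regularized least-squares fit over the revealed history, $\hat{f}_t = \arg\min_{f \in \mathcal{H}} \sum_{s < t} (f(x_s) - y_s)^2 + \lambda \|f\|_{\mathcal{H}}^2$, where $\mathcal{H}$ is the reproducing kernel Hilbert space of the chosen kernel. The Python sketch below is a minimal illustration of this update; the Gaussian kernel, the bandwidth sigma, and the regularization lam are illustrative defaults, not the tuned choices behind the stated regret bounds.

import numpy as np

def gaussian_kernel(X, Z, sigma=1.0):
    # Pairwise Gaussian kernel matrix between the rows of X and Z.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

class OnlineKRR:
    # Online kernelized ridge regression: predict with the ridge
    # solution fitted on all previously revealed pairs (x_s, y_s).
    def __init__(self, lam=1.0, sigma=1.0):
        self.lam, self.sigma = lam, sigma
        self.xs, self.ys = [], []

    def predict(self, x):
        if not self.xs:
            return 0.0  # no history yet: predict the zero function
        X, y = np.array(self.xs), np.array(self.ys)
        K = gaussian_kernel(X, X, self.sigma)
        k = gaussian_kernel(X, x[None, :], self.sigma).ravel()
        alpha = np.linalg.solve(K + self.lam * np.eye(len(y)), y)
        return float(k @ alpha)

    def update(self, x, y):
        self.xs.append(x)
        self.ys.append(y)

Re-solving the linear system from scratch costs $O(t^3)$ at round $t$; maintaining the inverse through rank-one updates brings the per-round cost down to $O(t^2)$, which is the kind of trade-off the computational comparison above is concerned with.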
The goal of regression is to recover an unknown underlying function that best links a set of predictors to an outcome from noisy observations. In nonparametric regression, one assumes that the regression function belongs to a pre-specified infinite-dimensional class of functions.
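In symbols, the standard model (stated here in its most common form; the precise assumptions vary across the works discussed below) is
$$Y_i = f(X_i) + \varepsilon_i, \qquad i = 1, \ldots, n, \qquad f \in \mathcal{F},$$
where $\mathcal{F}$ is an infinite-dimensional function class, such as a Sobolev or Hölder ball, the $\varepsilon_i$ are noise variables, and the goal is to recover $f$ from the observed pairs $(X_i, Y_i)$.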
In this paper we revisit the classical problem of nonparametric regression, but impose local differential privacy constraints. Under such constraints, the raw data $(X_1,Y_1),\ldots,(X_n,Y_n)$, taking values in $\mathbb{R}^d \times \mathbb{R}$, cannot be observed directly; only privatized versions of the samples are available.
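For the reader's convenience (the notation here is generic, not necessarily that of the cited work), a randomized mechanism $Q$ that releases $Z_i$ in place of $(X_i, Y_i)$ satisfies $\alpha$-local differential privacy if
$$\sup_{S} \frac{Q(Z_i \in S \mid (X_i, Y_i) = (x, y))}{Q(Z_i \in S \mid (X_i, Y_i) = (x', y'))} \leq e^{\alpha} \qquad \text{for all pairs } (x, y), (x', y'),$$
so every sample is privatized locally before any estimator gets to see it.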
In this paper, we study the properties of robust nonparametric estimation using deep neural networks for regression models with heavy-tailed error distributions. We establish non-asymptotic error bounds for a class of robust nonparametric regression estimators.
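A canonical example of the robust losses employed in this line of work (whether it is the specific loss adopted in the paper is an assumption on our part) is the Huber loss with threshold $\delta > 0$,
$$\ell_{\delta}(u) = \begin{cases} \frac{1}{2} u^{2}, & |u| \leq \delta, \\ \delta |u| - \frac{1}{2} \delta^{2}, & |u| > \delta, \end{cases}$$
which grows only linearly in the tails and thereby limits the influence of heavy-tailed errors on the fitted network.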
Distributed computing offers a high degree of flexibility to accommodate modern learning constraints and the ever-increasing size of the datasets involved in massive data problems. Drawing inspiration from the theory of distributed computation models developed …
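The snippet below sketches the simplest instance of this paradigm, divide-and-conquer averaging: split the sample across m machines, fit a local estimator on each shard, and average the local predictions. Whether this is the exact scheme analyzed in the paper is an assumption; the closed-form ridge fit and the even data split are illustrative choices.

import numpy as np

def local_ridge_fit(X, y, lam=1.0):
    # Closed-form ridge regression on a single machine's shard.
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def distributed_predict(shards, x, lam=1.0):
    # Average the predictions of the m locally fitted models.
    preds = [x @ local_ridge_fit(X, y, lam) for X, y in shards]
    return float(np.mean(preds))

# Usage: split an (n, d) dataset into m = 4 shards and query one point.
rng = np.random.default_rng(0)
X, y = rng.normal(size=(1000, 5)), rng.normal(size=1000)
shards = list(zip(np.array_split(X, 4), np.array_split(y, 4)))
print(distributed_predict(shards, X[0]))

Averaging keeps communication to one vector per machine; the statistical question such works address is when this one-shot aggregation matches the accuracy of an estimator fitted on the pooled data.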
We derive generalization error bounds for traditional time-series forecasting models. Our results hold for many standard forecasting tools including autoregressive models, moving average models, and, more generally, linear state-space models. These n…
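To see why autoregressive models fall under the linear state-space umbrella (a standard fact, recalled here for convenience), an AR(p) process $y_t = \sum_{j=1}^{p} a_j y_{t-j} + \varepsilon_t$ admits the companion-form representation
$$\begin{pmatrix} y_t \\ y_{t-1} \\ \vdots \\ y_{t-p+1} \end{pmatrix} = \begin{pmatrix} a_1 & a_2 & \cdots & a_{p-1} & a_p \\ 1 & 0 & \cdots & 0 & 0 \\ \vdots & & \ddots & & \vdots \\ 0 & 0 & \cdots & 1 & 0 \end{pmatrix} \begin{pmatrix} y_{t-1} \\ y_{t-2} \\ \vdots \\ y_{t-p} \end{pmatrix} + \begin{pmatrix} \varepsilon_t \\ 0 \\ \vdots \\ 0 \end{pmatrix},$$
so any bound stated for linear state-space models covers AR(p), and a similar embedding handles moving average and ARMA models.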