In a regression setting with response vector $\mathbf{y} \in \mathbb{R}^n$ and given regressor vectors $\mathbf{x}_1, \ldots, \mathbf{x}_p \in \mathbb{R}^n$, a typical question is to what extent $\mathbf{y}$ is related to these regressor vectors, specifically, how well $\mathbf{y}$ can be approximated by a linear combination of them. Classical methods for this question are based on statistical models for the conditional distribution of $\mathbf{y}$, given the regressor vectors $\mathbf{x}_j$. Davies and Duembgen (2020) proposed a model-free approach in which all observation vectors $\mathbf{y}$ and $\mathbf{x}_j$ are viewed as fixed, and the quality of the least squares fit of $\mathbf{y}$ is quantified by comparing it with the least squares fit resulting from $p$ independent white noise regressor vectors. The purpose of the present note is to explain, in a general context, why the model-based and model-free approaches yield the same p-values, although the interpretation of those p-values differs between the two paradigms.
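A minimal Monte Carlo sketch of the comparison described above, assuming numpy: the function `model_free_pvalue` (a hypothetical name) measures how often $p$ independent white-noise regressor vectors fit $\mathbf{y}$ at least as well, in terms of $R^2$, as the observed regressors do. Davies and Duembgen (2020) work with an exact reference distribution rather than simulation, so this is only an illustration of the idea, not their procedure.

```python
import numpy as np

def model_free_pvalue(y, X, n_sim=10_000, seed=0):
    """Share of white-noise regressor matrices that fit y at least as well
    (in terms of R^2) as the observed regressor matrix X does."""
    rng = np.random.default_rng(seed)
    n, p = X.shape

    def r_squared(y, Z):
        # R^2 of the least squares fit of y on the columns of Z (plus intercept)
        Z1 = np.column_stack([np.ones(n), Z])
        beta, *_ = np.linalg.lstsq(Z1, y, rcond=None)
        resid = y - Z1 @ beta
        return 1.0 - resid @ resid / np.sum((y - y.mean()) ** 2)

    r2_obs = r_squared(y, X)
    r2_noise = np.array([r_squared(y, rng.standard_normal((n, p)))
                         for _ in range(n_sim)])
    # Monte Carlo p-value: probability that pure noise explains y at least as well
    return (1 + np.sum(r2_noise >= r2_obs)) / (n_sim + 1)
```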
We reexamine the classical linear regression model when the model is subject to two types of uncertainty: (i) some of the covariates are either missing or completely inaccessible, and (ii) the variance of the measurement error is undetermined and changing
We study the performance of the Least Squares Estimator (LSE) in a general nonparametric regression model, when the errors are independent of the covariates but may only have a $p$-th moment ($p \geq 1$). In such a heavy-tailed regression setting, we s
The asymptotic optimality (a.o.) of various hyper-parameter estimators with different optimality criteria has been studied in the literature for regularized least squares regression problems. The estimators include, e.g., the maximum (marginal) likelihood
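One concrete member of that family is the maximum marginal likelihood estimator for ridge-type regularized least squares. The sketch below assumes the Gaussian model $\beta \sim N(0, \tau^2 I)$, $y \mid \beta \sim N(X\beta, \sigma^2 I)$ (my parametrization, not necessarily the one used in the paper); it maximizes the marginal likelihood of $y$ and returns the implied regularization parameter $\lambda = \sigma^2/\tau^2$.

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_marginal_likelihood(log_params, X, y):
    """Negative log marginal likelihood of y under
    beta ~ N(0, tau2 * I), y | beta ~ N(X beta, sigma2 * I),
    so marginally y ~ N(0, tau2 * X X^T + sigma2 * I)."""
    tau2, sigma2 = np.exp(log_params)          # optimize on the log scale
    n = len(y)
    K = tau2 * (X @ X.T) + sigma2 * np.eye(n)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return 0.5 * (y @ alpha) + np.sum(np.log(np.diag(L))) + 0.5 * n * np.log(2 * np.pi)

def fit_hyperparameters(X, y):
    """Maximum marginal likelihood estimates of (tau2, sigma2);
    the implied ridge parameter is lambda = sigma2 / tau2."""
    res = minimize(neg_log_marginal_likelihood, x0=np.zeros(2),
                   args=(X, y), method="L-BFGS-B")
    tau2, sigma2 = np.exp(res.x)
    return tau2, sigma2, sigma2 / tau2
```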
We study the parameter estimation problem for the Vasicek model driven by a sub-fractional Brownian motion from discrete observations. Let $\{S_t^H, t \geq 0\}$ denote a sub-fractional Brownian motion with Hurst parameter $1/2 < H < 1$. Our study proceeds as follows
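A self-contained illustration of the setup, assuming numpy: it simulates a sub-fractional Brownian motion from its covariance function $C(s,t) = s^{2H} + t^{2H} - \tfrac{1}{2}\big[(s+t)^{2H} + |t-s|^{2H}\big]$, drives a Vasicek-type process $dX_t = \theta(\mu - X_t)\,dt + \sigma\,dS_t^H$ with it, and recovers the drift parameters by a simple least squares regression of increments on levels. The parametrization and the drift estimator are my assumptions for illustration; they need not coincide with the estimators studied in the paper.

```python
import numpy as np

def simulate_subfbm(n, T, H, rng):
    """Sub-fractional Brownian motion on a grid via Cholesky of its covariance
    C(s,t) = s^{2H} + t^{2H} - ((s+t)^{2H} + |t-s|^{2H}) / 2."""
    t = np.linspace(T / n, T, n)
    s, u = np.meshgrid(t, t, indexing="ij")
    C = s**(2*H) + u**(2*H) - 0.5 * ((s + u)**(2*H) + np.abs(s - u)**(2*H))
    L = np.linalg.cholesky(C + 1e-12 * np.eye(n))   # jitter for numerical stability
    return np.concatenate([[0.0], L @ rng.standard_normal(n)])

def simulate_vasicek(theta, mu, sigma, X0, n, T, H, rng):
    """Euler scheme for dX_t = theta * (mu - X_t) dt + sigma dS_t^H."""
    dt = T / n
    S = simulate_subfbm(n, T, H, rng)
    X = np.empty(n + 1)
    X[0] = X0
    for i in range(n):
        X[i + 1] = X[i] + theta * (mu - X[i]) * dt + sigma * (S[i + 1] - S[i])
    return X

def drift_least_squares(X, dt):
    """Regress increments on (1, X_i): dX ~ a + b*X, so theta = -b/dt, mu = -a/b."""
    dX = np.diff(X)
    A = np.column_stack([np.ones(len(dX)), X[:-1]])
    (a, b), *_ = np.linalg.lstsq(A, dX, rcond=None)
    return -b / dt, -a / b
```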
We study the problem of exact support recovery based on noisy observations and present Refined Least Squares (RLS). Given a set of noisy measurements $$\mathbf{y} = \mathbf{X}\boldsymbol{\theta}^* + \boldsymbol{\omega},$$ and $\mathbf{X} \in \mathbb{R}^{N \times D}$ which
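Since the abstract is cut off before RLS itself is described, the sketch below is only a naive baseline under the same observation model $\mathbf{y} = \mathbf{X}\boldsymbol{\theta}^* + \boldsymbol{\omega}$: fit ordinary least squares and keep the $k$ largest coefficients in absolute value. It is not the RLS procedure; the function name and the toy data are mine.

```python
import numpy as np

def support_by_least_squares(y, X, k):
    """Naive support recovery baseline: ordinary least squares followed by
    keeping the k coefficients with the largest absolute value."""
    theta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    return set(np.argsort(np.abs(theta_hat))[-k:])

# Toy usage under the model y = X theta* + omega with a k-sparse theta*
rng = np.random.default_rng(0)
N, D, k = 200, 50, 5
X = rng.standard_normal((N, D))
theta_star = np.zeros(D)
support = rng.choice(D, size=k, replace=False)
theta_star[support] = 1.0
y = X @ theta_star + 0.1 * rng.standard_normal(N)
print(support_by_least_squares(y, X, k) == set(support))  # exact recovery check
```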