Linear regression methods impose strong constraints on the regression model, particularly on the error terms, which are assumed to be independent and normally distributed. These assumptions are not satisfied in many studies, and violating them can introduce non-negligible bias relative to the true model, undermining the credibility of the results.
In this paper we address the problem of estimating the regression function using the Nadaraya-Watson kernel and k-nearest-neighbor estimators as alternatives to parametric linear regression. Through a simulation study on a specified model, we compare these methods using the statistical programming language R, with the mean squared error (MSE) as the criterion for identifying the best estimator.
The results of the simulation study indicate that the nonparametric estimators represent the regression function more effectively and efficiently than the linear regression estimator, and that the two nonparametric estimators perform comparably.
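To illustrate the kind of comparison described above, here is a minimal Python sketch (the paper's study used R, and its specific model is not given here; the regression function, error distribution, bandwidth `h`, and neighbor count `k` below are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative simulated model: a nonlinear regression function with
# non-normal (uniform) errors, i.e. a setting where the linear-model
# assumptions fail.
n = 200
x = np.sort(rng.uniform(0.0, 1.0, n))
m_true = np.sin(2 * np.pi * x)              # true regression function
y = m_true + rng.uniform(-0.3, 0.3, n)      # errors are not Gaussian

def nadaraya_watson(x_train, y_train, x_eval, h):
    """Nadaraya-Watson estimator with a Gaussian kernel and bandwidth h."""
    u = (x_eval[:, None] - x_train[None, :]) / h
    w = np.exp(-0.5 * u ** 2)               # kernel weights
    return (w @ y_train) / w.sum(axis=1)

def knn_regression(x_train, y_train, x_eval, k):
    """k-nearest-neighbor estimator: average y over the k closest x's."""
    d = np.abs(x_eval[:, None] - x_train[None, :])
    idx = np.argsort(d, axis=1)[:, :k]
    return y_train[idx].mean(axis=1)

def linear_fit(x_train, y_train, x_eval):
    """Ordinary least-squares line, the parametric benchmark."""
    b1, b0 = np.polyfit(x_train, y_train, 1)
    return b0 + b1 * x_eval

# MSE of each estimate against the true regression function.
mse = lambda est: float(np.mean((est - m_true) ** 2))

mse_nw = mse(nadaraya_watson(x, y, x, h=0.05))
mse_knn = mse(knn_regression(x, y, x, k=15))
mse_lin = mse(linear_fit(x, y, x))
print(f"MSE  NW: {mse_nw:.4f}  kNN: {mse_knn:.4f}  linear: {mse_lin:.4f}")
```

On a nonlinear model such as this one, both nonparametric estimators achieve a much lower MSE than the straight-line fit, and their errors are of similar magnitude, mirroring the pattern reported in the abstract.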