
A note on optimal designs for estimating the slope of a polynomial regression

 Added by Holger Dette
 Publication date 2020
Research language: English





In this note we consider the optimal design problem for estimating the slope of a polynomial regression with no intercept at a given point, say $z$. In contrast to previous work, which considers symmetric design spaces, we investigate the model on the interval $[0, a]$ and characterize those values of $z$ for which an explicit solution of the optimal design problem is possible.




Related research

Martin Wahl, 2018
We analyse the prediction error of principal component regression (PCR) and prove non-asymptotic upper bounds for the corresponding squared risk. Under mild assumptions, we show that PCR performs as well as the oracle method obtained by replacing empirical principal components by their population counterparts. Our approach relies on upper bounds for the excess risk of principal component analysis.
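For readers unfamiliar with the method analysed above, a minimal NumPy sketch of principal component regression (regressing on the top-$k$ empirical principal components) might look as follows; the interface is illustrative, not the paper's:

```python
import numpy as np

def pcr_fit(X, y, k):
    """Principal component regression: regress y on the top-k empirical
    principal components of X, then map the coefficients back to the
    original predictor scale.  Returns (beta, intercept)."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc = X - x_mean
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    Vk = Vt[:k].T                                 # p x k principal directions
    scores = Xc @ Vk                              # projected predictors
    gamma = np.linalg.lstsq(scores, y - y_mean, rcond=None)[0]
    beta = Vk @ gamma                             # back to X coordinates
    return beta, y_mean - x_mean @ beta
```

With $k$ equal to the number of predictors this reduces to ordinary least squares; smaller $k$ trades bias for variance, which is exactly the regime the risk bounds above address.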
Huiming Zhang, 2018
This short note points out that the proof of the high-dimensional asymptotic normality of the MLE for logistic regression under the regime $p_n = o(n)$, given in the paper "Maximum likelihood estimation in logistic regression models with a diverging number of covariates" (Electronic Journal of Statistics, 6, 1838--1846), is wrong.
Supersaturated design (SSD) has received much recent interest because of its potential in factor screening experiments. In this paper, we provide equivalent conditions for two columns to be fully aliased and consequently propose methods for constructing $E(f_{\mathrm{NOD}})$- and $\chi^2$-optimal mixed-level SSDs without fully aliased columns, via equidistant designs and difference matrices. The methods can be easily performed, and many new optimal mixed-level SSDs have been obtained. Furthermore, it is proved that the nonorthogonality between columns of the resulting design is well controlled by the source designs. A rather complete list of newly generated optimal mixed-level SSDs is tabulated for practical use.
In this paper we consider the linear regression model $Y = SX + \varepsilon$ with functional regressors and responses. We develop new inference tools to quantify deviations of the true slope $S$ from a hypothesized operator $S_0$ with respect to the Hilbert--Schmidt norm $\|S - S_0\|^2$, as well as the prediction error $\mathbb{E}\|SX - S_0 X\|^2$. Our analysis is applicable to functional time series and based on asymptotically pivotal statistics. This makes it particularly user-friendly, because it avoids the choice of tuning parameters inherent in long-run variance estimation or bootstrap of dependent data. We also discuss two-sample problems as well as change point detection. Finite sample properties are investigated by means of a simulation study. Mathematically our approach is based on a sequential version of the popular spectral cut-off estimator $\hat S_N$ for $S$. It is well known that the $L^2$-minimax rates in the functional regression model, both in estimation and prediction, are substantially slower than $1/\sqrt{N}$ (where $N$ denotes the sample size) and that standard estimators for $S$ do not converge weakly to non-degenerate limits. However, we demonstrate that simple plug-in estimators, such as $\|\hat S_N - S_0\|^2$ for $\|S - S_0\|^2$, are $\sqrt{N}$-consistent and its sequenti
In this paper, we propose two simple yet efficient computational algorithms for obtaining approximate optimal designs for multi-dimensional linear regression on a large variety of design spaces. We focus on the two most commonly used optimality criteria, the $D$- and $A$-criteria. For $D$-optimality, we provide an alternative proof of the monotonic convergence of the $D$-criterion and propose an efficient computational algorithm to obtain an approximate $D$-optimal design. We further show that the proposed algorithm converges to the $D$-optimal design, and then prove that the approximate $D$-optimal design converges to the continuous $D$-optimal design under certain conditions. For $A$-optimality, we provide an efficient algorithm for obtaining an approximate $A$-optimal design and conjecture the monotonicity of the proposed algorithm. Numerical comparisons suggest that the proposed algorithms perform well and are comparable or superior to some existing algorithms.
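A classical instance of such a monotone scheme for $D$-optimality is the multiplicative (Titterington-type) algorithm, whose update $w_i \leftarrow w_i \, f(x_i)^\top M(w)^{-1} f(x_i) / m$ increases $\log\det M(w)$ at every step. The sketch below is our own illustration on a candidate grid, not necessarily the algorithm proposed in the paper:

```python
import numpy as np

def d_optimal_weights(F, n_iter=3000):
    """Multiplicative algorithm for an approximate D-optimal design.
    F is the n x m matrix whose rows are the regression vectors f(x_i)
    at the n candidate points; returns the weight of each candidate.
    Since sum_i w_i f_i^T M^{-1} f_i = m, the weights stay normalized."""
    n, m = F.shape
    w = np.full(n, 1.0 / n)
    for _ in range(n_iter):
        M = F.T @ (w[:, None] * F)                            # M(w)
        d = np.einsum('ij,jk,ik->i', F, np.linalg.inv(M), F)  # f_i^T M^{-1} f_i
        w *= d / m                                            # monotone update
    return w
```

For quadratic regression with intercept on $[-1, 1]$, the weights concentrate near the known $D$-optimal design, which puts mass $1/3$ at each of $-1$, $0$, and $1$.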
