
Estimation in Dirichlet random effects models

Added by George Casella
Publication date: 2010
Language: English





We develop a new Gibbs sampler for a linear mixed model with a Dirichlet process random effect term, which is easily extended to a generalized linear mixed model with a probit link function. Our Gibbs sampler exploits the properties of the multinomial and Dirichlet distributions, and is shown to be an improvement, in terms of operator norm and efficiency, over other commonly used MCMC algorithms. We also investigate methods for the estimation of the precision parameter of the Dirichlet process, finding that maximum likelihood may not be desirable, but a posterior mode is a reasonable approach. Examples are given to show how these models perform on real data. Our results complement both the theoretical basis of the Dirichlet process nonparametric prior and the computational work that has been done to date.
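To make the flavour of such a sampler concrete, here is a minimal, hedged sketch of one Gibbs sweep for a stripped-down Dirichlet process random effects model y_i = psi_i + eps_i with psi_i ~ DP(m, N(0, tau2)) and eps_i ~ N(0, sig2). This is a generic conjugate-case Pólya-urn sampler, not the paper's blocked multinomial/Dirichlet construction (which also handles fixed effects and a probit link); the function and parameter names are illustrative, and the precision parameter m is held fixed here whereas the paper studies estimating it.

```python
# Sketch only: generic Polya-urn Gibbs sweep for a conjugate DP random
# effects model, NOT the improved sampler developed in the paper.
import numpy as np

def norm_pdf(x, mean, var):
    return np.exp(-0.5 * (x - mean) ** 2 / var) / np.sqrt(2 * np.pi * var)

def gibbs_sweep(y, labels, locs, m=1.0, sig2=1.0, tau2=1.0, rng=None):
    """One sweep: resample each observation's cluster label from its
    Polya-urn full conditional, then redraw each cluster location from
    its conjugate normal posterior.  labels: int array; locs: dict
    mapping cluster id -> current location."""
    rng = np.random.default_rng() if rng is None else rng
    n = len(y)
    for i in range(n):
        labels[i] = -1                      # remove point i from its cluster
        counts = {k: int(np.sum(labels == k)) for k in set(labels) if k != -1}
        locs = {k: v for k, v in locs.items() if counts.get(k, 0) > 0}
        ks = list(locs)
        # weight of joining each existing cluster ...
        w = [counts[k] * norm_pdf(y[i], locs[k], sig2) for k in ks]
        # ... and of opening a new one (psi integrated out under N(0, tau2))
        w.append(m * norm_pdf(y[i], 0.0, sig2 + tau2))
        w = np.array(w) / np.sum(w)
        choice = rng.choice(len(w), p=w)
        if choice == len(ks):               # open a new cluster
            new_k = max(locs, default=-1) + 1
            post_var = 1.0 / (1.0 / tau2 + 1.0 / sig2)
            post_mean = post_var * y[i] / sig2
            locs[new_k] = rng.normal(post_mean, np.sqrt(post_var))
            labels[i] = new_k
        else:
            labels[i] = ks[choice]
    # conjugate update of each cluster location given its members
    for k in list(locs):
        yk = y[labels == k]
        post_var = 1.0 / (1.0 / tau2 + len(yk) / sig2)
        post_mean = post_var * np.sum(yk) / sig2
        locs[k] = rng.normal(post_mean, np.sqrt(post_var))
    return labels, locs
```

Iterating this sweep on data drawn from two well-separated groups typically concentrates on a small number of clusters, which is the clustering behaviour the DP random effect induces.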




Read More

205 - Benzion Boukai, Yue Zhang 2019
We consider a resampling scheme for estimation of the population parameters in mixed effects nonlinear regression models of the type used, for example, in clinical pharmacokinetics. We provide an estimation procedure which "recycles", via random weighting, the relevant two-stage parameter estimates to construct consistent estimates of the sampling distribution of the various estimates. We establish the asymptotic consistency and asymptotic normality of the resampled estimates and demonstrate the applicability of the "recycling" approach in a small simulation study and via an example.
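The random-weighting idea can be sketched in a few lines: rather than resampling cases, each replicate draws i.i.d. positive weights with mean one and re-solves the weighted estimating equations, "recycling" the same data in every replicate. The toy below applies this to a closed-form weighted least-squares slope (a stand-in for the two-stage pharmacokinetic estimates of the paper); all names are illustrative, not the authors' procedure.

```python
# Sketch only: random-weighting ("recycling") resamples of a simple
# weighted least-squares slope estimate; the paper's two-stage
# nonlinear-mixed-effects version is more involved.
import numpy as np

def wls_slope(t, z, w):
    # weighted least-squares slope of z on t (no intercept, for simplicity)
    return np.sum(w * t * z) / np.sum(w * t * t)

def recycle_bootstrap(t, z, B=500, rng=None):
    """Each replicate draws i.i.d. Exp(1) weights (mean 1) and re-solves
    the weighted normal equations on the SAME data; the spread of the
    replicates estimates the sampling distribution of the slope."""
    rng = np.random.default_rng() if rng is None else rng
    reps = np.empty(B)
    for b in range(B):
        w = rng.exponential(1.0, size=len(t))
        reps[b] = wls_slope(t, z, w)
    return reps
```

The standard deviation of the replicates then serves as a standard-error estimate for the point estimate obtained with unit weights.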
160 - Mohammad Arashi 2012
In this paper, we discuss a class of Baranchik-type shrinkage estimators of the vector parameter in a location model, with errors belonging to a sub-class of elliptically contoured distributions. We derive conditions, under the Schwartz space framework, in which the underlying class of shrinkage estimators outperforms the sample mean. Sufficient conditions for the dominant class to outperform the usual James-Stein estimator are also established. We also show that the dominance properties of the class of estimators are robust with respect to departures from normality.
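A Baranchik-type estimator has the form (1 - r(||x||^2)/||x||^2) x for a bounded, nondecreasing function r. The sketch below, assuming Gaussian errors for simplicity (the paper works with elliptically contoured errors), shows the classical choice r(s) = min(s, d-2) and checks by Monte Carlo that it beats the sample mean at the origin; it is an illustration of the estimator class, not the paper's results.

```python
# Sketch only: Baranchik-type shrinkage of X ~ N(theta, I_d), d >= 3.
import numpy as np

def baranchik(x, r):
    """(1 - r(||x||^2)/||x||^2) x, with r bounded and nondecreasing.
    r(s) = min(s, d - 2) shrinks all the way to 0 when ||x||^2 < d - 2."""
    s = float(np.dot(x, x))
    return (1.0 - r(s) / s) * x
```

At theta = 0 the shrinkage risk is far below the dimension d achieved by the raw observation, which is the dominance phenomenon the abstract generalizes beyond normality.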
Consider a Poisson point process with unknown support boundary curve $g$, which forms a prototype of an irregular statistical model. We address the problem of estimating non-linear functionals of the form $\int \Phi(g(x))\,dx$. Following a nonparametric maximum-likelihood approach, we construct an estimator which is UMVU over Hölder balls and achieves the (local) minimax rate of convergence. These results hold under weak assumptions on $\Phi$ which are satisfied for $\Phi(u)=|u|^p$, $p\ge 1$. As an application, we consider the problem of estimating the $L^p$-norm and derive the minimax separation rates in the corresponding nonparametric hypothesis testing problem. Structural differences to results for regular nonparametric models are discussed.
We study a problem of estimation of smooth functionals of parameter $\theta$ of the Gaussian shift model $$X=\theta+\xi, \quad \theta\in E,$$ where $E$ is a separable Banach space and $X$ is an observation of unknown vector $\theta$ in Gaussian noise $\xi$ with zero mean and known covariance operator $\Sigma$. In particular, we develop estimators $T(X)$ of $f(\theta)$ for functionals $f:E\mapsto\mathbb{R}$ of Hölder smoothness $s>0$ such that $$\sup_{\|\theta\|\leq 1}\mathbb{E}_{\theta}\bigl(T(X)-f(\theta)\bigr)^2 \lesssim \Bigl(\|\Sigma\| \vee \bigl(\mathbb{E}\|\xi\|^2\bigr)^s\Bigr)\wedge 1,$$ where $\|\Sigma\|$ is the operator norm of $\Sigma$, and show that this mean squared error rate is minimax optimal at least in the case of the standard Gaussian shift model ($E=\mathbb{R}^d$ equipped with the canonical Euclidean norm, $\xi=\sigma Z$, $Z\sim\mathcal{N}(0,I_d)$). Moreover, we determine a sharp threshold on the smoothness $s$ of functional $f$ such that, for all $s$ above the threshold, $f(\theta)$ can be estimated efficiently with a mean squared error rate of the order $\|\Sigma\|$ in a small-noise setting (that is, when $\mathbb{E}\|\xi\|^2$ is small). The construction of efficient estimators is crucially based on a bootstrap chain method of bias reduction. The results can be applied to a variety of special high-dimensional and infinite-dimensional Gaussian models (for vector, matrix and functional data).
79 - Emilien Joly 2016
We study the problem of estimating the mean of a multivariate distribution based on independent samples. The main result is the proof of existence of an estimator with a non-asymptotic sub-Gaussian performance for all distributions satisfying some mild moment assumptions.
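One classical route to sub-Gaussian deviations under only a second-moment assumption is the median-of-means estimator, sketched below in the univariate case; the paper's multivariate construction is more involved (in higher dimensions the median is typically replaced by, e.g., a geometric median), so this is background illustration, not the paper's estimator.

```python
# Sketch only: univariate median-of-means, a classical estimator with
# sub-Gaussian deviation bounds under a finite-variance assumption.
import numpy as np

def median_of_means(x, k):
    """Split the sample into k equal blocks, average each block, and
    return the median of the block means.  Outlier-heavy blocks can
    corrupt only their own mean, so the median stays on target."""
    x = np.asarray(x, dtype=float)
    n = len(x) // k * k            # drop the remainder so blocks are equal
    blocks = x[:n].reshape(k, -1)
    return float(np.median(blocks.mean(axis=1)))
```

On heavy-tailed data (e.g. a Student-t sample with just over two moments) the estimate stays close to the true mean even though individual observations can be enormous.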
