This paper develops a bias correction scheme for a multivariate normal model under a general parameterization in which the mean vector and the covariance matrix share parameters. The model includes many important regression models in the literature as special cases, such as (non)linear regression and errors-in-variables models, and heteroscedastic situations can also be handled within this framework. We derive a general expression for the second-order biases of the maximum likelihood estimates of the model parameters and show that these biases can always be obtained by means of ordinary weighted least-squares regressions. We illustrate the general expression with an errors-in-variables model and conduct simulations to assess the performance of the corrected estimates. The simulation results show that the bias correction scheme yields nearly unbiased estimators. We also present an empirical illustration.
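As a rough, self-contained illustration of second-order bias correction of maximum likelihood estimates (not the paper's multivariate normal model or its weighted least-squares formulation), the Python sketch below applies a Cox-Snell type O(1/n) correction to the MLE of an exponential rate, where the second-order bias lambda/n is known in closed form, and checks both estimators by Monte Carlo. The sample size, rate, and replication count are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(42)

def mle_rate(x):
    """Maximum likelihood estimate of the exponential rate: lambda_hat = 1 / mean(x)."""
    return 1.0 / x.mean()

def bias_corrected_rate(x):
    """Second-order (Cox-Snell type) bias-corrected estimate.

    For an exponential sample of size n, the O(1/n) bias of the MLE is
    lambda / n, so subtracting lambda_hat / n yields
    lambda_tilde = lambda_hat * (n - 1) / n.
    """
    n = len(x)
    return mle_rate(x) * (n - 1) / n

# Monte Carlo comparison of the bias of both estimators for a small sample size.
true_lambda, n, n_rep = 2.0, 10, 20_000
mle_vals, corrected_vals = [], []
for _ in range(n_rep):
    x = rng.exponential(scale=1.0 / true_lambda, size=n)
    mle_vals.append(mle_rate(x))
    corrected_vals.append(bias_corrected_rate(x))

print(f"true lambda:        {true_lambda:.4f}")
print(f"mean MLE:           {np.mean(mle_vals):.4f}")        # noticeably above the true value
print(f"mean corrected MLE: {np.mean(corrected_vals):.4f}")  # close to the true value
```

In the paper's setting the correction is obtained analogously, by evaluating the second-order bias expression at the maximum likelihood estimates and subtracting it, with the bias itself computable through auxiliary weighted least-squares regressions.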