Motivated by the need to handle streaming data with high-dimensional covariates, we propose an online sparse sliced inverse regression (OSSIR) method for online sufficient dimension reduction. Existing online sufficient dimension reduction methods focus on the case where the dimension $p$ is small. In this article, we show that our method achieves better statistical accuracy and computational speed when the dimension $p$ is large. Our method involves two key steps: one extends online principal component analysis to iteratively obtain the eigenvalues and eigenvectors of the kernel matrix, and the other uses the truncated gradient to achieve online $L_{1}$ regularization. We also analyze the convergence of the extended candid covariance-free incremental PCA (CCIPCA) and of our method. Comparisons with several existing methods in simulations and real-data applications demonstrate the effectiveness and efficiency of our method.
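The two ingredients named in the abstract can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: the CCIPCA update follows the standard incremental rule of Weng et al. (averaging `x xᵀ v / ‖v‖` into each component estimate with deflation of the input), and the sparsity step uses the truncated-gradient shrinkage operator of Langford et al. (2009). All function names and parameters here are hypothetical.

```python
import numpy as np

def ccipca_update(V, x, n, k):
    """One CCIPCA step: update k eigenvector estimates V (p x k, scaled by
    their eigenvalue estimates) with a new observation x; n is the running
    sample count (starting at 1)."""
    u = x.astype(float).copy()
    for i in range(min(k, n)):          # the i-th component exists only once n > i
        if n == i + 1:
            V[:, i] = u                  # initialize with the current residual
        else:
            v = V[:, i]
            nv = np.linalg.norm(v)
            V[:, i] = ((n - 1) / n) * v + (1 / n) * u * (u @ v) / nv
        vi = V[:, i]
        nvi = np.linalg.norm(vi)
        u = u - (u @ vi / nvi) * (vi / nvi)   # deflate before the next component
    return V

def truncated_gradient(w, alpha, theta):
    """Truncated-gradient shrinkage: coefficients within theta of zero are
    pulled toward zero by alpha and clipped at zero -- an online stand-in
    for L1 regularization applied after each gradient step."""
    out = w.copy()
    pos = (w >= 0) & (w <= theta)
    neg = (w < 0) & (w >= -theta)
    out[pos] = np.maximum(0.0, w[pos] - alpha)
    out[neg] = np.minimum(0.0, w[neg] + alpha)
    return out
```

Streaming observations through `ccipca_update` recovers the leading eigendirections without ever forming the full covariance matrix, and interleaving `truncated_gradient` after each update is what keeps the estimated directions sparse.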
This work considers variational Bayesian inference as an inexpensive and scalable alternative to a fully Bayesian approach in the context of sparsity-promoting priors. In particular, the priors considered arise from scale mixtures of Normal distributions …
Sliced inverse regression is one of the most popular sufficient dimension reduction methods. Originally, it was designed for independent and identically distributed data and was recently extended to the case of serially and spatially dependent data. In this …
This article concerns dimension reduction in regression for large data sets. We introduce a new method based on the sliced inverse regression approach, called cluster-based regularized sliced inverse regression. Our method not only keeps the merits …
We propose a new method for dimension reduction in regression using the first two inverse moments. We develop corresponding weighted chi-squared tests for the dimension of the regression. The proposed method considers linear combinations of Sliced Inverse …
Deterministic interpolation and quadrature methods are often unsuitable for addressing Bayesian inverse problems that depend on computationally expensive forward mathematical models. While interpolation may give precise posterior approximations, deterministic …