This article concerns dimension reduction in regression for large data sets. We introduce a new method based on the sliced inverse regression approach, called cluster-based regularized sliced inverse regression. Our method not only keeps the merit of considering both response and predictor information, but also enhances the capability of handling highly correlated variables. It is justified under certain linearity conditions. An empirical application to a macroeconomic data set shows that our method outperforms the dynamic factor model and other shrinkage methods.
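For concreteness, here is a minimal sketch of the classic sliced inverse regression step that this cluster-based regularized variant builds on; the function name, slice count, and NumPy implementation are illustrative assumptions rather than the authors' code, and the clustering and regularization steps are not reproduced.

```python
import numpy as np

def sir_directions(X, y, n_slices=10, n_directions=2):
    """Classic SIR sketch: slice the response, average standardized
    predictors within slices, and take leading eigenvectors of the
    slice-mean covariance (assumes a nonsingular predictor covariance)."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T   # Sigma^{-1/2}
    Z = Xc @ inv_sqrt                                      # standardized predictors
    M = np.zeros((p, p))
    for idx in np.array_split(np.argsort(y), n_slices):   # slice by sorted response
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)               # weighted slice-mean outer products
    w, v = np.linalg.eigh(M)
    top = v[:, np.argsort(w)[::-1][:n_directions]]
    return inv_sqrt @ top                                  # map back to the original X scale
```

Given a fitted direction matrix B returned by this sketch, the reduced predictors used in a subsequent regression are simply X @ B.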
Sliced inverse regression is one of the most popular sufficient dimension reduction methods. Originally, it was designed for independent and identically distributed data and was recently extended to the case of serially and spatially dependent data.
Due to the demand for tackling the problem of streaming data with high-dimensional covariates, we propose an online sparse sliced inverse regression (OSSIR) method for online sufficient dimension reduction.
We propose a new method for dimension reduction in regression using the first two inverse moments. We develop corresponding weighted chi-squared tests for the dimension of the regression.
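As a rough illustration of what "using the first two inverse moments" can amount to, the sketch below combines a SIR-type kernel (slice means) with a SAVE-type kernel (within-slice covariances) computed on already standardized predictors Z; the equal weight alpha and the NumPy implementation are assumptions, not the paper's construction, and the weighted chi-squared dimension tests are not reproduced.

```python
import numpy as np

def two_moment_directions(Z, y, n_slices=10, n_directions=2, alpha=0.5):
    """Directions from a linear combination of first- and second-inverse-moment
    kernels; Z is assumed to be standardized (identity covariance)."""
    n, p = Z.shape
    M1 = np.zeros((p, p))   # first inverse moment (SIR-type) kernel
    M2 = np.zeros((p, p))   # second inverse moment (SAVE-type) kernel
    for idx in np.array_split(np.argsort(y), n_slices):
        w = len(idx) / n
        m = Z[idx].mean(axis=0)
        M1 += w * np.outer(m, m)
        D = np.eye(p) - np.cov(Z[idx], rowvar=False)
        M2 += w * D @ D
    M = alpha * M1 + (1 - alpha) * M2    # illustrative equal-weight combination
    evals, evecs = np.linalg.eigh(M)
    return evecs[:, np.argsort(evals)[::-1][:n_directions]]
```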
We move beyond "Is Machine Learning Useful for Macroeconomic Forecasting?" by adding the how. The current forecasting literature has focused on matching specific variables and horizons with a particularly successful algorithm.
We present a model for generating probabilistic forecasts by combining kernel density estimation (KDE) and quantile regression techniques, as part of the probabilistic load forecasting track of the Global Energy Forecasting Competition 2014.
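For a rough picture of how KDE and quantile regression outputs can be blended into predictive quantiles, here is a small sketch on simulated data; the single temperature covariate, the equal-weight blend, and the numeric CDF inversion are illustrative assumptions rather than the competition entry itself.

```python
import numpy as np
from scipy.stats import gaussian_kde
from statsmodels.regression.quantile_regression import QuantReg
import statsmodels.api as sm

rng = np.random.default_rng(0)
temp = rng.uniform(0, 30, 500)                      # simulated covariate (temperature)
load = 50 + 2 * temp + rng.normal(0, 5, 500)        # simulated load

quantiles = np.arange(0.1, 1.0, 0.1)
X = sm.add_constant(temp)
temp_new = 20.0                                     # forecast point

# Quantile regression forecasts at the new temperature
qr_preds = np.array([
    QuantReg(load, X).fit(q=q).predict([[1.0, temp_new]])[0]
    for q in quantiles
])

# KDE of historical load, inverted numerically to get unconditional quantiles
kde = gaussian_kde(load)
grid = np.linspace(load.min(), load.max(), 1000)
cdf = np.cumsum(kde(grid))
cdf /= cdf[-1]
kde_preds = np.interp(quantiles, cdf, grid)

# Equal-weight blend shown for illustration only
combined = 0.5 * qr_preds + 0.5 * kde_preds
```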