This paper establishes a global bias-correction divide-and-conquer (GBC-DC) rule for biased estimation under memory constraints. To construct the new estimator, a closed-form representation of the local estimators obtained from the data in each batch is adopted, which formulates a pro forma linear regression between the local estimators and the true parameter of interest. The least squares method is then used within this framework to combine the local estimators into a global estimator of the parameter. The main advantage over the classical DC method is that the new GBC-DC method can absorb the information hidden in the statistical structure and the variables in each batch of data. Consequently, the resulting global estimator is strictly unbiased even if the local estimators have a non-negligible bias. Moreover, the global estimator is consistent, and can even achieve root-$n$ consistency, without any constraint on the number of batches. Another attractive feature of the new method is that it is computationally simple and efficient, requiring neither an iterative algorithm nor local bias-correction. The proposed GBC-DC method applies to various biased estimators, such as shrinkage-type estimators and nonparametric regression estimators. Detailed simulation studies demonstrate that the proposed GBC-DC approach substantially corrects the bias, performs comparably to the full-data estimation, and clearly outperforms its competitors.
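The bias-correction mechanism can be illustrated on a simple shrinkage-type example. The sketch below is not taken from the paper: the batch count K, the ridge penalty lam, and all variable names are illustrative assumptions. It uses ridge regression as the biased local estimator; each batch estimate satisfies E[beta_k] = A_k beta with a known matrix A_k, which plays the role of the pro forma linear regression, and a least-squares fit of the stacked system beta_k ≈ A_k beta yields an unbiased global estimate, in contrast to the naive average used by the classical DC rule.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (assumed, not from the paper): ridge regression as a biased local estimator.
n, p, K, lam = 10_000, 5, 50, 25.0       # total sample size, dimension, number of batches, ridge penalty
beta_true = rng.normal(size=p)
X = rng.normal(size=(n, p))
y = X @ beta_true + rng.normal(scale=0.5, size=n)

# Split the data into K batches and compute a biased local (ridge) estimate per batch.
X_batches = np.array_split(X, K)
y_batches = np.array_split(y, K)

A_list, b_list = [], []
for Xk, yk in zip(X_batches, y_batches):
    G = Xk.T @ Xk
    ridge_inv = np.linalg.inv(G + lam * np.eye(p))
    beta_k = ridge_inv @ Xk.T @ yk       # local ridge estimate (biased by shrinkage)
    A_k = ridge_inv @ G                  # E[beta_k | Xk] = A_k @ beta_true  -> "pro forma" linear regression
    A_list.append(A_k)
    b_list.append(beta_k)

# Classical DC: simple average of the local estimates (inherits the shrinkage bias).
beta_naive = np.mean(b_list, axis=0)

# Bias-corrected combination (sketch): least squares on the stacked system
#   beta_k ≈ A_k @ beta, k = 1..K   =>   beta_hat = (sum A_k' A_k)^{-1} sum A_k' beta_k
lhs = sum(A.T @ A for A in A_list)
rhs = sum(A.T @ b for A, b in zip(A_list, b_list))
beta_gbc = np.linalg.solve(lhs, rhs)

print("error, naive average      :", np.linalg.norm(beta_naive - beta_true))
print("error, LS bias-corrected  :", np.linalg.norm(beta_gbc - beta_true))
```

On this toy design the least-squares composite removes the shrinkage bias that the naive average carries over from every local ridge fit, without any iterative algorithm or per-batch bias correction; the paper's actual GBC-DC construction covers a broader class of biased local estimators than this ridge example.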
Change-points are a routine feature of big data observed in the form of high-dimensional data streams. In many such data streams, the component series possess group structures and it is natural to assume that changes only occur in a small number of a
Missing data and confounding are two problems researchers face in observational studies for comparative effectiveness. Williamson et al. (2012) recently proposed a unified approach to handle both issues concurrently using a multiply-robust (MR) metho
We offer a survey of recent results on covariance estimation for heavy-tailed distributions. By unifying ideas scattered in the literature, we propose user-friendly methods that facilitate practical implementation. Specifically, we introduce element-
The variance of noise plays an important role in many change-point detection procedures and the associated inferences. Most commonly used variance estimators require strong assumptions on the true mean structure or normality of the error distribution
Sensitivity indices when the inputs of a model are not independent are estimated by local polynomial techniques. Two original estimators based on local polynomial smoothers are proposed. Both have good theoretical properties which are exhibited and a