Supervised Principal Component Regression for Functional Response with High Dimensional Predictors


Abstract in English

We propose a supervised principal component regression method for relating a functional response to high dimensional covariates. Unlike conventional principal component analysis, the proposed method builds on a newly defined expected integrated residual sum of squares, which directly makes use of the association between the functional response and the predictors. Minimizing this integrated residual sum of squares yields the supervised principal components, but doing so is equivalent to solving a sequence of nonconvex generalized Rayleigh quotient optimization problems and is therefore computationally intractable. To overcome this computational challenge, we reformulate the nonconvex optimization problems as a simultaneous linear regression, with a sparsity penalty added to handle the high dimensional predictors. Theoretically, we show that the reformulated regression problem recovers the same supervised principal subspace under suitable conditions. Statistically, we establish non-asymptotic error bounds for the proposed estimators. Numerical studies and an application to the Human Connectome Project lend further support to the proposed method.
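For concreteness, the following is a hedged sketch of the kind of criterion the abstract describes; the notation (Y(t), X, w, alpha, beta, Sigma, A) is chosen here for exposition and is not taken verbatim from the paper.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Illustrative sketch only; the symbols below are assumptions for exposition,
% not the paper's own notation.
% Expected integrated residual sum of squares for one supervised loading vector w:
\[
R(w,\alpha,\beta)
  = \mathbb{E}\int_{\mathcal{T}}
    \bigl\{ Y(t) - \alpha(t) - \beta(t)\, w^{\top} X \bigr\}^{2}\, dt .
\]
% Profiling out the functional coefficients alpha(t) and beta(t) for fixed w
% (an ordinary least squares step) reduces the minimization over w to a
% generalized Rayleigh quotient,
\[
\max_{w \neq 0} \; \frac{w^{\top} A\, w}{w^{\top} \Sigma\, w},
\qquad
A = \int_{\mathcal{T}} \operatorname{Cov}\!\bigl(X, Y(t)\bigr)
     \operatorname{Cov}\!\bigl(X, Y(t)\bigr)^{\top} dt,
\quad
\Sigma = \operatorname{Cov}(X),
\]
% which is the nonconvex problem that the paper reformulates as a penalized
% simultaneous linear regression when X is high dimensional.
\end{document}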
