Smoothing spline Gaussian regression: more scalable computation via efficient approximation
Authors: Young-Ju Kim and Chong Gu
Institutions: Yale University, New Haven, USA; Purdue University, West Lafayette, USA.
Abstract: Smoothing splines via the penalized least squares method provide versatile and effective nonparametric models for regression with Gaussian responses. The computation of smoothing splines is generally of the order O(n^3), n being the sample size, which severely limits their practical applicability. We study more scalable computation of smoothing spline regression via certain low-dimensional approximations that are asymptotically as efficient. A simple algorithm is presented, and the Bayes model associated with the approximations is derived, with the latter guiding the porting of Bayesian confidence intervals. The practical choice of the dimension of the approximating space is determined through simulation studies, and empirical comparisons of the approximations with the exact solution are presented. Also evaluated is a simple modification of the generalized cross-validation method for smoothing parameter selection, which to a large extent fixes the occasional undersmoothing problem suffered by generalized cross-validation.
Keywords: Bayesian confidence interval; Computation; Generalized cross-validation; Penalized least squares