General Sparse Boosting: Improving Feature Selection of L2 Boosting by Correlation-Based Penalty Family
Author: Junlong Zhao
Affiliation: School of Mathematics and System Science, Beihang University, Beijing, China
Abstract: In the high-dimensional setting, componentwise L2 boosting has been used to construct sparse models that predict well, but it tends to select many ineffective variables. Several sparse boosting methods, such as SparseL2Boosting and Twin Boosting, have been proposed to improve the variable selection of the L2 boosting algorithm. In this article, we propose a new general sparse boosting method (GSBoosting). In the orthogonal linear model, we establish relations between GSBoosting and other well-known regularized variable selection methods, such as the adaptive Lasso and hard thresholding. Simulation results show that GSBoosting performs well in both prediction and variable selection.
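For readers unfamiliar with the base procedure the abstract refers to, the following is a minimal sketch of componentwise L2 boosting (the baseline being improved, not the proposed GSBoosting); the function name, step size, and toy data are illustrative assumptions, not taken from the paper. At each step, every centered predictor is fit to the current residuals by simple least squares, the component giving the largest reduction in residual sum of squares is selected, and a shrunken step is taken along it.

```python
import numpy as np

def componentwise_l2_boost(X, y, n_steps=100, nu=0.1):
    """Componentwise L2 boosting (illustrative sketch).

    At each step: fit each centered predictor to the current
    residuals by simple least squares, pick the component with the
    largest RSS reduction, and update its coefficient by a
    shrunken step of size nu.
    """
    n, p = X.shape
    intercept = y.mean()
    resid = y - intercept          # start from the intercept-only fit
    Xc = X - X.mean(axis=0)        # center the predictors
    col_ss = (Xc ** 2).sum(axis=0)
    coef = np.zeros(p)
    for _ in range(n_steps):
        # least-squares coefficient of each predictor vs. the residuals
        b = Xc.T @ resid / col_ss
        # RSS reduction of component j equals b_j^2 * ||x_j||^2
        j = np.argmax(b ** 2 * col_ss)
        coef[j] += nu * b[j]
        resid -= nu * b[j] * Xc[:, j]
    return intercept, coef

# Toy sparse example: only 2 of 10 predictors are effective.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.standard_normal(200)
b0, beta = componentwise_l2_boost(X, y, n_steps=300, nu=0.1)
```

Run long enough, the two effective coefficients approach their true values, while the remaining components typically pick up small nonzero weights along the way; that tendency to touch ineffective variables is exactly the weakness the sparse boosting variants above target.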
Keywords: Adaptive Lasso; Boosting algorithm; Model selection; Sparsity