Joint covariate selection and joint subspace selection for multiple classification problems

Authors: Guillaume Obozinski, Ben Taskar, Michael I. Jordan

Affiliation: 1. Department of Statistics, University of California at Berkeley, Berkeley, USA; 2. Department of Computer and Information Science, University of Pennsylvania, Philadelphia, USA; 3. Department of Statistics and Department of Electrical Engineering and Computer Science, University of California at Berkeley, Berkeley, USA
Abstract: We address the problem of recovering a common set of covariates that are relevant simultaneously to several classification problems. By penalizing the sum of ℓ2 norms of the blocks of coefficients associated with each covariate across different classification problems, similar sparsity patterns in all models are encouraged. To take computational advantage of the sparsity of solutions at high regularization levels, we propose a blockwise path-following scheme that approximately traces the regularization path. As the regularization coefficient decreases, the algorithm maintains and updates concurrently a growing set of covariates that are simultaneously active for all problems. We also show how to use random projections to extend this approach to the problem of joint subspace selection, where multiple predictors are found in a common low-dimensional subspace. We present theoretical results showing that this random projection approach converges to the solution yielded by trace-norm regularization. Finally, we present a variety of experimental results exploring joint covariate selection and joint subspace selection, comparing the path-following approach to competing algorithms in terms of prediction accuracy and running time.
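The following is a minimal sketch of the joint ℓ1/ℓ2 penalty the abstract describes, assuming a coefficient matrix W with one row per covariate and one column per classification problem; the function names, shapes, and the block soft-thresholding step are illustrative assumptions, not the authors' blockwise path-following algorithm.

```python
# Illustrative sketch (not the paper's implementation): the joint penalty is the
# sum over covariates of the l2 norm of that covariate's block of coefficients
# across all classification problems. Block soft-thresholding is one standard
# proximal step for this penalty; it zeroes whole rows, so the same covariates
# are dropped from every problem, giving a shared sparsity pattern.
import numpy as np

def joint_l1_l2_penalty(W):
    """Sum of l2 norms of the rows (covariate blocks) of W."""
    return np.sum(np.linalg.norm(W, axis=1))

def block_soft_threshold(W, threshold):
    """Shrink each row of W toward zero; rows whose l2 norm is below
    the threshold are set to zero."""
    row_norms = np.linalg.norm(W, axis=1, keepdims=True)
    scale = np.maximum(1.0 - threshold / np.maximum(row_norms, 1e-12), 0.0)
    return scale * W

# Example: 5 covariates shared across 3 classification problems.
rng = np.random.default_rng(0)
W = rng.normal(size=(5, 3))
print("penalty:", joint_l1_l2_penalty(W))
print("after block shrinkage:\n", block_soft_threshold(W, threshold=1.0))
```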
|