

[HDDA] Sparse subspace constrained partial least squares
Authors: Matthew Sutton, Kerrie Mengersen, Benoit Liquet
Institution: 1. School of Mathematical Sciences, ARC Centre of Excellence for Mathematical and Statistical Frontiers, Queensland University of Technology, Brisbane, Australia (matt.sutton@qut.edu.au); 2. Laboratory of Mathematics and Their Applications, University of Pau and Pays de l'Adour, Pau, France
Abstract:

In this paper, we investigate the objective function and deflation process for sparse Partial Least Squares (PLS) regression with multiple components. While many have considered variations on the objective for sparse PLS, the deflation process for sparse PLS has not received as much attention. Our work highlights a flaw in the Statistically Inspired Modification of Partial Least Squares (SIMPLS) deflation method when applied in sparse PLS regression. We also consider the Nonlinear Iterative Partial Least Squares (NIPALS) deflation in sparse PLS regression. To remedy the flaw in the SIMPLS method, we propose a new sparse PLS method wherein the direction vectors are constrained to be sparse and lie in a chosen subspace. We give insight into this new PLS procedure and show through examples and simulation studies that the proposed technique can outperform alternative sparse PLS techniques in coefficient estimation. Moreover, our analysis reveals a simple renormalization step that can be used to improve the estimation of sparse PLS direction vectors generated using any convex relaxation method.
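The abstract refers to sparse PLS direction vectors obtained by a convex relaxation, a renormalization step applied to them, and a deflation of the data matrix between components. As a rough, illustrative sketch only, and not the authors' exact procedure, the Python snippet below shows one common way such pieces fit together: soft-thresholding the leading singular vector of the cross-covariance matrix to induce sparsity, rescaling it to unit norm, and applying a NIPALS-style deflation before extracting the next direction. The function names `soft_threshold`, `sparse_pls_direction`, `nipals_deflate`, and the tuning parameter `lam` are hypothetical.

```python
import numpy as np

def soft_threshold(v, lam):
    # Elementwise soft-thresholding operator familiar from the Lasso.
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def sparse_pls_direction(X, Y, lam):
    # The leading left singular vector of the cross-covariance M = X^T Y
    # gives the ordinary PLS direction; soft-thresholding makes it sparse.
    if Y.ndim == 1:
        Y = Y.reshape(-1, 1)
    M = X.T @ Y
    u, _, _ = np.linalg.svd(M, full_matrices=False)
    w = soft_threshold(u[:, 0], lam)
    norm = np.linalg.norm(w)
    if norm == 0.0:
        return w
    # Rescale to unit length: a simple renormalization of the relaxed,
    # thresholded direction (illustrative, not the paper's derivation).
    return w / norm

def nipals_deflate(X, w):
    # NIPALS-style deflation: subtract the part of X explained by the
    # score t = X w so the next direction captures new information.
    t = X @ w
    p = X.T @ t / (t @ t)
    return X - np.outer(t, p)

# Minimal usage sketch on synthetic data.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 20))
Y = rng.standard_normal((50, 2))
w1 = sparse_pls_direction(X, Y, lam=0.1)
X_deflated = nipals_deflate(X, w1)
w2 = sparse_pls_direction(X_deflated, Y, lam=0.1)
```

The sketch uses NIPALS deflation simply because it is the easier of the two schemes to write down; the paper's point is precisely that the choice of deflation (SIMPLS versus NIPALS) and the handling of the sparse directions matter, so this code should be read as background context rather than a reference implementation.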
Keywords: Deflation; Lasso; Partial Least Squares; penalized matrix decomposition