A coordinate descent algorithm for computing penalized smooth quantile regression
Authors: Abdallah Mkhadri, Mohamed Ouhourane, Karim Oualkacha
Affiliations: 1. Department of Mathematics, Cadi Ayyad University, Marrakech, Morocco; 2. Department of Mathematics, Université du Québec à Montréal, Montreal, Canada
Abstract: Computing penalized quantile regression estimates is often computationally intensive in high dimensions. In this paper we propose a coordinate descent algorithm for computing penalized smooth quantile regression (cdaSQR) with convex and nonconvex penalties. The cdaSQR approach is based on approximating the check loss function, which is not differentiable at zero, by a modified check function that is differentiable at zero. Then, using the majorization-minimization trick of the gcdnet algorithm (Yang and Zou, J Comput Graph Stat 22(2):396–415, 2013), we update each coefficient simply and efficiently. In our implementation, we consider the convex $\ell_1+\ell_2$ penalty and the nonconvex SCAD (or MCP) $+\,\ell_2$ penalties. We establish the convergence of cdaSQR with the $\ell_1+\ell_2$ penalty. Using simulations, we compare the speed of our algorithm with that of its competitors; the numerical results show that our implementation is an order of magnitude faster. Finally, the performance of our algorithm is illustrated on three real data sets from diabetes, leukemia and Bardet–Biedl syndrome gene expression studies.
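To make the abstract's recipe concrete, here is a minimal Python sketch of a coordinate descent step for a smoothed quantile loss with an $\ell_1+\ell_2$ (elastic-net) penalty: the check loss is replaced by a function that is differentiable at zero, the smooth loss is majorized coordinate-wise by a quadratic with fixed curvature $M_j$, and each coefficient is updated in closed form as $\beta_j \leftarrow S\!\big(M_j\beta_j + \tfrac{1}{n}x_j^\top\psi(r),\,\lambda_1\big)/(M_j+\lambda_2)$. This is not the authors' cdaSQR implementation; the asymmetric-Huber smoothing, the majorization constants, and all names (smooth_check_grad, cd_smooth_qr_enet, gamma, lam1, lam2) are assumptions made purely for illustration.

```python
# Illustrative sketch only: coordinate descent for a smoothed quantile loss
# with an elastic-net penalty. The smoothing below is an asymmetric-Huber
# approximation of the check function, assumed here for concreteness; it is
# not necessarily the smoothing used by the cdaSQR paper.
import numpy as np


def smooth_check_grad(r, tau, gamma):
    """Derivative of an asymmetric-Huber smoothing of the check loss.

    The smoothed loss is quadratic on [-gamma, gamma] and linear outside,
    so this derivative is Lipschitz with constant max(tau, 1 - tau) / gamma.
    """
    return np.where(r >= 0, tau, 1.0 - tau) * np.clip(r / gamma, -1.0, 1.0)


def soft_threshold(z, t):
    """Soft-thresholding operator S(z, t) = sign(z) * max(|z| - t, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)


def cd_smooth_qr_enet(X, y, tau=0.5, lam1=0.1, lam2=0.1, gamma=0.1,
                      max_iter=500, tol=1e-6):
    """Minimize (1/n) sum rho_{tau,gamma}(y - X beta)
                + lam1 * ||beta||_1 + (lam2 / 2) * ||beta||_2^2
    by cyclic coordinate descent with a quadratic (MM) majorizer per coordinate.
    Assumes standardized columns of X; the intercept is omitted for brevity.
    """
    n, p = X.shape
    beta = np.zeros(p)
    r = y - X @ beta                       # current residuals
    # Majorization constants: the loss curvature in beta_j is bounded by
    # (1 / (n * gamma)) * sum_i x_ij^2, since |rho''| <= 1 / gamma.
    M = (X ** 2).sum(axis=0) / (n * gamma)
    for _ in range(max_iter):
        max_change = 0.0
        for j in range(p):
            # Coordinate-wise gradient of the smooth loss at the current beta.
            grad_j = -(X[:, j] @ smooth_check_grad(r, tau, gamma)) / n
            z = M[j] * beta[j] - grad_j
            new_bj = soft_threshold(z, lam1) / (M[j] + lam2)
            if new_bj != beta[j]:
                r -= X[:, j] * (new_bj - beta[j])   # keep residuals in sync
                max_change = max(max_change, abs(new_bj - beta[j]))
                beta[j] = new_bj
        if max_change < tol:
            break
    return beta


# Example on synthetic data (illustrative only):
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))
y = X[:, :3] @ np.array([2.0, -1.5, 1.0]) + rng.standard_normal(200)
beta_hat = cd_smooth_qr_enet(X, y, tau=0.5, lam1=0.05, lam2=0.01)
```

The closed-form soft-thresholding update is what makes each coordinate step cheap; replacing the $\ell_1$ term with a SCAD or MCP penalty would change only the thresholding rule, while the quadratic majorization of the smooth loss stays the same.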