Shrinkage tuning parameter selection with a diverging number of parameters
Authors: Hansheng Wang, Bo Li, Chenlei Leng
Institutions: Peking University, Beijing, People's Republic of China;
Tsinghua University, Beijing, People's Republic of China;
National University of Singapore, Singapore
Abstract: Contemporary statistical research frequently deals with problems involving a diverging number of parameters. For such problems, various shrinkage methods (e.g. the lasso and smoothly clipped absolute deviation) are found to be particularly useful for variable selection. Nevertheless, the desirable performance of those shrinkage methods hinges heavily on an appropriate selection of the tuning parameters. With a fixed predictor dimension, Wang and co-workers have demonstrated that the tuning parameters selected by a Bayesian information criterion type criterion can identify the true model consistently. In this work, similar results are further extended to the situation with a diverging number of parameters, for both unpenalized and penalized estimators. Consequently, our theoretical results enlarge not only the scope of applicability of Bayesian information criterion type criteria but also that of those shrinkage estimation methods.
Keywords: Bayesian information criterion; diverging number of parameters; lasso; smoothly clipped absolute deviation
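To give a concrete picture of BIC-type tuning parameter selection for a shrinkage estimator, the following Python sketch chooses the lasso penalty by minimising a BIC-style criterion over a grid of candidate values. It is an illustration under stated assumptions, not the authors' exact criterion: the function name bic_select_lambda, the candidate grid, and the scaling factor Cn are choices made for the example (a factor of this kind that grows with the dimension is one way such criteria are adapted to a diverging number of parameters; Cn = 1 recovers the familiar BIC form).

```python
import numpy as np
from sklearn.linear_model import Lasso

def bic_select_lambda(X, y, lambdas, Cn=1.0):
    """Pick the lasso penalty minimising a BIC-type criterion.

    BIC(lam) = log(SSE_lam / n) + df_lam * (log n / n) * Cn,
    where df_lam is the number of non-zero estimated coefficients.
    Cn is an illustrative scaling factor, not the paper's exact choice.
    """
    n = len(y)
    best_lam, best_bic = None, np.inf
    for lam in lambdas:
        fit = Lasso(alpha=lam, max_iter=10_000).fit(X, y)
        resid = y - fit.predict(X)
        df = np.count_nonzero(fit.coef_)           # model size at this lambda
        bic = np.log(resid @ resid / n) + df * np.log(n) / n * Cn
        if bic < best_bic:
            best_lam, best_bic = lam, bic
    return best_lam

# Simulated example: sparse truth among many candidate predictors.
rng = np.random.default_rng(0)
n, p = 200, 50
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = [3.0, 1.5, 2.0]
y = X @ beta + rng.standard_normal(n)
lambdas = np.logspace(-3, 0, 30)
print("selected lambda:", bic_select_lambda(X, y, lambdas))
```

In this setup the criterion tends to pick a penalty under which only the three truly active predictors survive, which is the consistency property the abstract refers to, here shown informally rather than proved.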