Bayesian variable selection and estimation in maximum entropy quantile regression
Authors: Shiyi Tu, Min Wang, Xiaoqian Sun
Affiliation: 1. Department of Mathematical Sciences, Clemson University, Clemson, SC, USA; 2. Department of Mathematical Sciences, Michigan Technological University, Houghton, MI, USA
Abstract: Quantile regression has gained increasing popularity because it provides richer information than ordinary mean regression, and variable selection plays an important role in building quantile regression models, since choosing an appropriate subset of predictors improves prediction accuracy. Unlike traditional quantile regression, we treat the quantile as an unknown parameter and estimate it jointly with the other regression coefficients. In particular, we adopt the Bayesian adaptive Lasso for maximum entropy quantile regression, placing a flat prior on the quantile parameter to reflect the lack of prior information about it. The proposed method not only identifies which quantile is the most probable among all candidates, but also reflects the internal structure of the data through the estimated quantile. We develop an efficient Gibbs sampler and show through simulation studies and a real data analysis that the proposed method outperforms both the Bayesian adaptive Lasso and the Bayesian Lasso.
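
To make the idea of treating the quantile as an unknown parameter concrete, the following is a minimal illustrative sketch in Python, not the authors' Bayesian adaptive Lasso or Gibbs sampler: it writes the asymmetric Laplace log-likelihood for a linear quantile regression and maximizes it jointly over the coefficients, the scale, and the quantile tau. The variable names and simulated data are hypothetical.

# Illustrative sketch: asymmetric Laplace likelihood for linear quantile
# regression with the quantile tau treated as a free parameter and estimated
# jointly with the coefficients (not the authors' Bayesian method).
import numpy as np
from scipy.optimize import minimize

def check_loss(u, tau):
    # Quantile check loss rho_tau(u) = u * (tau - I(u < 0)).
    return u * (tau - (u < 0))

def neg_loglik(params, X, y):
    # params = (beta_1..beta_p, log_sigma, logit_tau).
    p = X.shape[1]
    beta = params[:p]
    sigma = np.exp(params[p])                    # scale > 0
    tau = 1.0 / (1.0 + np.exp(-params[p + 1]))   # quantile in (0, 1)
    resid = y - X @ beta
    # log f(y) = log(tau * (1 - tau) / sigma) - rho_tau(resid / sigma)
    ll = np.log(tau * (1 - tau) / sigma) - check_loss(resid / sigma, tau)
    return -ll.sum()

# Hypothetical simulated data: intercept plus two predictors, heavy-tailed noise.
rng = np.random.default_rng(0)
n, p = 200, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
y = X @ np.array([1.0, 2.0, 0.0]) + rng.standard_t(df=3, size=n)

res = minimize(neg_loglik, x0=np.zeros(p + 2), args=(X, y),
               method="Nelder-Mead", options={"maxiter": 5000})
beta_hat = res.x[:p]
tau_hat = 1.0 / (1.0 + np.exp(-res.x[p + 1]))
print("coefficients:", beta_hat.round(2), "estimated quantile tau:", round(tau_hat, 2))

In the paper this joint estimation is carried out in a Bayesian framework, with adaptive Lasso shrinkage priors on the coefficients, a flat prior on tau, and posterior sampling via a Gibbs sampler rather than direct likelihood maximization.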
Keywords: Maximum entropy quantile regression  Bayesian Lasso  Bayesian adaptive Lasso  most probable model  asymmetric Laplace distribution  Gibbs sampler