New aspects of Bregman divergence in regression and classification with parametric and nonparametric estimation
Authors: Chunming Zhang, Yuan Jiang, Zuofeng Shang
Institution: Department of Statistics, University of Wisconsin, Madison, WI 53706, USA
Abstract: In statistical learning, regression and classification concern different types of output variables, and predictive accuracy is quantified by different loss functions. This article explores new aspects of Bregman divergence (BD), a notion that unifies nearly all of the commonly used loss functions in regression and classification. The authors investigate the duality between BD and its generating function. Under the framework of BD, they further establish the asymptotic consistency and normality of parametric and nonparametric regression estimators, derive the lower bound of their asymptotic covariance matrices, and demonstrate the role that parametric and nonparametric regression estimation plays in the performance of classification procedures and related machine learning techniques. These theoretical results and new numerical evidence show that the choice of loss function affects estimation procedures, whereas it has an asymptotically negligible impact on classification performance. Applications of BD to statistical model building and selection with non-Gaussian responses are also illustrated. The Canadian Journal of Statistics 37: 119-139; 2009 © 2009 Statistical Society of Canada
Keywords: Asymptotic normality; Bayes optimal rule; consistency; local polynomial regression; loss function; prediction error
MSC 2000: Primary 62F12, 62G20; secondary 62E20, 60F99
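
As a point of reference for the abstract's claim that BD unifies common loss functions, here is a minimal sketch using the standard convex-generator convention (the paper's own sign convention and notation may differ): for a differentiable convex generating function $\phi$, the Bregman divergence between an observation $y$ and a prediction $\mu$ is

\[
  D_\phi(y, \mu) = \phi(y) - \phi(\mu) - (y - \mu)\,\phi'(\mu).
\]

Taking $\phi(t) = t^2$ recovers the squared-error loss,

\[
  D_\phi(y, \mu) = y^2 - \mu^2 - 2\mu(y - \mu) = (y - \mu)^2,
\]

while $\phi(t) = t\log t - t$ gives $D_\phi(y, \mu) = y\log(y/\mu) - (y - \mu)$, half the Poisson unit deviance, illustrating how different choices of the generating function yield different familiar losses for regression and classification.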