Selection rules based on divergences
Authors: A. Berlinet
Affiliation: I3M, UMR CNRS 5149, University of Montpellier II, Place Bataillon, 34095 Montpellier, France
Abstract: This paper deals with a special adaptive estimation problem: how to select, for each set of i.i.d. data X1, …, Xn, the better of two given estimates of the data-generating probability density. This problem was studied by Devroye and Lugosi [Combinatorial Methods in Density Estimation, Springer, Berlin, 2001], who proposed a feasible suboptimal selection (called the Scheffé selection) as an alternative to the optimal but nonfeasible selection that minimizes the L1-error. In many typical situations, the L1-error of the Scheffé selection was shown to tend to zero, as n→∞, as fast as the L1-error of the optimal selection. This asymptotic result rests on an inequality between the total variation errors of the Scheffé and optimal selections. The present paper extends this inequality to the class of φ-divergence errors, which contains the L1-error as a special case. The first extension compares the φ-divergence errors of the Scheffé and optimal selections of Devroye and Lugosi. The second extension deals with a class of generalized Scheffé selections adapted to φ-divergence error criteria and reducing to the classical Scheffé selection under the L1-criterion; it compares the φ-divergence errors of these feasible selections with those of the optimal nonfeasible selections minimizing the φ-divergence errors. Both extensions are motivated and illustrated by examples.
Keywords: nonparametric estimation; divergence error criteria; optimal and suboptimal selections; Scheffé selection; divergence selections
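
For readers unfamiliar with the selection rule discussed above, the following minimal Python sketch illustrates the classical Scheffé selection of Devroye and Lugosi in the L1 case: it compares the mass that each candidate estimate places on the Scheffé set A = {x : f1(x) > f2(x)} with the empirical measure of A, and keeps the estimate whose mass is closer. The function name scheffe_select, the grid-based numerical integration, and the example densities are illustrative assumptions, not taken from the paper.

import numpy as np

def scheffe_select(f1, f2, data, grid):
    """Classical Scheffé selection between two density estimates f1, f2,
    given i.i.d. data X1, ..., Xn.

    Keeps the estimate whose probability mass on the Scheffé set
    A = {x : f1(x) > f2(x)} is closest to the empirical mass of A.
    The uniform `grid` over the support is an assumption used here
    purely for numerical integration.
    """
    on_A = f1(grid) > f2(grid)            # indicator of the Scheffé set A
    dx = grid[1] - grid[0]                # uniform grid spacing
    mass1 = np.sum(f1(grid)[on_A]) * dx   # Riemann approximation of the f1-mass of A
    mass2 = np.sum(f2(grid)[on_A]) * dx   # Riemann approximation of the f2-mass of A
    mu_n = np.mean(f1(data) > f2(data))   # empirical measure of A
    return f1 if abs(mass1 - mu_n) <= abs(mass2 - mu_n) else f2

# Illustration with hypothetical candidates: data drawn from N(0,1),
# f1 the N(0,1) density, f2 the N(1,1) density, so f1 should be selected.
rng = np.random.default_rng(0)
data = rng.normal(0.0, 1.0, size=500)
f1 = lambda x: np.exp(-0.5 * x**2) / np.sqrt(2 * np.pi)
f2 = lambda x: np.exp(-0.5 * (x - 1.0)**2) / np.sqrt(2 * np.pi)
grid = np.linspace(-8.0, 9.0, 4001)
print("selected:", "f1" if scheffe_select(f1, f2, data, grid) is f1 else "f2")

The Riemann sum on a uniform grid is a crude but adequate stand-in for the integrals of f1 and f2 over A; any quadrature covering the support of the estimates would serve equally well.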