Hellinger distance as a penalized log likelihood
Authors: Ian R. Harris, Ayanendranath Basu
Institution: Department of Mathematics, University of Texas at Austin, Austin, TX 78712, USA
Abstract: The present paper studies the minimum Hellinger distance estimator by recasting it as the maximum likelihood estimator in a data-driven modification of the model density. In the process, the Hellinger distance itself is expressed as a penalized log likelihood function. The penalty is the sum of the model probabilities over the non-observed values of the sample space. A comparison of the modified model density with the original data provides insights into the robustness of the minimum Hellinger distance estimator. Adjusting the amount of penalty leads to a class of minimum penalized Hellinger distance estimators, some members of which perform substantially better than the minimum Hellinger distance estimator at the model for small samples, without compromising the robustness properties of the latter.
Keywords: Hellinger distance; Kullback-Leibler divergence; penalized log likelihood; robustness
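
The algebraic split behind the penalty term can be sketched directly (the notation below is mine, not quoted from the paper). For a discrete sample space, let d(x) denote the empirical proportions and f_theta(x) the model density. With the conventional factor of 2, the Hellinger distance separates exactly into a part over the observed cells and the model mass on the non-observed cells:

HD(d, f_\theta) = 2 \sum_{x} \big( \sqrt{d(x)} - \sqrt{f_\theta(x)} \big)^2
                = 2 \sum_{x:\, d(x) > 0} \big( \sqrt{d(x)} - \sqrt{f_\theta(x)} \big)^2
                  + 2 \sum_{x:\, d(x) = 0} f_\theta(x),

since (\sqrt{0} - \sqrt{f_\theta(x)})^2 = f_\theta(x). The second sum is the penalty referred to in the abstract; reweighting it by a tuning constant h gives a penalized variant, with h = 1 recovering the ordinary Hellinger distance and smaller h down-weighting the empty cells. The exact recasting as a penalized log likelihood and the data-driven modified density are developed in the paper itself.

A minimal numerical sketch of the resulting estimator, assuming a Poisson model and the decomposition above (the function name phd, the tuning constant h, and the truncation of the support are illustrative choices, not the paper's notation):

    import numpy as np
    from scipy.optimize import minimize_scalar
    from scipy.stats import poisson

    def phd(lam, counts, h=1.0, support_max=200):
        """Penalized Hellinger distance between the empirical proportions of
        `counts` and a Poisson(lam) model, with support truncated at support_max."""
        x = np.arange(support_max + 1)
        d = np.bincount(counts, minlength=support_max + 1)[: support_max + 1] / len(counts)
        f = poisson.pmf(x, lam)
        obs = d > 0
        fit = 2 * np.sum((np.sqrt(d[obs]) - np.sqrt(f[obs])) ** 2)
        penalty = 2 * h * np.sum(f[~obs])  # model mass on non-observed cells
        return fit + penalty

    rng = np.random.default_rng(0)
    counts = np.append(rng.poisson(3, size=50), 40)  # Poisson(3) sample plus one gross outlier

    mle = counts.mean()  # sample mean is the Poisson MLE
    mhd = minimize_scalar(phd, bounds=(0.1, 50), args=(counts, 1.0), method="bounded").x
    print(f"MLE: {mle:.2f}   minimum Hellinger distance estimate: {mhd:.2f}")

On clean Poisson data the two estimates are typically close; the single gross outlier pulls the sample mean upward while the minimum Hellinger distance estimate stays near the true rate, which is the robustness behaviour the abstract describes.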