Hellinger distance as a penalized log likelihood

Authors: Ian R. Harris, Ayanendranath Basu

Affiliation: Department of Mathematics, University of Texas at Austin, Austin, TX 78712, USA

Abstract: The present paper studies the minimum Hellinger distance estimator by recasting it as the maximum likelihood estimator under a data-driven modification of the model density. In the process, the Hellinger distance itself is expressed as a penalized log likelihood function, where the penalty is the sum of the model probabilities over the non-observed values of the sample space. A comparison of the modified model density with the original data provides insight into the robustness of the minimum Hellinger distance estimator. Adjusting the amount of penalty leads to a class of minimum penalized Hellinger distance estimators, some members of which perform substantially better than the minimum Hellinger distance estimator at the model for small samples, without compromising the robustness properties of the latter.
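To make the objects in the abstract concrete, here is a minimal numerical sketch, not the paper's derivation: it computes the minimum Hellinger distance estimate for a Poisson model by direct minimization of the squared Hellinger distance between the empirical relative frequencies and the model pmf, and separately evaluates the penalty term described above (the model probability mass on non-observed sample-space values). The Poisson family, the grid search, and the truncation point `x_max` are all illustrative assumptions.

```python
import numpy as np
from math import lgamma

# Illustrative data: a Poisson(3) sample (assumed setup, not from the paper).
rng = np.random.default_rng(0)
data = rng.poisson(3.0, size=50)

# Truncate the (countable) sample space at x_max for the sketch.
x_max = 30
xs = np.arange(x_max + 1)
counts = np.bincount(data, minlength=x_max + 1)
d = counts / counts.sum()                      # empirical relative frequencies d(x)
log_fact = np.array([lgamma(k + 1.0) for k in xs])

def pois_pmf(theta):
    # Poisson pmf on the truncated sample space, computed on the log scale.
    return np.exp(xs * np.log(theta) - theta - log_fact)

def hellinger_sq(theta):
    # Squared Hellinger distance between d and the Poisson(theta) pmf.
    f = pois_pmf(theta)
    return np.sum((np.sqrt(d) - np.sqrt(f)) ** 2)

def empty_cell_penalty(theta):
    # Model probability mass on the non-observed values of the sample space,
    # the penalty term referred to in the abstract.
    f = pois_pmf(theta)
    return f[counts == 0].sum()

# Crude grid search for the minimum Hellinger distance estimate.
grid = np.linspace(0.5, 10.0, 2000)
theta_mhd = grid[np.argmin([hellinger_sq(t) for t in grid])]
print(round(theta_mhd, 2), empty_cell_penalty(theta_mhd))
```

With a well-specified model and a moderate sample, the estimate lands near the true parameter and the empty-cell penalty is small; downweighting or inflating that penalty is what generates the class of minimum penalized Hellinger distance estimators studied in the paper.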

Keywords: Hellinger distance; Kullback-Leibler divergence; penalized log likelihood; robustness