Abstract: Set compound estimation has been studied for nearly half a century. This paper explores, for the first time, set compound estimation under entropy (Kullback-Leibler information) loss for a k-dimensional standard exponential family with a compact parameter space. It investigates in detail the entropy loss and its properties for the exponential family. Asymptotically optimal set compound estimators with rate O(n^{-1/2}) under this loss are established for certain discrete exponential families by using power series, representing the Bayes estimators in terms of a mixture density, and applying the Singh-Datta Lemma. The Poisson and negative binomial families and a two-dimensional model serve as examples.