An identity for the Fisher information and Mahalanobis distance
Authors: Abram Kagan, Bing Li
Institutions: 1. Department of Mathematics, University of Maryland, College Park, MD 20742, USA; 2. Department of Statistics, Pennsylvania State University, University Park, PA 16802, USA
Abstract: Consider a mixture problem consisting of k classes. Suppose we observe an s-dimensional random vector X whose distribution is specified by the relations P(X ∈ A | Y = i) = P_i(A), where Y is an unobserved class identifier defined on {1, …, k} with distribution P(Y = i) = p_i. Assuming the distributions P_i have a common covariance matrix, elegant identities are presented that connect the matrix of Fisher information in Y on the parameters p_1, …, p_k, the matrix of linear information in X, and the Mahalanobis distances between the pairs of P_i's. Since the parameters are not free, the information matrices are singular and the technique of generalized inverses is used. A matrix extension of the Mahalanobis distance and its invariant forms are introduced that are of interest in their own right. In terms of parameter estimation, the results provide an upper bound, independent of the parameter, for the loss of accuracy incurred by estimating p_1, …, p_k from a sample of X's, as compared with the ideal estimator based on a random sample of Y's.
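A minimal numerical sketch may help fix the quantities the abstract refers to. It assumes Gaussian class-conditional distributions P_i with a common covariance matrix and uses a standard singular form of the multinomial Fisher information in Y, diag(1/p) − 11′; these concrete choices, and all variable names, are illustrative assumptions rather than the paper's construction. The sketch computes the pairwise Mahalanobis distances between the classes and a Moore–Penrose inverse of the singular information matrix, the two ingredients the paper's identities connect.

# Illustrative sketch (not the paper's derivation): class probabilities,
# pairwise Mahalanobis distances under a common covariance, and the
# Moore-Penrose inverse of a singular Fisher information matrix.
import numpy as np

k, s = 3, 2                                   # number of classes, dimension of X
p = np.array([0.5, 0.3, 0.2])                 # class probabilities P(Y = i) = p_i
mu = np.array([[0.0, 0.0],                    # assumed class means, one row per class
               [1.0, 0.5],
               [2.0, 1.5]])
Sigma = np.array([[1.0, 0.3],                 # assumed common covariance matrix of the P_i
                  [0.3, 2.0]])

# Pairwise squared Mahalanobis distances between the class distributions:
# D[i, j] = (mu_i - mu_j)' Sigma^{-1} (mu_i - mu_j).
D = np.zeros((k, k))
for i in range(k):
    for j in range(k):
        diff = mu[i] - mu[j]
        D[i, j] = diff @ np.linalg.solve(Sigma, diff)

# A standard singular form of the Fisher information in Y on (p_1, ..., p_k):
# the covariance of the centered score, diag(1/p) - 11'. Its null space
# contains p because the parameters are constrained to sum to one.
I_Y = np.diag(1.0 / p) - np.ones((k, k))
I_Y_pinv = np.linalg.pinv(I_Y)                # Moore-Penrose (generalized) inverse

print("pairwise squared Mahalanobis distances:\n", D)
print("rank of I_Y:", np.linalg.matrix_rank(I_Y), "out of k =", k)

Running the sketch shows I_Y has rank k − 1, which is why the paper works with generalized inverses rather than ordinary matrix inverses when relating it to the linear information in X.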
Keywords: Categorical random variables; Mixture models; Moore–Penrose inverse
This article is indexed in ScienceDirect and other databases.