Mutual information as a measure of multivariate association: analytical properties and statistical estimation
Abstract: Mutual information (the Kullback–Leibler divergence between a joint distribution and the product of its marginals) can be viewed as a measure of multivariate association in a random vector. Its definition involves the joint density as well as the marginal densities. We focus on a representation of mutual information in terms of the copula density, which is therefore independent of the marginal distributions. This representation suggests a different approach to estimating mutual information than the original definition does, since only the copula density has to be estimated. We review analytical properties and examples for selected distributions, and we discuss methods for the nonparametric estimation of copula densities and hence of mutual information from a sample. Based on a simulation study, we compare the performance of these estimators with respect to bias, standard deviation, and root mean squared error. The Gauss and the Frank copula are considered as examples.
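To make the estimation approach concrete: under the copula representation, the mutual information equals the integral of c(u,v) log c(u,v) over the unit square, where c is the copula density, so an estimate of c on rank-transformed data yields an estimate of the mutual information. The following sketch is not the authors' code; it implements the simplest of the estimators named in the keywords, a histogram plug-in estimator, and checks it against the closed-form value -½ log(1-ρ²) for a bivariate Gauss copula in a small Monte Carlo run. The sample size, bin count, copula parameter, and replication count are illustrative choices.

```python
# A minimal sketch, assuming a bivariate Gauss copula with parameter rho;
# n, k, rho, and reps are illustrative, not values from the paper.
import numpy as np
from scipy.stats import norm, rankdata

def gauss_copula_sample(n, rho, rng):
    """Draw n points from a bivariate Gauss copula with parameter rho."""
    z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
    return norm.cdf(z)  # probability-integral transform onto [0,1]^2

def mi_histogram(x, y, k=10):
    """Histogram plug-in estimate of mutual information from a sample.

    Ranks make the estimate independent of the marginals (the copula step).
    A k-by-k histogram on [0,1]^2 estimates the copula density c, and the
    plug-in integral sum(c log c * bin_area) reduces to sum(p log(p k^2)).
    """
    n = len(x)
    u = rankdata(x) / (n + 1.0)   # pseudo-observations
    v = rankdata(y) / (n + 1.0)
    counts, _, _ = np.histogram2d(u, v, bins=k, range=[[0, 1], [0, 1]])
    p = counts / n
    nz = p > 0                    # empty bins contribute zero to the sum
    return float(np.sum(p[nz] * np.log(p[nz] * k * k)))

def mi_gauss(rho):
    """Analytical mutual information of the bivariate Gauss copula."""
    return -0.5 * np.log(1.0 - rho ** 2)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    rho, n, reps = 0.5, 1000, 200
    true_mi = mi_gauss(rho)
    est = np.array([mi_histogram(*gauss_copula_sample(n, rho, rng).T)
                    for _ in range(reps)])
    bias = est.mean() - true_mi
    sd = est.std(ddof=1)
    rmse = np.sqrt(bias ** 2 + sd ** 2)
    print(f"true MI = {true_mi:.4f}")
    print(f"bias = {bias:.4f}, sd = {sd:.4f}, rmse = {rmse:.4f}")
```

The histogram step is only the crudest of the estimators listed in the keywords; a Bernstein or Beta-kernel estimator of the copula density would replace it while the rank transform and the plug-in integration stay the same.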
Keywords: mutual information; Kullback–Leibler divergence; copula density; nonparametric estimation; histogram; nearest neighbour estimator; Bernstein estimator; Beta kernel; Monte Carlo simulation