Abstract: Mutual information, which can be expressed as a Kullback–Leibler divergence between a joint density and the product of its marginal densities, can be viewed as a measure of multivariate association in a random vector. Its definition involves the joint density as well as the marginal densities. We focus on a representation of mutual information in terms of copula densities, which is therefore independent of the marginal distributions. This representation yields an approach to estimating mutual information that differs from the original definition, since only the copula density has to be estimated. We review analytical properties and examples for selected distributions and discuss methods for the nonparametric estimation of copula densities, and hence of mutual information, from a sample. Based on a simulation study, we compare the performance of these estimators with respect to bias, standard deviation, and root mean squared error. The Gauss and the Frank copula serve as examples.
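The following is a minimal sketch, not the estimator studied in the paper, illustrating the copula representation for the Gauss copula: the analytical value MI = -0.5 log(1 - rho^2) is compared with a plug-in estimate of E[log c(U, V)] obtained from pseudo-observations (ranks) and a simple histogram estimate of the copula density. The sample size, bin count, and rho are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
rho, n, bins = 0.6, 5000, 20

# Analytical mutual information of the bivariate Gauss copula.
mi_true = -0.5 * np.log(1.0 - rho**2)

# Draw from a bivariate normal; since the copula (and hence MI) is
# margin-free, we can work directly with the ranks of the sample.
x = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)
u = (np.argsort(np.argsort(x[:, 0])) + 0.5) / n   # pseudo-observations in (0,1)
v = (np.argsort(np.argsort(x[:, 1])) + 0.5) / n

# Histogram estimate of the copula density on a uniform grid over [0,1]^2.
counts, _, _ = np.histogram2d(u, v, bins=bins, range=[[0, 1], [0, 1]])
c_hat = counts / counts.sum() * bins**2            # density per cell of area 1/bins^2

# Plug-in estimate of E[log c(U, V)]: average log copula density over the
# occupied cells, weighted by the empirical cell probabilities.
p = counts / counts.sum()
mask = p > 0
mi_hat = np.sum(p[mask] * np.log(c_hat[mask]))

print(f"analytical MI: {mi_true:.4f}, histogram estimate: {mi_hat:.4f}")
```

Because the estimate depends only on the ranks, the same code applies unchanged whatever the marginal distributions of the data are; only the copula density enters the computation.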