Flexible clustering via hidden hierarchical Dirichlet priors
Authors: Antonio Lijoi, Igor Prünster, Giovanni Rebaudo
Institution: 1. Department of Decision Sciences and BIDSA, Bocconi University, Milan, Italy; 2. Department of Statistics and Data Sciences, University of Texas at Austin, Austin, Texas, USA
Abstract: The Bayesian approach to inference stands out for naturally allowing the borrowing of information across heterogeneous populations, with different samples possibly sharing the same distribution. A popular Bayesian nonparametric model for clustering probability distributions is the nested Dirichlet process, which, however, has the drawback of grouping distributions into a single cluster when ties are observed across samples. With the goal of achieving a flexible and effective clustering method for both samples and observations, we investigate a nonparametric prior that arises as the composition of two different discrete random structures and derive a closed-form expression for the induced distribution of the random partition, the fundamental tool regulating the clustering behavior of the model. On the one hand, this allows us to gain deeper insight into the theoretical properties of the model; on the other hand, it yields an MCMC algorithm for evaluating Bayesian inferences of interest. Moreover, we single out limitations of this algorithm when working with more than two populations and, consequently, devise an alternative, more efficient sampling scheme which, as a by-product, allows testing homogeneity between different populations. Finally, we compare the proposed approach with the nested Dirichlet process and provide illustrative examples on both synthetic and real data.
Keywords: Bayesian nonparametrics; clustering; dependent random partitions; hierarchical Dirichlet process; mixture models; nested Dirichlet process; vectors of random probabilities
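
Illustrative note: the abstract refers to a prior built as the composition of two discrete random structures, and the keywords include the hierarchical Dirichlet process. As a purely illustrative aid, the sketch below simulates the standard Chinese restaurant franchise representation of a hierarchical Dirichlet process, showing how observations from different samples can end up sharing cluster (dish) labels. This is not the sampler developed in the paper; the concentration parameters alpha0 and gamma follow common HDP notation and are assumptions for the sketch, not the authors' specification.

    import random

    def crf_sample(group_sizes, alpha0=1.0, gamma=1.0, seed=0):
        """Simulate cluster labels via the Chinese restaurant franchise
        (hierarchical Dirichlet process). Returns, for each group, the
        list of dish (cluster) labels assigned to its observations.
        Note: an illustrative sketch, not the paper's algorithm."""
        rng = random.Random(seed)
        m = []                      # m[k] = number of tables serving dish k, across all groups
        labels = []
        for n_j in group_sizes:
            tables = []             # per-table customer counts in this group
            table_dish = []         # dish index served at each table
            group_labels = []
            for _ in range(n_j):
                # existing tables chosen proportionally to occupancy, a new table prop. to alpha0
                weights = tables + [alpha0]
                t = rng.choices(range(len(weights)), weights=weights)[0]
                if t == len(tables):            # new table: draw its dish
                    dish_weights = m + [gamma]  # existing dishes prop. to m_k, new dish prop. to gamma
                    k = rng.choices(range(len(dish_weights)), weights=dish_weights)[0]
                    if k == len(m):
                        m.append(0)
                    m[k] += 1
                    tables.append(0)
                    table_dish.append(k)
                tables[t] += 1
                group_labels.append(table_dish[t])
            labels.append(group_labels)
        return labels

    # Two heterogeneous samples: shared dish labels indicate observations
    # clustered together across groups (borrowing of information).
    print(crf_sample([10, 10], alpha0=1.0, gamma=1.5))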