On convergence of the EM algorithm and the Gibbs sampler
Authors: Sujit K. Sahu, Gareth O. Roberts
Affiliation: (1) Centre de Geostatistique, Ecole des Mines de Paris, 35 rue Saint Honoré, 77305 Fontainebleau, France; (2) Energy & Environment Subdivision, Geological Survey of Canada, Calgary
Abstract:
In this article we investigate the relationship between the EM algorithm and the Gibbs sampler. We show that, under a Gaussian approximation, the rate of convergence of the Gibbs sampler is equal to that of the corresponding EM-type algorithm. This helps in implementing either algorithm, since improvement strategies for one can be transported directly to the other. In particular, by running the EM algorithm we can estimate approximately how many iterations the Gibbs sampler needs to converge. We also show that, under certain conditions, the EM algorithm used for finding maximum likelihood estimates can be slower to converge than the corresponding Gibbs sampler used for Bayesian inference. We illustrate our results with a number of realistic examples, all based on generalized linear mixed models.
Keywords: Gaussian distribution; Generalized linear mixed models; Markov chain Monte Carlo; Parameterization; Rate of convergence
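
The central claim of the abstract, that under a Gaussian approximation the Gibbs sampler and the corresponding EM-type algorithm converge at the same rate, can be illustrated on a toy example. The sketch below is not from the paper; the model, variable names, and parameter values are illustrative assumptions. It uses the simple hierarchical model y_i | z_i ~ N(z_i, se2), z_i | mu ~ N(mu, sz2) with a flat prior on mu. Treating z as missing data, the EM update for mu contracts towards the MLE at rate w = se2 / (se2 + sz2), and the two-block Gibbs sampler for (z, mu) yields an AR(1)-like chain for mu whose lag-one autocorrelation is approximately the same w, so the empirical EM rate predicts how slowly the Gibbs chain mixes.

# Minimal sketch (illustrative assumptions, not the paper's examples):
# y_i | z_i ~ N(z_i, se2), z_i | mu ~ N(mu, sz2), flat prior on mu.
# Both the EM algorithm (z treated as missing data) and the two-block
# Gibbs sampler for (z, mu) contract at rate w = se2 / (se2 + sz2).
import numpy as np

rng = np.random.default_rng(0)
n, mu_true, sz2, se2 = 500, 2.0, 1.0, 4.0
w = se2 / (se2 + sz2)                       # theoretical rate of convergence

z = rng.normal(mu_true, np.sqrt(sz2), n)
y = rng.normal(z, np.sqrt(se2))
ybar = y.mean()                             # marginal MLE of mu is ybar

# EM: the mu update reduces to mu_new = w * mu_old + (1 - w) * ybar
mu, errs = 10.0, []
for _ in range(30):
    ez = w * mu + (1 - w) * y               # E-step: E[z_i | y_i, mu]
    mu = ez.mean()                          # M-step: maximise over mu
    errs.append(abs(mu - ybar))
em_rate = errs[-1] / errs[-2]               # successive error ratio, approx w

# Gibbs: alternate draws of z | mu, y and mu | z
draws, mu_g = [], 10.0
cond_var = sz2 * se2 / (sz2 + se2)
for _ in range(20000):
    z_g = rng.normal(w * mu_g + (1 - w) * y, np.sqrt(cond_var))
    mu_g = rng.normal(z_g.mean(), np.sqrt(sz2 / n))
    draws.append(mu_g)
mu_s = np.array(draws[2000:])               # drop burn-in
lag1 = np.corrcoef(mu_s[:-1], mu_s[1:])[0, 1]

print(f"theoretical rate w    = {w:.3f}")
print(f"empirical EM rate     = {em_rate:.3f}")
print(f"Gibbs lag-1 autocorr  = {lag1:.3f}")

With these assumed parameter values w = 0.8, so the printed EM error ratio and the Gibbs lag-one autocorrelation should both come out near 0.8. This is the sense in which running the EM algorithm indicates roughly how many Gibbs iterations are needed for convergence.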
This article is indexed in SpringerLink and other databases.