On MCMC sampling in hierarchical longitudinal models
Authors: Siddhartha Chib, Bradley P. Carlin
Affiliation: (1) Department of Dairy Science, University of Wisconsin, Madison, WI 53706, USA; (2) Department of Animal Sciences, University of Wisconsin, Madison, WI 53706, USA; (3) Department of Biostatistics and Medical Informatics, University of Wisconsin, Madison, WI 53706, USA; (4) Department of Animal and Aquacultural Sciences, Norwegian University of Life Sciences, 1432 Ås, Norway
Abstract: Markov chain Monte Carlo (MCMC) algorithms have revolutionized Bayesian practice. In their simplest form (i.e., when parameters are updated one at a time) they are, however, often slow to converge when applied to high-dimensional statistical models. A remedy for this problem is to block the parameters into groups, which are then updated simultaneously using either a Gibbs or a Metropolis-Hastings step. In this paper we construct several (partially and fully blocked) MCMC algorithms that minimize the autocorrelation in MCMC samples arising from important classes of longitudinal data models. We exploit an identity used by Chib (1995) in the context of Bayes factor computation to show how the parameters in a general linear mixed model may be updated in a single block, improving convergence and producing essentially independent draws from the posterior of the parameters of interest. We also investigate the value of blocking in non-Gaussian mixed models, as well as in a class of longitudinal models for binary response data. We illustrate the approaches in detail with three real-data examples.
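The following is a minimal sketch (not the authors' code) of the core idea the abstract describes: when posterior components are strongly correlated, updating parameters one at a time yields highly autocorrelated draws, while drawing the whole block jointly gives essentially independent samples. The bivariate normal target and the correlation value are illustrative assumptions, not from the paper.

```python
# Blocked vs. single-site Gibbs sampling from a highly correlated
# bivariate normal target, illustrating why blocking reduces
# autocorrelation in MCMC draws.
import numpy as np

rng = np.random.default_rng(0)
rho = 0.99                       # strong posterior correlation (illustrative)
Sigma = np.array([[1.0, rho], [rho, 1.0]])

def gibbs_single_site(n_iter):
    """Update theta1 | theta2 and theta2 | theta1 one at a time."""
    draws = np.zeros((n_iter, 2))
    th = np.zeros(2)
    sd = np.sqrt(1.0 - rho**2)   # conditional standard deviation
    for t in range(n_iter):
        th[0] = rng.normal(rho * th[1], sd)
        th[1] = rng.normal(rho * th[0], sd)
        draws[t] = th
    return draws

def gibbs_blocked(n_iter):
    """Draw (theta1, theta2) jointly: i.i.d. samples from the target."""
    L = np.linalg.cholesky(Sigma)
    return rng.standard_normal((n_iter, 2)) @ L.T

def lag1_autocorr(x):
    """Sample lag-1 autocorrelation of a 1-D chain."""
    x = x - x.mean()
    return float(np.dot(x[:-1], x[1:]) / np.dot(x, x))

single = gibbs_single_site(20000)
blocked = gibbs_blocked(20000)
print("lag-1 autocorrelation, single-site:", lag1_autocorr(single[:, 0]))
print("lag-1 autocorrelation, blocked:    ", lag1_autocorr(blocked[:, 0]))
```

With rho near 1, the single-site chain's lag-1 autocorrelation is close to rho squared (about 0.98 here), whereas the blocked sampler's is near zero; this is the convergence-acceleration effect the paper pursues for linear mixed models.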
Keywords: blocking; correlated binary data; convergence acceleration; Gibbs sampler; Metropolis-Hastings algorithm; linear mixed model; panel data; random effects
This article is indexed in SpringerLink and other databases.