Maximum-likelihood estimation of the random-clumped multinomial model as a prototype problem for large-scale statistical computing
Authors: Andrew M. Raim, Matthias K. Gobbert, Nagaraj K. Neerchal, Jorge G. Morel
Affiliation: Department of Mathematics and Statistics, University of Maryland Baltimore County, Baltimore, MD, USA (araim1@umbc.edu); Department of Mathematics and Statistics, University of Maryland Baltimore County, Baltimore, MD, USA; Biometrics and Statistical Sciences Department, Procter & Gamble Company, Cincinnati, OH, USA
Abstract:
Numerical methods are needed to obtain maximum-likelihood estimates (MLEs) in many problems. Computation time can be an issue for some likelihoods even with modern computing power. We consider one such problem, in which the assumed model is a random-clumped multinomial distribution. We compute MLEs for this model in parallel using the Toolkit for Advanced Optimization (TAO) software library. The computations are performed on a distributed-memory cluster with a low-latency interconnect. We demonstrate that, for larger problems, increasing the number of processes significantly reduces wall-clock time. An illustrative example shows how parallel MLE computation can be useful in a large data analysis. Our experience with this direct numerical approach suggests that more substantial gains may be obtained by exploiting the specific structure of the random-clumped model.
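The random-clumped multinomial of Morel and Nagaraj can be viewed as a finite mixture of ordinary multinomials in which component j places extra probability mass rho on category j: the count vector follows Multinomial(m, rho*e_j + (1-rho)*pi) with mixing weight pi_j. The sketch below is a minimal serial illustration in Python of maximizing such a likelihood numerically. It is not the paper's TAO-based parallel implementation; the parameterization, function names, starting values, and simulated data are assumptions made only for this example.

# Minimal serial sketch (not the paper's TAO-based parallel code) of maximum-likelihood
# estimation for a random-clumped multinomial, written as a finite mixture of
# multinomials: component j has cell probabilities rho*e_j + (1-rho)*pi and weight pi_j.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import multinomial

def rcm_neg_loglik(theta, X, m):
    """Negative log-likelihood; theta = (pi_1, ..., pi_{k-1}, rho)."""
    k = X.shape[1]
    pi = np.append(theta[:k - 1], 1.0 - np.sum(theta[:k - 1]))
    rho = theta[-1]
    if np.any(pi <= 0.0) or not (0.0 < rho < 1.0):
        return np.inf                      # reject parameters outside the valid region
    lik = np.zeros(X.shape[0])
    for j in range(k):
        p_j = (1.0 - rho) * pi             # clump extra probability rho onto cell j
        p_j[j] += rho
        lik += pi[j] * multinomial.pmf(X, n=m, p=p_j)
    return -np.sum(np.log(lik))

# Illustrative simulated data: N observations, m trials each, k = 3 categories
# (all values below are assumptions for the example).
rng = np.random.default_rng(0)
m, N = 20, 500
pi_true, rho_true = np.array([0.5, 0.3, 0.2]), 0.4
comp = rng.choice(3, size=N, p=pi_true)                       # latent mixture component
P = (1.0 - rho_true) * pi_true + rho_true * np.eye(3)[comp]   # per-observation cell probabilities
X = np.array([rng.multinomial(m, p) for p in P])

theta0 = np.array([1/3, 1/3, 0.5])                            # starting values (pi_1, pi_2, rho)
res = minimize(rcm_neg_loglik, theta0, args=(X, m), method="Nelder-Mead")
print("MLE of (pi_1, pi_2, rho):", res.x)

A derivative-free Nelder-Mead search is used here only to keep the sketch short; the paper's approach instead evaluates and optimizes the likelihood in parallel with the TAO library on a distributed-memory cluster.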
Keywords: parallel computing; maximum-likelihood estimation; mixture distribution; multinomial