Similar Articles
20 similar articles found.
1.
Full likelihood-based inference for modern population genetics data presents methodological and computational challenges. The problem is of considerable practical importance and has attracted recent attention, with the development of algorithms based on importance sampling (IS) and Markov chain Monte Carlo (MCMC) sampling. Here we introduce a new IS algorithm. The optimal proposal distribution for these problems can be characterized, and we exploit a detailed analysis of genealogical processes to develop a practicable approximation to it. We compare the new method with existing algorithms on a variety of genetic examples. Our approach substantially outperforms existing IS algorithms, with efficiency typically improved by several orders of magnitude. The new method also compares favourably with existing MCMC methods in some problems, and less favourably in others, suggesting that both IS and MCMC methods have a continuing role to play in this area. We offer insights into the relative advantages of each approach, and we discuss diagnostics in the IS framework.

2.
We present a versatile Monte Carlo method for estimating multidimensional integrals, with applications to rare-event probability estimation. The method fuses two distinct and popular Monte Carlo simulation methods—Markov chain Monte Carlo and importance sampling—into a single algorithm. We show that for some applied numerical examples the proposed Markov Chain importance sampling algorithm performs better than methods based solely on importance sampling or MCMC.

3.
Monte Carlo methods represent the de facto standard for approximating complicated integrals involving multidimensional target distributions. In order to generate random realizations from the target distribution, Monte Carlo techniques use simpler proposal probability densities to draw candidate samples. The performance of any such method is strictly related to the specification of the proposal distribution, such that unfortunate choices easily wreak havoc on the resulting estimators. In this work, we introduce a layered (i.e., hierarchical) procedure to generate samples employed within a Monte Carlo scheme. This approach ensures that an appropriate equivalent proposal density is always obtained automatically (thus eliminating the risk of a catastrophic performance), although at the expense of a moderate increase in the complexity. Furthermore, we provide a general unified importance sampling (IS) framework, where multiple proposal densities are employed and several IS schemes are introduced by applying the so-called deterministic mixture approach. Finally, given these schemes, we also propose a novel class of adaptive importance samplers using a population of proposals, where the adaptation is driven by independent parallel or interacting Markov chain Monte Carlo (MCMC) chains. The resulting algorithms efficiently combine the benefits of both IS and MCMC methods.
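To make the deterministic mixture idea concrete, the sketch below weights every draw by the average of all proposal densities rather than only the density that generated it, which is what protects the estimator when an individual proposal is poorly matched to the target. The one-dimensional target and the three Gaussian proposals are illustrative assumptions, not the setup of the paper.

```python
import numpy as np

# Minimal sketch of deterministic-mixture multiple importance sampling (DM-MIS).
# The Gaussian-mixture target and the three Gaussian proposals are assumptions.
rng = np.random.default_rng(0)

def target_pdf(x):
    # Unnormalised target: mixture of two Gaussians.
    return 0.3 * np.exp(-0.5 * (x + 2.0) ** 2) + 0.7 * np.exp(-0.5 * (x - 3.0) ** 2)

mus = np.array([-4.0, 0.0, 4.0])   # proposal means; each q_j is N(mu_j, sigma^2)
sigma = 2.0
n_per_proposal = 5000

samples, weights = [], []
for mu in mus:
    x = rng.normal(mu, sigma, size=n_per_proposal)
    # Deterministic-mixture weight: divide by the *average* of all proposal
    # densities at x, not only the density that actually generated x.
    mix_q = np.mean(
        [np.exp(-0.5 * ((x - m) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi)) for m in mus],
        axis=0,
    )
    samples.append(x)
    weights.append(target_pdf(x) / mix_q)

x = np.concatenate(samples)
w = np.concatenate(weights)
w /= w.sum()                      # self-normalise (target is unnormalised)
print("estimated target mean:", np.sum(w * x))
```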

4.
In this paper, we introduce a Bayesian analysis for the Block and Basu bivariate exponential distribution using Markov chain Monte Carlo (MCMC) methods, considering lifetimes in the presence of covariates and censored data. Posterior summaries of interest are obtained using the popular WinBUGS software. Numerical illustrations are given using two medical data sets: one on the recurrence times of infection for kidney patients and one on bone marrow transplantation for leukemia.

5.
In this paper, we propose a value-at-risk (VaR) estimation technique based on a new stochastic volatility model with leverage effect, nonconstant conditional mean and jumps. In order to estimate the model parameters and latent state variables, we integrate the particle filter and adaptive Markov chain Monte Carlo (MCMC) algorithms to develop a novel adaptive particle MCMC (A-PMCMC) algorithm. Comprehensive simulation experiments based on three stock indices and two foreign exchange time series show the effectiveness of the proposed A-PMCMC algorithm and the VaR estimation technique.

6.
We consider importance sampling (IS) type weighted estimators based on Markov chain Monte Carlo (MCMC) targeting an approximate marginal of the target distribution. In the context of Bayesian latent variable models, the MCMC typically operates on the hyperparameters, and the subsequent weighting may be based on IS or sequential Monte Carlo (SMC), but allows for multilevel techniques as well. The IS approach provides a natural alternative to delayed acceptance (DA) pseudo-marginal/particle MCMC, and has many advantages over DA, including straightforward parallelization and additional flexibility in MCMC implementation. We detail minimal conditions which ensure strong consistency of the suggested estimators, and provide central limit theorems with expressions for asymptotic variances. We demonstrate how our method can make use of SMC in the state-space model context, using Laplace approximations and time-discretized diffusions. Our experimental results are promising and show that the IS-type approach can provide substantial gains relative to an analogous DA scheme, and is often competitive even without parallelization.
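A minimal sketch of the IS-type weighting described above, under simplifying assumptions: a random-walk Metropolis chain targets a cheap approximate (unnormalised) density, and each retained state is then reweighted by the ratio of the exact to the approximate density before forming a self-normalised estimate. The two densities and the proposal scale are placeholders, not the latent variable models of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def log_target(theta):
    # "Exact" unnormalised log density (illustrative assumption).
    return -0.5 * theta ** 4 + theta ** 2

def log_approx(theta):
    # Cheap approximation used as the MCMC target (illustrative assumption).
    return -0.5 * theta ** 2

# Random-walk Metropolis chain targeting the approximation.
n_iter, step = 20000, 1.0
theta = 0.0
chain = np.empty(n_iter)
for i in range(n_iter):
    prop = theta + step * rng.normal()
    if np.log(rng.uniform()) < log_approx(prop) - log_approx(theta):
        theta = prop
    chain[i] = theta

# IS-type correction: weight each state by exact / approximate density.
logw = log_target(chain) - log_approx(chain)
w = np.exp(logw - logw.max())
w /= w.sum()
print("weighted estimate of E[theta^2]:", np.sum(w * chain ** 2))
```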

7.
Markov chain Monte Carlo (MCMC) is the most common method used in multiple imputation. However, it is biased when applied to the imputation of categorical variables. The literature has considered this problem only for binary variables, which have just two levels. In this article, we consider more general situations. We not only evaluate the bias associated with the imputation of categorical variables using the MCMC method, but also introduce a method to correct the bias. A simulation study is conducted and an application is provided to demonstrate the advantages of using the correction factors proposed in this article.

8.
In this paper, we use Markov chain Monte Carlo (MCMC) methods to estimate and compare stochastic production frontier models from a Bayesian perspective. We consider a number of competing models in terms of different production functions and the distribution of the asymmetric error term. All MCMC simulations are done using the package JAGS (Just Another Gibbs Sampler), a clone of the classic BUGS package that works closely with R, where all the statistical computations and graphics are done.

9.
10.
Frailty models are used in survival analysis to account for unobserved heterogeneity in individual risks of disease and death. To analyze bivariate data on related survival times (e.g., matched-pairs experiments, twin, or family data), shared frailty models have been suggested, and they are frequently used to model heterogeneity in survival analysis. The most common shared frailty model is one in which the hazard function is the product of a random factor (the frailty) and a baseline hazard function common to all individuals; certain assumptions are made about the baseline distribution and the distribution of the frailty. In this paper, we introduce shared gamma frailty models with reversed hazard rate. We introduce a Bayesian estimation procedure using the Markov chain Monte Carlo (MCMC) technique to estimate the parameters involved in the model. We present a simulation study to compare the true values of the parameters with the estimated values. Also, we apply the proposed model to the Australian twin data set.
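Written out, the multiplicative structure described above is the standard shared-frailty formulation below; the gamma parametrization in the comment is the usual mean-one choice, stated here as an assumption since the abstract does not give it, and the paper's reversed-hazard-rate version replaces the hazard with the reversed hazard.

```latex
% Conditional hazard of individual j in pair i, given the shared frailty Z_i;
% h_0 is the baseline hazard common to all individuals.
% Usual gamma choice (assumed): Z_i ~ Gamma(1/\theta, 1/\theta), so E[Z_i]=1, Var(Z_i)=\theta.
h_{ij}(t \mid Z_i) = Z_i \, h_0(t)
```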

11.
While Markov chain Monte Carlo (MCMC) methods are frequently used for difficult calculations in a wide range of scientific disciplines, they suffer from a serious limitation: their samples are not independent and identically distributed. Consequently, estimates of expectations are biased if the initial value of the chain is not drawn from the target distribution. Regenerative simulation provides an elegant solution to this problem. In this article, we propose a simple regenerative MCMC algorithm to generate variates from any distribution.

12.
We investigate the use of Markov Chain Monte Carlo (MCMC) methods to attack classical ciphers. MCMC has previously been used to break simple substitution ciphers. Here, we extend this approach to transposition ciphers and to substitution-plus-transposition ciphers. Our algorithms run quickly and perform fairly well even for key lengths as high as 40.
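The substitution-cipher case that this work builds on is usually attacked by scoring candidate decryptions with a bigram log-likelihood and exploring key space with Metropolis-accepted letter swaps; the sketch below follows that generic scheme under stated assumptions (the bigram table is a placeholder that would normally be estimated from a large reference text).

```python
import math
import random
import string

# Sketch of a Metropolis-style attack on a simple substitution cipher.
# `bigram_logprob` is a placeholder assumption: a dict mapping letter pairs
# (e.g. "TH") to log-probabilities estimated from reference English text.
ALPHABET = string.ascii_uppercase

def score(text, bigram_logprob):
    # Plaintext plausibility = sum of log-probabilities of successive letter pairs.
    return sum(bigram_logprob.get(text[i:i + 2], math.log(1e-6))
               for i in range(len(text) - 1))

def decrypt(ciphertext, key):
    # key[i] is the plaintext letter assigned to the i-th ciphertext letter.
    table = str.maketrans(ALPHABET, "".join(key))
    return ciphertext.translate(table)

def mcmc_attack(ciphertext, bigram_logprob, n_iter=20000, temperature=1.0):
    key = list(ALPHABET)
    random.shuffle(key)
    current = score(decrypt(ciphertext, key), bigram_logprob)
    best_key, best = list(key), current
    for _ in range(n_iter):
        i, j = random.sample(range(26), 2)        # propose: swap two key entries
        key[i], key[j] = key[j], key[i]
        proposed = score(decrypt(ciphertext, key), bigram_logprob)
        if math.log(random.random()) < (proposed - current) / temperature:
            current = proposed                     # accept the swap
            if current > best:
                best_key, best = list(key), current
        else:
            key[i], key[j] = key[j], key[i]        # reject: undo the swap
    return "".join(best_key)
```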

13.
Complex stochastic models, such as individual-based models, are becoming increasingly popular. However, this complexity can often mean that the likelihood is intractable, making parameter estimation difficult. One way of carrying out such estimation when the complex model is relatively quick to simulate from is approximate Bayesian computation (ABC). The rejection-ABC algorithm is not always efficient, so numerous other algorithms have been proposed. One such method is ABC with Markov chain Monte Carlo (ABC–MCMC). Unfortunately, for some models this method does not perform well, and alternatives have been proposed, including the fsMCMC algorithm (Neal and Huang, in: Scand J Stat 42:378–396, 2015), which explores the random-input space as well as the unknown model parameters. In this paper we extend the fsMCMC algorithm and take advantage of the joint parameter and random-input space in order to obtain better mixing of the Markov chain. We also introduce a Gibbs step that conditions on the currently accepted model and allows the parameters, as well as the random inputs, to move conditional on this accepted model. We show empirically that this improves the efficiency of the ABC–MCMC algorithm on a queuing model and an individual-based model of the group-living bird, the woodhoopoe.
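For readers unfamiliar with the baseline algorithm being extended, a bare-bones ABC-MCMC step accepts a proposed parameter only if the usual Metropolis check passes and the summary of data simulated under the proposal lies within a tolerance of the observed summary. The Gaussian toy model, summary statistic, and tolerance below are illustrative assumptions, not the queuing or woodhoopoe models of the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy setup (assumption): data are N(theta, 1), summary statistic = sample mean.
observed = rng.normal(2.0, 1.0, size=50)
obs_summary = observed.mean()

def simulate(theta, n=50):
    return rng.normal(theta, 1.0, size=n)

def log_prior(theta):
    return -0.5 * (theta / 10.0) ** 2            # vague N(0, 10^2) prior

def abc_mcmc(n_iter=20000, eps=0.1, step=0.5):
    theta = 0.0
    chain = np.empty(n_iter)
    for i in range(n_iter):
        prop = theta + step * rng.normal()        # symmetric random-walk proposal
        # Accept only if (a) the Metropolis prior check passes and
        # (b) the simulated summary lies within eps of the observed summary.
        if np.log(rng.uniform()) < log_prior(prop) - log_prior(theta):
            sim_summary = simulate(prop).mean()
            if abs(sim_summary - obs_summary) < eps:
                theta = prop
        chain[i] = theta
    return chain

posterior_draws = abc_mcmc()
print("ABC posterior mean:", posterior_draws.mean())
```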

14.
In this paper, we propose a novel variance reduction approach for additive functionals of Markov chains based on minimization of an estimate of the asymptotic variance of these functionals over suitable classes of control variates. A distinctive feature of the proposed approach is its ability to significantly reduce the overall finite-sample variance. This feature is demonstrated theoretically by means of a detailed non-asymptotic analysis of a variance-reduced functional, as well as by a thorough simulation study. In particular, we apply our method to various MCMC Bayesian estimation problems, where it compares favorably to existing variance reduction approaches.
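To make the control-variate idea concrete: subtracting a fitted multiple of a function with known zero expectation under the target leaves the estimator's mean unchanged but can reduce its variance. The sketch below uses the simplest ingredients (a Gaussian target and the control variate g(x) = x) and fits the coefficient by least squares over the chain; note that the paper minimises an estimate of the asymptotic variance of the ergodic average, which this toy version does not attempt.

```python
import numpy as np

rng = np.random.default_rng(3)

# Random-walk Metropolis chain targeting N(0, 1) (illustrative assumption).
n_iter, step = 50000, 2.4
x, chain = 0.0, np.empty(n_iter)
for i in range(n_iter):
    prop = x + step * rng.normal()
    if np.log(rng.uniform()) < 0.5 * (x ** 2 - prop ** 2):   # Metropolis accept
        x = prop
    chain[i] = x

f = chain ** 2            # functional of interest: E[X^2] = 1
g = chain                 # control variate: E[X] = 0 under the target

# Fit the coefficient by least squares (minimises the *marginal* variance;
# the paper works with the asymptotic variance of the ergodic average instead).
C = np.cov(f, g)
beta = C[0, 1] / C[1, 1]
print("plain estimate:     ", f.mean())
print("controlled estimate:", (f - beta * g).mean())
```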

15.
Differential Evolution (DE) is a simple genetic algorithm for numerical optimization in real parameter spaces. In a statistical context one would not want just the optimum but also its uncertainty. The uncertainty distribution can be obtained by a Bayesian analysis (after specifying prior and likelihood) using Markov chain Monte Carlo (MCMC) simulation. This paper integrates the essential ideas of DE and MCMC, resulting in Differential Evolution Markov Chain (DE-MC). DE-MC is a population MCMC algorithm, in which multiple chains are run in parallel. DE-MC solves an important problem in MCMC, namely that of choosing an appropriate scale and orientation for the jumping distribution. In DE-MC the jumps are simply a fixed multiple of the difference of two random parameter vectors that are currently in the population. The selection process of DE-MC works via the usual Metropolis ratio, which defines the probability with which a proposal is accepted. In tests with known uncertainty distributions, the efficiency of DE-MC with respect to random walk Metropolis with optimal multivariate Normal jumps ranged from 68% for small population sizes to 100% for large population sizes, and even to 500% for the 97.5% point of a variable from a 50-dimensional Student distribution. Two Bayesian examples illustrate the potential of DE-MC in practice. DE-MC is shown to facilitate multidimensional updates in a multi-chain “Metropolis-within-Gibbs” sampling approach. The advantages of DE-MC over conventional MCMC are simplicity, speed of calculation and convergence, even for nearly collinear parameters and multimodal densities.
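The DE-MC update described above fits in a few lines: each chain proposes a jump along the difference of two other randomly chosen chains, scaled by γ (a common default is 2.38/√(2d)) plus a small jitter, and accepts with the standard Metropolis ratio. The bivariate Gaussian target below is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(4)

def log_target(x):
    # Illustrative target: standard bivariate normal.
    return -0.5 * np.sum(x ** 2)

d, n_chains, n_gen = 2, 10, 2000
gamma = 2.38 / np.sqrt(2 * d)            # commonly recommended jump scale
pop = rng.normal(size=(n_chains, d))      # initial population of chains
logp = np.array([log_target(x) for x in pop])
history = []

for _ in range(n_gen):
    for i in range(n_chains):
        # Pick two *other* chains and propose along their difference, plus jitter.
        r1, r2 = rng.choice([j for j in range(n_chains) if j != i], 2, replace=False)
        prop = pop[i] + gamma * (pop[r1] - pop[r2]) + 1e-4 * rng.normal(size=d)
        logp_prop = log_target(prop)
        if np.log(rng.uniform()) < logp_prop - logp[i]:   # Metropolis accept
            pop[i], logp[i] = prop, logp_prop
    history.append(pop.copy())

samples = np.concatenate(history[n_gen // 2:])            # discard burn-in
print("posterior mean estimate:", samples.mean(axis=0))
```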

16.
Bayesian estimation of the population parameters under progressive type-I interval censoring is studied via Markov chain Monte Carlo (MCMC) simulation. Two competing statistical models, the generalized exponential and Weibull distributions, are studied for illustration by modeling a real data set containing 112 patients with plasma cell myeloma. For model selection, a novel Bayesian procedure involving a mixture model is proposed; the mixing proportion is then estimated through MCMC and used as the model selection criterion.

17.
In this article, we propose a bivariate long-term distribution based on the Farlie-Gumbel-Morgenstern copula model. The proposed model allows for the presence of censored data and covariates. For inferential purposes, a Bayesian approach via Markov chain Monte Carlo (MCMC) was adopted. Further, some discussion of the model selection criteria is given. In order to examine outlying and influential observations, we present Bayesian case-deletion influence diagnostics based on the Kullback-Leibler divergence. The newly developed procedures are illustrated on artificial and real data.
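For reference, the Farlie-Gumbel-Morgenstern copula underlying the construction has the simple closed form below, with dependence parameter θ restricted to [-1, 1]; how the paper's long-term (cure-fraction) marginals enter u and v is not spelled out in the abstract.

```latex
C_\theta(u, v) = u\,v\,\bigl[1 + \theta (1 - u)(1 - v)\bigr],
\qquad u, v \in [0, 1], \quad \theta \in [-1, 1]
```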

18.
We introduce a general Monte Carlo method based on Nested Sampling (NS), for sampling complex probability distributions and estimating the normalising constant. The method uses one or more particles, which explore a mixture of nested probability distributions, each successive distribution occupying ∼e⁻¹ times the enclosed prior mass of the previous distribution. While NS technically requires independent generation of particles, Markov Chain Monte Carlo (MCMC) exploration fits naturally into this technique. We illustrate the new method on a test problem and find that it can achieve four times the accuracy of classic MCMC-based Nested Sampling, for the same computational effort; equivalent to a factor of 16 speedup. An additional benefit is that more samples and a more accurate evidence value can be obtained simply by continuing the run for longer, as in standard MCMC.
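The mechanics behind the ∼e⁻¹ shrinkage can be sketched directly: with N live points, discarding the worst point at each step shrinks the remaining prior mass by a factor of about exp(-1/N), and the evidence accumulates as a weighted sum of the discarded likelihoods. In the sketch below the constrained replacement step uses naive rejection from the prior purely to keep the code short (this is exactly where MCMC exploration would be used in practice), and the Gaussian likelihood and uniform prior are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

def log_likelihood(theta):
    # Illustrative assumption: narrow Gaussian likelihood centred at 0.5.
    return -0.5 * ((theta - 0.5) / 0.05) ** 2

def sample_prior(n=1):
    return rng.uniform(0.0, 1.0, size=n)          # Uniform(0, 1) prior

n_live, n_iter = 100, 600
live = sample_prior(n_live)
live_logl = log_likelihood(live)
log_z, log_x = -np.inf, 0.0                        # log evidence, log prior mass

for i in range(n_iter):
    worst = np.argmin(live_logl)
    log_x_new = -(i + 1) / n_live                  # E[log X_i] = -i/N shrinkage
    log_width = np.log(np.exp(log_x) - np.exp(log_x_new))
    log_z = np.logaddexp(log_z, live_logl[worst] + log_width)
    log_x = log_x_new
    # Replace the worst point with a prior draw subject to L > L_worst.
    # (Real implementations explore this constraint with MCMC; rejection
    #  sampling from the prior is used here only to keep the sketch short.)
    while True:
        cand = sample_prior(1)[0]
        if log_likelihood(cand) > live_logl[worst]:
            live[worst], live_logl[worst] = cand, log_likelihood(cand)
            break

# Add the contribution of the remaining live points at termination.
log_z = np.logaddexp(log_z, log_x + np.log(np.mean(np.exp(live_logl))))
print("log evidence estimate:", log_z)
```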

19.
Markov chain Monte Carlo (MCMC) routines have become a fundamental means for generating random variates from distributions otherwise difficult to sample. The Hastings sampler, which includes the Gibbs and Metropolis samplers as special cases, is the most popular MCMC method. A number of implementations are available for running these MCMC routines varying in the order through which the components or blocks of the random vector of interest X are cycled or visited. The two most common implementations are the deterministic sweep strategy, whereby the components or blocks of X are updated successively and in a fixed order, and the random sweep strategy, whereby the coordinates or blocks of X are updated in a randomly determined order. In this article, we present a general representation for MCMC updating schemes showing that the deterministic scan is a special case of the random scan. We also discuss decision criteria for choosing a sweep strategy.
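The two sweep strategies differ only in how the next coordinate to update is chosen, as a short Gibbs sampler for a bivariate normal makes explicit; the correlation value below is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(6)
rho = 0.8          # correlation of the bivariate normal target (assumption)

def update_coordinate(x, k):
    # Full conditional of coordinate k given the other: N(rho * x_other, 1 - rho^2).
    other = x[1 - k]
    x[k] = rho * other + np.sqrt(1.0 - rho ** 2) * rng.normal()
    return x

def gibbs(n_iter=10000, sweep="deterministic"):
    x = np.zeros(2)
    out = np.empty((n_iter, 2))
    for i in range(n_iter):
        if sweep == "deterministic":
            for k in (0, 1):                   # fixed order: x0 then x1
                x = update_coordinate(x, k)
        else:
            k = rng.integers(2)                # random scan: pick a coordinate at random
            x = update_coordinate(x, k)
        out[i] = x
    return out

det = gibbs(sweep="deterministic")
ran = gibbs(sweep="random")
print("deterministic-scan sample correlation:", np.corrcoef(det.T)[0, 1])
print("random-scan sample correlation:       ", np.corrcoef(ran.T)[0, 1])
```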

20.
In this article we propose a novel non-parametric sampling approach to estimate posterior distributions of parameters of interest. Starting from an initial sample over the parameter space, the method uses this initial information to form a geometrical structure known as a Voronoi tessellation over the whole parameter space. This rough approximation to the posterior distribution provides a way to generate new points from the posterior without any additional costly model evaluations. By running a traditional Markov chain Monte Carlo (MCMC) scheme over the non-parametric tessellation, the initial approximate distribution is refined sequentially. We applied this method to a couple of climate models to show that this hybrid scheme successfully approximates the posterior distribution of the model parameters.
