Similar documents
20 similar documents found (search time: 15 ms).
1.
Importance sampling and Markov chain Monte Carlo methods have long been used in exact inference for contingency tables; however, their performance is not always satisfactory. In this paper, we propose a stochastic approximation Monte Carlo importance sampling (SAMCIS) method for tackling this problem. SAMCIS is a combination of adaptive Markov chain Monte Carlo and importance sampling, which employs the stochastic approximation Monte Carlo algorithm (Liang et al., J. Am. Stat. Assoc., 102(477):305–320, 2007) to draw samples from an enlarged reference set with a known Markov basis. Compared to the existing importance sampling and Markov chain Monte Carlo methods, SAMCIS has several advantages, such as fast convergence, ergodicity, and the ability to achieve a desired proportion of valid tables. The numerical results indicate that SAMCIS can outperform the existing importance sampling and Markov chain Monte Carlo methods: it produces much more accurate estimates in much shorter CPU time, especially for tables with high degrees of freedom.
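The importance-sampling estimator underlying this and several later papers in the list can be stated in a few lines. The sketch below is generic background, not the SAMCIS algorithm itself (which additionally requires a Markov basis over a reference set of tables); the toy target, proposal, and tail-probability example are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def log_norm_pdf(x, mu=0.0, sigma=1.0):
    """Log density of N(mu, sigma^2)."""
    return -0.5 * np.log(2 * np.pi * sigma**2) - 0.5 * ((x - mu) / sigma) ** 2

def is_estimate(f, log_p, log_q, draw, n=100_000):
    """Importance sampling: E_p[f(X)] is estimated by the mean of
    f(x) * p(x)/q(x) over draws x ~ q.  Both densities are assumed
    normalized here, so no self-normalization is needed."""
    x = draw(n)
    w = np.exp(log_p(x) - log_q(x))
    return float(np.mean(w * f(x)))

# Illustrative check: the tail probability P(X > 3) for X ~ N(0, 1),
# estimated with a proposal N(3, 1) shifted into the tail
# (true value is about 0.00135).
est = is_estimate(lambda x: (x > 3.0).astype(float),
                  log_norm_pdf,
                  lambda x: log_norm_pdf(x, mu=3.0),
                  lambda n: rng.normal(3.0, 1.0, n))
```

A naive Monte Carlo estimate of the same probability would need millions of draws for comparable accuracy, which is why proposal choice dominates the performance of all the methods surveyed here.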

2.
Simulated maximum likelihood estimates an analytically intractable likelihood function with an empirical average based on data simulated from a suitable importance sampling distribution. In order to use simulated maximum likelihood efficiently, the choice of the importance sampling distribution as well as the mechanism used to generate the simulated data are crucial. In this paper we develop a new heuristic for an automated, multistage implementation of simulated maximum likelihood which, by adaptively updating the importance sampler, approximates the (locally) optimal importance sampling distribution. The proposed approach also allows for a convenient incorporation of quasi-Monte Carlo methods. Quasi-Monte Carlo methods produce simulated data which can significantly increase the accuracy of the likelihood estimate relative to regular Monte Carlo methods. Several examples provide evidence for the potential efficiency gain of this new method. We apply the method to a computationally challenging geostatistical model of online retailing.
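As a hedged illustration of the idea (not the paper's heuristic or its geostatistical model), the sketch below maximizes a simulated likelihood for a toy random-effects model, integrating the random effect out by importance sampling. Fixing the underlying normal draws across parameter values (common random numbers) keeps the simulated log-likelihood smooth in the parameter; all model and proposal choices are assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def sim_loglik(theta, y, z):
    """Simulated log-likelihood (up to an additive constant) for the toy
    model y_i | u_i ~ N(u_i, 1), u_i ~ N(theta, 1), with u_i integrated
    out by importance sampling using proposal N(y_i, 1).  z holds fixed
    N(0, 1) draws (common random numbers)."""
    u = y[:, None] + z                       # proposal draws, one row per y_i
    lw = (-0.5 * (y[:, None] - u) ** 2       # log f(y_i | u), constants dropped
          - 0.5 * (u - theta) ** 2           # log p(u; theta)
          + 0.5 * z ** 2)                    # minus log proposal density
    m = lw.max(axis=1, keepdims=True)        # stabilized log-mean-exp
    return float(np.sum(m[:, 0] + np.log(np.mean(np.exp(lw - m), axis=1))))

y = rng.normal(2.0, np.sqrt(2.0), 500)       # data; true theta = 2
z = rng.normal(0.0, 1.0, (len(y), 200))      # fixed simulation draws
grid = np.linspace(0.0, 4.0, 81)
theta_hat = float(grid[np.argmax([sim_loglik(t, y, z) for t in grid])])
```

With fresh draws at every theta the grid search would be maximizing a noisy, discontinuous surface; the common-random-numbers trick is one simple instance of the "mechanism to generate the simulated data" that the abstract flags as crucial.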

3.
Monte Carlo methods represent the de facto standard for approximating complicated integrals involving multidimensional target distributions. In order to generate random realizations from the target distribution, Monte Carlo techniques use simpler proposal probability densities to draw candidate samples. The performance of any such method is strictly related to the specification of the proposal distribution, such that unfortunate choices easily wreak havoc on the resulting estimators. In this work, we introduce a layered (i.e., hierarchical) procedure to generate samples employed within a Monte Carlo scheme. This approach ensures that an appropriate equivalent proposal density is always obtained automatically (thus eliminating the risk of a catastrophic performance), although at the expense of a moderate increase in the complexity. Furthermore, we provide a general unified importance sampling (IS) framework, where multiple proposal densities are employed and several IS schemes are introduced by applying the so-called deterministic mixture approach. Finally, given these schemes, we also propose a novel class of adaptive importance samplers using a population of proposals, where the adaptation is driven by independent parallel or interacting Markov chain Monte Carlo (MCMC) chains. The resulting algorithms efficiently combine the benefits of both IS and MCMC methods.
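The deterministic mixture approach mentioned above can be sketched in a few lines: samples are drawn from several proposals, but each weight uses the average of all proposal densities in its denominator, which keeps the weights bounded wherever any single proposal covers the sample. The target and the proposal locations below are illustrative choices, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)

def log_norm_pdf(x, mu, sigma):
    return -0.5 * np.log(2 * np.pi * sigma**2) - 0.5 * ((x - mu) / sigma) ** 2

mus = [-2.0, 0.0, 2.0]          # illustrative proposal locations
n_per = 20_000                  # equal number of draws per proposal

# Draw the same number of samples from each proposal N(mu_j, 1).
x = np.concatenate([rng.normal(m, 1.0, n_per) for m in mus])

# Deterministic-mixture weights: the denominator is the AVERAGE of all
# proposal densities at x, not only the density of the proposal that
# actually generated x.
target = np.exp(log_norm_pdf(x, 0.5, 1.5))                   # toy target
mixture = np.mean([np.exp(log_norm_pdf(x, m, 1.0)) for m in mus], axis=0)
w = target / mixture
w /= w.sum()

mean_est = float(np.sum(w * x))   # self-normalized estimate of E[X] = 0.5
```

Compared with weighting each sample only by its own proposal, the mixture denominator trades a little extra density evaluation for substantially lower weight variance, which is the point of the deterministic mixture schemes surveyed in the paper.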

4.
In the expectation–maximization (EM) algorithm for maximum likelihood estimation from incomplete data, Markov chain Monte Carlo (MCMC) methods have long been used for change-point inference when the expectation step is intractable. However, conventional MCMC algorithms tend to get trapped in local modes when simulating from the posterior distribution of the change points. To overcome this problem, in this paper we propose a stochastic approximation Monte Carlo version of EM (SAMCEM), a combination of adaptive Markov chain Monte Carlo and the EM-based maximum likelihood method. SAMCEM is compared with the stochastic approximation version of EM and the reversible jump Markov chain Monte Carlo version of EM on simulated and real datasets. The numerical results indicate that SAMCEM outperforms the other two methods: it produces much more accurate parameter estimates and recovers change-point positions and parameter estimates simultaneously.

5.
We consider a Bayesian deterministically trending dynamic time series model with heteroscedastic error variance, in which there exist multiple structural changes in level, trend and error variance, but the number of change-points and the timings are unknown. For a Bayesian analysis, a truncated Poisson prior and conjugate priors are used for the number of change-points and the distributional parameters, respectively. To identify the best model and estimate the model parameters simultaneously, we propose a new method by sequentially making use of the Gibbs sampler in conjunction with stochastic approximation Monte Carlo simulations, as an adaptive Monte Carlo algorithm. The numerical results are in favor of our method in terms of the quality of estimates.

6.
Markov chain Monte Carlo (MCMC) is an important computational technique for generating samples from non-standard probability distributions. A major challenge in the design of practical MCMC samplers is to achieve efficient convergence and mixing properties. One way to accelerate convergence and mixing is to adapt the proposal distribution in light of previously sampled points, thus increasing the probability of acceptance. In this paper, we propose two new adaptive MCMC algorithms based on the Independent Metropolis–Hastings algorithm. In the first, we adjust the proposal to minimize an estimate of the cross-entropy between the target and proposal distributions, using the experience of pre-runs. This approach provides a general technique for deriving natural adaptive formulae. The second approach uses multiple parallel chains, and involves updating chains individually, then updating a proposal density by fitting a Bayesian model to the population. An important feature of this approach is that adapting the proposal does not change the limiting distributions of the chains. Consequently, the adaptive phase of the sampler can be continued indefinitely. We include results of numerical experiments indicating that the new algorithms compete well with traditional Metropolis–Hastings algorithms. We also demonstrate the method for a realistic problem arising in Comparative Genomics.
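A minimal caricature of the first scheme (adapting an independence proposal from a pre-run): for a Gaussian proposal family, minimizing the cross-entropy to the target amounts to moment-matching the pre-run draws. The toy target and all tuning constants below are assumptions, not the paper's algorithm:

```python
import numpy as np

rng = np.random.default_rng(3)

def log_target(x):
    """Toy unnormalized log target: N(4, 0.5^2)."""
    return -0.5 * ((x - 4.0) / 0.5) ** 2

def imh(n, mu, sigma, x0=0.0):
    """Independent Metropolis-Hastings with a fixed N(mu, sigma^2) proposal."""
    def lw(v):  # log of target(v) / proposal(v), constants dropped
        return log_target(v) + 0.5 * ((v - mu) / sigma) ** 2 + np.log(sigma)
    x, out = x0, np.empty(n)
    for i in range(n):
        y = rng.normal(mu, sigma)
        if np.log(rng.random()) < lw(y) - lw(x):   # IMH acceptance rule
            x = y
        out[i] = x
    return out

# Adapt: run a short pre-run with a deliberately wide proposal, then
# moment-match a Gaussian proposal to the pre-run draws (for a Gaussian
# family this is the cross-entropy-minimizing fit) and rerun.
pre = imh(2_000, mu=0.0, sigma=10.0)
chain = imh(20_000, mu=float(pre.mean()), sigma=float(pre.std()) + 1e-6)
```

After adaptation the proposal nearly coincides with the target, so the acceptance rate of the second chain is close to one; a one-shot adaptation like this is where the paper's iterated, cross-entropy-driven formulae take over.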

7.
Two new implementations of the EM algorithm are proposed for maximum likelihood fitting of generalized linear mixed models. Both methods use random (independent and identically distributed) sampling to construct Monte Carlo approximations at the E-step. One approach involves generating random samples from the exact conditional distribution of the random effects (given the data) by rejection sampling, using the marginal distribution as a candidate. The second method uses a multivariate t importance sampling approximation. In many applications the two methods are complementary. Rejection sampling is more efficient when sample sizes are small, whereas importance sampling is better with larger sample sizes. Monte Carlo approximation using random samples allows the Monte Carlo error at each iteration to be assessed by using standard central limit theory combined with Taylor series methods. Specifically, we construct a sandwich variance estimate for the maximizer at each approximate E-step. This suggests a rule for automatically increasing the Monte Carlo sample size after iterations in which the true EM step is swamped by Monte Carlo error. In contrast, techniques for assessing Monte Carlo error have not been developed for use with alternative implementations of Monte Carlo EM algorithms utilizing Markov chain Monte Carlo E-step approximations. Three different data sets, including the infamous salamander data of McCullagh and Nelder, are used to illustrate the techniques and to compare them with the alternatives. The results show that the methods proposed can be considerably more efficient than those based on Markov chain Monte Carlo algorithms. However, the methods proposed may break down when the intractable integrals in the likelihood function are of high dimension.
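The first (rejection sampling) scheme can be illustrated on a toy one-random-effect model where the answer is known in closed form: the candidate is the marginal (prior) distribution of the random effect, and the acceptance probability is the likelihood divided by its maximum over the random effect. The model below is an illustrative stand-in, not a generalized linear mixed model:

```python
import numpy as np

rng = np.random.default_rng(4)

def sample_conditional(y, n, sigma_e=1.0):
    """Rejection sampler for p(u | y) in the toy model
    y | u ~ N(u, sigma_e^2),  u ~ N(0, 1).
    Candidate: the marginal (prior) of u.  Envelope: the likelihood is
    bounded by its value at u = y, so the acceptance probability is
    f(y | u) / f(y | y) = exp(-(y - u)^2 / (2 sigma_e^2))."""
    out = []
    while len(out) < n:
        u = rng.normal(0.0, 1.0)                 # candidate from the prior
        if rng.random() < np.exp(-0.5 * ((y - u) / sigma_e) ** 2):
            out.append(u)
    return np.array(out)

draws = sample_conditional(y=1.0, n=20_000)
# The exact conditional here is N(y/2, 1/2): mean 0.5, sd about 0.707.
```

Because the draws are i.i.d. from the exact conditional, the central limit theorem applies directly at each E-step, which is what makes the Monte Carlo error assessment in the paper tractable.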

8.
In this article, we propose a Bayesian approach to estimating multiple structural change-points in level and trend when the number of change-points is unknown. Our formulation of the structural-change model involves a binary discrete variable that indicates the structural change. The determination of the number and form of structural changes is treated as a model selection issue in Bayesian structural-change analysis. We apply an advanced Monte Carlo algorithm, the stochastic approximation Monte Carlo (SAMC) algorithm, to this model selection issue. SAMC is effective for estimating this complex structural-change model, since it avoids entrapment in local posterior modes. The model parameters in each regime are estimated using the Gibbs sampler after each change-point is detected. The performance of our proposed method has been investigated on simulated and real data sets: a long time series of US real gross domestic product, US uses of force between 1870 and 1994, and a one-year time series of temperature in Seoul, South Korea.

9.
In this paper, we propose a new algorithm, the so-called annealing evolutionary stochastic approximation Monte Carlo (AESAMC) algorithm, as a general optimization technique, and study its convergence. AESAMC possesses a self-adjusting mechanism: its target distribution is adapted at each iteration according to the current samples. Thus, AESAMC falls into the class of adaptive Monte Carlo methods. This mechanism also makes AESAMC less prone to becoming trapped in local energy minima than nonadaptive MCMC algorithms. Under mild conditions, we show that AESAMC converges weakly toward a neighboring set of global minima in the space of energy. AESAMC is tested on multiple optimization problems. The numerical results indicate that AESAMC can potentially outperform simulated annealing, the genetic algorithm, annealing stochastic approximation Monte Carlo, and some other metaheuristics in function optimization.

10.
Sequential Monte Carlo methods (also known as particle filters and smoothers) are used for filtering and smoothing in general state-space models. These methods are based on importance sampling. In practice, it is often difficult to find a suitable proposal which allows effective importance sampling. This article develops an original particle filter and an original particle smoother which employ nonparametric importance sampling. The basic idea is to use a nonparametric estimate of the marginally optimal proposal. The proposed algorithms provide a better approximation of the filtering and smoothing distributions than standard methods. The methods' advantage is most distinct in severely nonlinear situations. In contrast to most existing methods, they allow the use of quasi-Monte Carlo (QMC) sampling. In addition, they do not suffer from weight degeneration, which renders a resampling step unnecessary. For the estimation of model parameters, an efficient on-line maximum-likelihood (ML) estimation technique is proposed which is also based on nonparametric approximations. All suggested algorithms have almost linear complexity for low-dimensional state-spaces. This is an advantage over standard smoothing and ML procedures. In particular, all existing sequential Monte Carlo methods that incorporate QMC sampling have quadratic complexity. As an application, stochastic volatility estimation for high-frequency financial data is considered, which is of great importance in practice. The computer code is partly available as supplemental material.
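For contrast with the nonparametric proposals developed in the paper, the standard baseline is the bootstrap particle filter (prior proposal, multinomial resampling). A sketch on a toy linear-Gaussian model, where all constants are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(5)

def bootstrap_filter(y, n_particles=2_000, phi=0.9, sig_x=1.0, sig_y=1.0):
    """Bootstrap particle filter for the toy state-space model
    x_t = phi * x_{t-1} + N(0, sig_x^2),   y_t = x_t + N(0, sig_y^2).
    Proposal = state transition; multinomial resampling every step."""
    x = rng.normal(0.0, 1.0, n_particles)
    means = np.empty(len(y))
    for t in range(len(y)):
        x = phi * x + rng.normal(0.0, sig_x, n_particles)   # propagate
        logw = -0.5 * ((y[t] - x) / sig_y) ** 2             # reweight
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means[t] = np.sum(w * x)                            # filtered mean
        x = x[rng.choice(n_particles, n_particles, p=w)]    # resample
    return means

# Simulate a trajectory and filter it.
T = 50
xs, ys = np.zeros(T), np.zeros(T)
for t in range(T):
    xs[t] = 0.9 * (xs[t - 1] if t else 0.0) + rng.normal()
    ys[t] = xs[t] + rng.normal()
m = bootstrap_filter(ys)
rmse = float(np.sqrt(np.mean((m - xs) ** 2)))
```

The prior proposal ignores the current observation, which is exactly where it degrades in severely nonlinear or informative-observation settings; the paper's nonparametric estimate of the marginally optimal proposal is designed to close that gap.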

11.
Full likelihood-based inference for modern population genetics data presents methodological and computational challenges. The problem is of considerable practical importance and has attracted recent attention, with the development of algorithms based on importance sampling (IS) and Markov chain Monte Carlo (MCMC) sampling. Here we introduce a new IS algorithm. The optimal proposal distribution for these problems can be characterized, and we exploit a detailed analysis of genealogical processes to develop a practicable approximation to it. We compare the new method with existing algorithms on a variety of genetic examples. Our approach substantially outperforms existing IS algorithms, with efficiency typically improved by several orders of magnitude. The new method also compares favourably with existing MCMC methods in some problems, and less favourably in others, suggesting that both IS and MCMC methods have a continuing role to play in this area. We offer insights into the relative advantages of each approach, and we discuss diagnostics in the IS framework.

12.
Monte Carlo methods for exact inference have recently received much attention in complete or incomplete contingency table analysis. However, conventional Markov chain Monte Carlo methods, such as the Metropolis–Hastings algorithm, and importance sampling methods sometimes perform poorly, failing to produce valid tables. In this paper, we apply an adaptive Monte Carlo algorithm, the stochastic approximation Monte Carlo algorithm (SAMC; Liang, Liu, & Carroll, 2007), to the exact goodness-of-fit test of a model in complete or incomplete contingency tables containing structural zero cells. The numerical results are in favor of our method in terms of the quality of estimates.

13.
We present a variant of the sequential Monte Carlo sampler by incorporating the partial rejection control mechanism of Liu (2001). We show that the resulting algorithm can be considered as a sequential Monte Carlo sampler with a modified mutation kernel. We prove that the new sampler can reduce the variance of the incremental importance weights when compared with standard sequential Monte Carlo samplers, and provide a central limit theorem. Finally, the sampler is adapted for application under the challenging approximate Bayesian computation modelling framework.

14.
The geometric process (GP) is widely used as a non-stationary stochastic model in reliability analysis. Many applications of the GP require its mean value and variance functions, and since these functions have no analytical form in many situations, their numerical computation is important. In this study, a numerical approximation and a Monte Carlo estimation method, both based on convolutions of distribution functions, are proposed for the mean value and variance functions.
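A bare-bones version of the Monte Carlo side is easy to sketch: simulate the geometric process's arrival times from inter-arrival times X_k = Y_k / a^(k-1) and average the event counts. The exponential choice for Y_k and the parameter values are assumptions for illustration; with a = 1 the process reduces to a Poisson process, which gives a known check:

```python
import numpy as np

rng = np.random.default_rng(6)

def gp_mean_value(t, a=1.05, rate=1.0, n_rep=20_000, kmax=500):
    """Monte Carlo estimate of the mean value function M(t) = E[N(t)] of a
    geometric process with inter-arrival times X_k = Y_k / a^(k-1),
    where Y_k are i.i.d. Exp(rate).  (The exponential choice is an
    illustrative assumption.)"""
    counts = np.empty(n_rep)
    for r in range(n_rep):
        s, k = 0.0, 0
        while k < kmax:
            s += rng.exponential(1.0 / rate) / a**k   # k-th inter-arrival
            if s > t:
                break
            k += 1
        counts[r] = k                                 # events by time t
    return float(counts.mean())

# Sanity check: with a = 1 the GP reduces to a Poisson process with
# M(t) = rate * t, so M(1) should be close to 1.
m1 = gp_mean_value(1.0, a=1.0)
```

The variance function E[N(t)^2] - M(t)^2 can be estimated from the same simulated counts; the paper's convolution-based approximation avoids the simulation noise entirely.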

15.
A Monte Carlo (MC) method is suggested for calculating an upper prediction limit for the mean of a future sample of small size N from a lognormal distribution. This is done by obtaining a Monte Carlo estimator of the limit utilizing the future sample generated from the Gibbs sampler. For the Gibbs sampler, a full conditional posterior predictive distribution of each observation in the future sample is derived. The MC method is straightforward to specify distributionally and to implement computationally, with output readily adapted for required inference summaries. In an example, practical application of the method is described.
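A simplified sketch of the idea (sampling the standard noninformative conjugate posterior for the log-scale parameters directly, as a stand-in for the paper's Gibbs sampler over full conditional posterior predictive distributions): draw the parameters from their posterior, simulate a future sample of size N, and take the upper quantile of the simulated future means. All prior and data choices below are assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)

def upper_prediction_limit(x, n_future=5, alpha=0.05, n_mc=20_000):
    """Monte Carlo (1 - alpha) upper prediction limit for the mean of a
    future lognormal sample of size n_future, given observed data x.
    Sketch assumption: the standard noninformative posterior for
    (mu, sigma^2) of log(x) is sampled directly."""
    z = np.log(x)
    n, zbar, s2 = len(z), z.mean(), z.var(ddof=1)
    future_means = np.empty(n_mc)
    for i in range(n_mc):
        sigma2 = (n - 1) * s2 / rng.chisquare(n - 1)   # sigma^2 | data
        mu = rng.normal(zbar, np.sqrt(sigma2 / n))     # mu | sigma^2, data
        future = rng.lognormal(mu, np.sqrt(sigma2), n_future)
        future_means[i] = future.mean()
    return float(np.quantile(future_means, 1.0 - alpha))

x_obs = rng.lognormal(0.0, 0.5, 30)    # illustrative observed sample
upl = upper_prediction_limit(x_obs)
```

The limit is simply a quantile of the simulated predictive distribution of the future mean, which is why the output is "readily adapted for required inference summaries": other summaries are quantiles or moments of the same simulated draws.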

16.
This article focuses on simulation-based inference for the time-deformation models directed by a duration process. In order to better capture the heavy tail property of the time series of financial asset returns, the innovation of the observation equation is subsequently assumed to have a Student-t distribution. Suitable Markov chain Monte Carlo (MCMC) algorithms, which are hybrids of Gibbs and slice samplers, are proposed for estimation of the parameters of these models. In the algorithms, the parameters of the models can be sampled either directly from known distributions or through an efficient slice sampler. The states are simulated one at a time by using a Metropolis-Hastings method, where the proposal distributions are sampled through a slice sampler. Simulation studies conducted in this article suggest that our extended models and accompanying MCMC algorithms work well in terms of parameter estimation and volatility forecast.

17.
An automated (Markov chain) Monte Carlo EM algorithm
We present an automated Monte Carlo EM (MCEM) algorithm which efficiently assesses Monte Carlo error in the presence of dependent Monte Carlo, particularly Markov chain Monte Carlo, E-step samples and chooses an appropriate Monte Carlo sample size to minimize this Monte Carlo error with respect to progressive EM step estimates. Monte Carlo error is gauged through an application of the central limit theorem during renewal periods of the MCMC sampler used in the E-step. The resulting normal approximation allows us to construct a rigorous and adaptive rule for updating the Monte Carlo sample size at each iteration of the MCEM algorithm. We illustrate our automated routine and compare its performance with competing MCEM algorithms in an analysis of a data set fit by a generalized linear mixed model.

18.
A general saddlepoint/Monte Carlo method to approximate (conditional) multivariate probabilities is presented. This method requires a tractable joint moment generating function (m.g.f.), but does not require a tractable distribution or density. The method is easy to program and has a third-order accuracy with respect to increasing sample size in contrast to standard asymptotic approximations which are typically only accurate to the first order.

The method is most easily described in the context of a continuous regular exponential family. Here, inferences can be formulated as probabilities with respect to the joint density of the sufficient statistics or the conditional density of some sufficient statistics given the others. Analytical expressions for these densities are not generally available, and it is often not possible to simulate exactly from the conditional distributions to obtain a direct Monte Carlo approximation of the required integral. A solution to the first of these problems is to replace the intractable density by a highly accurate saddlepoint approximation. The second problem can be addressed via importance sampling, that is, an indirect Monte Carlo approximation involving simulation from a crude approximation to the true density. Asymptotic normality of the sufficient statistics suggests an obvious candidate for an importance distribution.

The more general problem considers the computation of a joint probability for a subvector of a random vector T, given its complementary subvector, when its distribution is intractable but its joint m.g.f. is computable. For such settings, the distribution may be tilted, maintaining T as the sufficient statistic. Within this tilted family, the computation of such multivariate probabilities proceeds as described for the exponential-family setting.
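The accuracy claim above is easy to demonstrate in the simplest univariate case (not the paper's multivariate conditional setting): given only the c.g.f. K(s), solve K'(s) = x for the saddlepoint and plug into the standard density formula. For a sum of exponentials the true density is Gamma, so the approximation can be checked exactly; its relative error is a constant near 1/(12n) across x:

```python
import numpy as np
from math import lgamma

def saddlepoint_density(x, n):
    """Saddlepoint density approximation for the sum of n i.i.d. Exp(1)
    variables, built only from the c.g.f. K(s) = -n * log(1 - s):
    f_hat(x) = exp(K(s_hat) - s_hat * x) / sqrt(2 * pi * K''(s_hat))."""
    s_hat = 1.0 - n / x                 # solves K'(s) = n / (1 - s) = x
    K = -n * np.log(1.0 - s_hat)
    K2 = n / (1.0 - s_hat) ** 2
    return np.exp(K - s_hat * x) / np.sqrt(2.0 * np.pi * K2)

def true_density(x, n):
    """Exact Gamma(n, 1) density, for comparison."""
    return np.exp((n - 1) * np.log(x) - x - lgamma(n))

n = 10
xs = np.array([5.0, 10.0, 20.0])        # body and both tails
ratio = saddlepoint_density(xs, n) / true_density(xs, n)
```

For the gamma family the ratio of approximate to true density is the same constant at every x (about 1.008 for n = 10), which is why renormalizing the saddlepoint density makes it exact here; in general settings the relative error still shrinks at the third-order rate the abstract describes.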

19.
A Monte Carlo algorithm is said to be adaptive if it automatically calibrates its current proposal distribution using past simulations. The choice of the parametric family that defines the set of proposal distributions is critical for good performance. In this paper, we present such a parametric family for adaptive sampling on high dimensional binary spaces. A practical motivation for this problem is variable selection in a linear regression context. We want to sample from a Bayesian posterior distribution on the model space using an appropriate version of Sequential Monte Carlo. Raw versions of Sequential Monte Carlo are easily implemented using binary vectors with independent components. For high dimensional problems, however, these simple proposals do not yield satisfactory results. The key to an efficient adaptive algorithm is a binary parametric family that takes correlations into account, analogously to the multivariate normal distribution on continuous spaces. We provide a review of models for binary data and make one of them work in the context of Sequential Monte Carlo sampling. Computational studies on real-life data with about a hundred covariates suggest that, on difficult instances, our Sequential Monte Carlo approach clearly outperforms standard techniques based on Markov chain exploration.

20.
We introduce a new class of interacting Markov chain Monte Carlo (MCMC) algorithms which is designed to increase the efficiency of a modified multiple-try Metropolis (MTM) sampler. The extension with respect to the existing MCMC literature is twofold. First, the sampler proposed extends the basic MTM algorithm by allowing for different proposal distributions in the multiple-try generation step. Second, we exploit the different proposal distributions to naturally introduce an interacting MTM mechanism (IMTM) that expands the class of population Monte Carlo methods and builds connections with the rapidly expanding world of adaptive MCMC. We show the validity of the algorithm and discuss the choice of the selection weights and of the different proposals. The numerical studies show that the interaction mechanism allows the IMTM to efficiently explore the state space leading to higher efficiency than other competing algorithms.
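A basic (single-proposal, non-interacting) multiple-try Metropolis step, the building block that IMTM extends, can be sketched as follows. The bimodal toy target, the weight function w(y, x) = pi(y), and all tuning constants are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(8)

def log_pi(x):
    """Toy bimodal unnormalized log target: mixture of N(-3,1) and N(3,1)."""
    return np.logaddexp(-0.5 * (x + 3.0) ** 2, -0.5 * (x - 3.0) ** 2)

def mtm_step(x, k=8, step=2.5):
    """One multiple-try Metropolis step with a symmetric random-walk
    proposal and importance weights w(y, x) = pi(y)."""
    y = x + rng.normal(0.0, step, k)                 # k trial proposals
    py = np.exp(log_pi(y))
    j = rng.choice(k, p=py / py.sum())               # select one trial by weight
    ystar = y[j]
    # Reference set: k-1 fresh draws around the selected point, plus x.
    xref = np.concatenate([ystar + rng.normal(0.0, step, k - 1), [x]])
    accept = py.sum() / np.exp(log_pi(xref)).sum()   # generalized MH ratio
    return ystar if rng.random() < accept else x

x, draws = 0.0, np.empty(20_000)
for i in range(draws.size):
    x = mtm_step(x)
    draws[i] = x
frac_pos = float((draws > 0).mean())   # both modes should be visited
```

The multiple tries let the chain hop between well-separated modes far more often than a single random-walk proposal of the same step size; IMTM additionally lets each of the k tries come from a different proposal and lets parallel chains share information.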
