Similar Documents (20 results)
1.
A Bayesian analysis is presented of a time series which is the sum of a stationary component with a smooth spectral density and a deterministic component consisting of a linear combination of a trend and periodic terms. The periodic terms may have known or unknown frequencies. The advantage of our approach is that different features of the data—such as the regression parameters, the spectral density, unknown frequencies and missing observations—are combined in a hierarchical Bayesian framework and estimated simultaneously. A Bayesian test to detect deterministic components in the data is also constructed. By using an asymptotic approximation to the likelihood, the computation is carried out efficiently using the Markov chain Monte Carlo method in O(Mn) operations, where n is the sample size and M is the number of iterations. We show empirically that our approach works well on real and simulated samples.
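
One classic asymptotic likelihood approximation in this setting is the Whittle likelihood, built from the periodogram at the Fourier frequencies; whether the paper uses exactly this form is not stated in the abstract, so the Python sketch below is only a generic illustration, with a hypothetical AR(1) stationary component and invented data rather than the authors' model or code.

import numpy as np

def whittle_loglik(x, spec_fun):
    # Whittle approximation to the Gaussian log-likelihood of a stationary series:
    # -sum over nonzero Fourier frequencies of (log f(w_j) + I(w_j) / f(w_j)),
    # where I is the periodogram; each evaluation costs O(n) after a single FFT.
    n = len(x)
    j = np.arange(1, (n - 1) // 2 + 1)
    freqs = 2 * np.pi * j / n
    periodogram = np.abs(np.fft.fft(x)[j]) ** 2 / (2 * np.pi * n)
    f = spec_fun(freqs)
    return -np.sum(np.log(f) + periodogram / f)

# Hypothetical AR(1) stationary component with spectral density
# f(w) = sigma^2 / (2 * pi * |1 - phi * exp(-i w)|^2).
rng = np.random.default_rng(0)
phi, sigma = 0.6, 1.0
x = np.zeros(512)
for t in range(1, x.size):
    x[t] = phi * x[t - 1] + rng.normal(0.0, sigma)

ar1_spec = lambda w: sigma**2 / (2 * np.pi * np.abs(1 - phi * np.exp(-1j * w)) ** 2)
print("Whittle log-likelihood at the true parameters:", whittle_loglik(x, ar1_spec))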

2.
This paper examines methodology for performing Bayesian inference sequentially on a sequence of posteriors on spaces of different dimensions. For this, we use sequential Monte Carlo samplers, introducing the innovation of using deterministic transformations to move particles effectively between target distributions with different dimensions. This approach, combined with adaptive methods, yields an extremely flexible and general algorithm for Bayesian model comparison that is suitable for use in applications where the acceptance rate in reversible jump Markov chain Monte Carlo is low. We use this approach on model comparison for mixture models, and for inferring coalescent trees sequentially, as data arrive.

3.
In this paper we develop a Bayesian approach to detecting unit roots in autoregressive panel data models. Our method is based on the comparison of stationary autoregressive models, with and without individual deterministic trends, to their counterpart models with a unit autoregressive root. This is done under cross-sectional dependence among the error terms of the panel units. Simulation experiments are conducted to assess the performance of the suggested inferential procedure and to investigate whether the Bayesian model comparison approach can distinguish unit root models from stationary autoregressive models under cross-sectional dependence. The approach is applied to real exchange rate series for a panel of the G7 countries and to a panel of US nominal interest rate data.

4.
Deterministic simulation models are used to guide decision-making and enhance understanding of complex systems such as disease transmission, population dynamics, and tree plantation growth. Bayesian inference about parameters in deterministic simulation models can require the pooling of expert opinion. One class of approaches to pooling expert opinion in this context is supra-Bayesian pooling, in which expert opinion is treated as data for an ultimate decision maker. This article details and compares two supra-Bayesian approaches—“event updating” and “parameter updating.” The suitability of each approach in the context of deterministic simulation models is assessed based on theoretical properties, performance on examples, and the selection and sensitivity of required hyperparameters. In general, we favor a parameter updating approach because it uses more intuitive hyperparameters, it performs sensibly on examples, and because the alternative event updating approach fails to exhibit a desirable property (relative propensity consistency) in all cases. Inference in deterministic simulation models is an increasingly important statistical and practical problem, and supra-Bayesian methods represent one viable option for achieving a sensible pooling of expert opinion.

5.
Multi-stage time evolving models are common statistical models for biological systems, especially insect populations. In stage-duration distribution models, parameter estimation typically relies on the Laplace transform method. This method involves assumptions such as known constant shapes, known constant rates, or the same overall hazard rate for all stages; these assumptions are strong and restrictive. The main aim of this paper is to weaken these assumptions by using a Bayesian approach. In particular, a Metropolis-Hastings algorithm based on deterministic transformations is used to estimate parameters. We use two models, one with no hazard rates and the other with stage-wise constant hazard rates. These methods are validated in simulation studies followed by a case study of cattle parasites. The results show that the proposed methods estimate the parameters comparably well to the Laplace transform methods.

6.
This paper develops a space-time statistical model for local forecasting of surface-level wind fields in a coastal region with complex topography. The statistical model makes use of output from deterministic numerical weather prediction models which are able to produce forecasts of surface wind fields on a spatial grid. When predicting surface winds at observing stations, errors can arise due to sub-grid scale processes not adequately captured by the numerical weather prediction model, and the statistical model attempts to correct for these influences. In particular, it uses information from observing stations within the study region as well as topographic information to account for local bias. Bayesian methods for inference are used in the model, with computations carried out using Markov chain Monte Carlo algorithms. Empirical performance of the model is described, illustrating that a structured Bayesian approach to complicated space-time models of the type considered in this paper can be readily implemented and can lead to improvements in forecasting over traditional methods.
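
As a toy illustration of the core idea, correcting deterministic numerical weather prediction output with station observations and topographic information, the following Python sketch fits a minimal conjugate Bayesian linear regression. The data, the known noise variance and the Gaussian prior are hypothetical, and no spatial or temporal dependence is modelled, so this is only a stand-in for the structured space-time model fitted by MCMC in the paper.

import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: NWP wind speed at a station, a crude topography covariate,
# and the station observations the statistical model should reproduce.
nwp = rng.uniform(2.0, 15.0, 200)
topo = rng.normal(0.0, 1.0, 200)
obs = 1.2 + 0.85 * nwp + 0.5 * topo + rng.normal(0.0, 1.0, 200)

# Conjugate Bayesian linear regression with known noise variance and a
# zero-mean Gaussian prior on the coefficients (intercept, NWP term, topography term).
X = np.column_stack([np.ones_like(nwp), nwp, topo])
noise_var, prior_var = 1.0, 100.0
post_prec = X.T @ X / noise_var + np.eye(3) / prior_var
post_cov = np.linalg.inv(post_prec)
post_mean = post_cov @ (X.T @ obs / noise_var)
print("posterior mean of (intercept, NWP, topography) coefficients:", post_mean)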

7.
In this article, Bayesian inference for the half-normal and half-t distributions using uninformative priors is considered. It is shown that exact Bayesian inference can be undertaken for the half-normal distribution without the need for Gibbs sampling. Simulation is then used to compare the sampling properties of Bayesian point and interval estimators with those of their maximum likelihood-based counterparts. Inference for the half-t distribution based on the use of Gibbs sampling is outlined, and an approach to model comparison based on the use of Bayes factors is discussed. The fitting of the half-normal and half-t models is illustrated using real data on the body fat measurements of elite athletes.
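
The tractability claimed for the half-normal model can be illustrated directly. Under a Jeffreys-type prior p(sigma^2) proportional to 1/sigma^2 (an assumption here; the paper's uninformative prior may differ), the posterior of sigma^2 given half-normal data is inverse gamma and can be sampled exactly, with no Gibbs sampling, as in this Python sketch with hypothetical data:

import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Hypothetical half-normal sample with true scale sigma = 2.
sigma_true = 2.0
x = np.abs(rng.normal(0.0, sigma_true, size=50))

# With p(sigma^2) proportional to 1/sigma^2, the posterior of sigma^2 is
# Inverse-Gamma(n/2, sum(x^2)/2), so it can be drawn from directly.
n = x.size
sigma2 = stats.invgamma.rvs(n / 2.0, scale=np.sum(x**2) / 2.0,
                            size=10_000, random_state=rng)
sigma = np.sqrt(sigma2)
print("posterior mean of sigma:", sigma.mean())
print("95% credible interval:", np.percentile(sigma, [2.5, 97.5]))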

8.
We study statistical procedures to quantify uncertainty in multivariate climate projections based on several deterministic climate models. We introduce two different assumptions – called constant bias and constant relation respectively – for extrapolating the substantial additive and multiplicative biases present during the control period to the scenario period. There are also strong indications that the biases in the scenario period are different from the extrapolations from the control period. Including such changes in the statistical models leads to an identifiability problem that we solve in a frequentist analysis using a zero sum side condition and in a Bayesian analysis using informative priors. The Bayesian analysis provides estimates of the uncertainty in the parameter estimates and takes this uncertainty into account for the predictive distributions. We illustrate the method by analysing projections of seasonal temperature and precipitation in the Alpine region from five regional climate models in the PRUDENCE project.
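
The two extrapolation assumptions are easy to state numerically: under "constant bias" the additive control-period discrepancy between model and observations is carried unchanged into the scenario period, while under "constant relation" the multiplicative ratio is carried over instead. The Python sketch below uses invented numbers purely to show the arithmetic; it does not reproduce the identifiability treatment or the hierarchical model applied to the PRUDENCE output.

import numpy as np

# Hypothetical control-period observations, one model's control-period output,
# and the same model's scenario-period output (toy seasonal means).
obs_control = np.array([10.0, 11.5, 9.8, 10.7])
mod_control = np.array([12.1, 13.0, 11.6, 12.4])
mod_scenario = np.array([14.0, 15.2, 13.5, 14.6])

# Constant bias: subtract the additive control-period bias from the scenario output.
additive_bias = mod_control.mean() - obs_control.mean()
proj_constant_bias = mod_scenario - additive_bias

# Constant relation: rescale the scenario output by the control-period ratio.
ratio = obs_control.mean() / mod_control.mean()
proj_constant_relation = mod_scenario * ratio

print("constant bias projection:    ", proj_constant_bias)
print("constant relation projection:", proj_constant_relation)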

9.
Bayesian analysis of a bivariate survival model based on exponential distributions is discussed using both vague and conjugate prior distributions. Parameter and reliability estimators are given for the maximum likelihood technique and the Bayesian approach using both types of priors. A Monte Carlo study indicates the vague prior Bayes estimator of reliability performs better than its maximum likelihood counterpart.
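
For the exponential building block of such a model, both the maximum likelihood and the vague-prior Bayes estimators of the reliability R(t) = exp(-lambda*t) have closed forms, which the following Python sketch compares on hypothetical univariate data. The prior p(lambda) proportional to 1/lambda is only one common choice of vague prior and is an assumption here, as is the reduction to the univariate case.

import numpy as np

rng = np.random.default_rng(3)

# Hypothetical exponential lifetimes with true rate 0.5; evaluate reliability at t0.
lam_true, t0, n = 0.5, 1.0, 20
x = rng.exponential(1.0 / lam_true, size=n)
S = x.sum()

# Maximum likelihood: plug lambda_hat = n / S into R(t0) = exp(-lambda * t0).
r_ml = np.exp(-(n / S) * t0)

# Vague prior p(lambda) proportional to 1/lambda gives a Gamma(n, rate=S) posterior,
# and the squared-error Bayes estimate of R(t0) is its Laplace transform (S/(S+t0))^n.
r_bayes = (S / (S + t0)) ** n

print("true R(t0):    ", np.exp(-lam_true * t0))
print("ML estimate:   ", r_ml)
print("Bayes estimate:", r_bayes)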

10.
Multivariate model validation is a complex decision-making problem involving comparison of multiple correlated quantities, based upon the available information and prior knowledge. This paper presents a Bayesian risk-based decision method for validation assessment of multivariate predictive models under uncertainty. A generalized likelihood ratio is derived as a quantitative validation metric based on Bayes’ theorem and Gaussian distribution assumption of errors between validation data and model prediction. The multivariate model is then assessed based on the comparison of the likelihood ratio with a Bayesian decision threshold, a function of the decision costs and prior of each hypothesis. The probability density function of the likelihood ratio is constructed using the statistics of multiple response quantities and Monte Carlo simulation. The proposed methodology is implemented in the validation of a transient heat conduction model, using a multivariate data set from experiments. The Bayesian methodology provides a quantitative approach to facilitate rational decisions in multivariate model assessment under uncertainty.
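
A heavily simplified Python sketch of the decision logic follows: a Gaussian likelihood ratio for the validation residuals is compared against a threshold built from decision costs and hypothesis priors. The residuals, the error covariance, the way the two hypotheses are parameterised and the cost values are all assumptions for illustration and do not reproduce the paper's derivation or its heat conduction application.

import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Hypothetical validation residuals (data minus model prediction) for
# 30 experiments and 3 correlated response quantities.
d = rng.normal(0.1, 0.2, size=(30, 3))
Sigma0 = 0.04 * np.eye(3)          # assumed error covariance under "model valid"

# Log likelihood ratio of "model valid" (zero-mean errors) versus an alternative
# centred at the observed mean residual.
ll_valid = stats.multivariate_normal.logpdf(d, mean=np.zeros(3), cov=Sigma0).sum()
ll_invalid = stats.multivariate_normal.logpdf(d, mean=d.mean(axis=0), cov=Sigma0).sum()
log_lr = ll_valid - ll_invalid

# Bayesian decision threshold from the costs of accepting an invalid model or
# rejecting a valid one, and the prior probability of each hypothesis.
p_valid, p_invalid = 0.5, 0.5
c_accept_invalid, c_reject_valid = 1.0, 1.0
log_threshold = np.log((c_accept_invalid * p_invalid) / (c_reject_valid * p_valid))
print("accept model" if log_lr > log_threshold else "reject model")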

11.
The Bayesian analysis based on the partial likelihood for Cox's proportional hazards model is frequently used because of its simplicity. The Bayesian partial likelihood approach is often justified by showing that it approximates the full Bayesian posterior of the regression coefficients with a diffuse prior on the baseline hazard function. This, however, may not be appropriate when ties exist among uncensored observations. In that case, the full Bayesian and Bayesian partial likelihood posteriors can be very different. In this paper, we propose a new Bayesian partial likelihood approach for many tied observations and justify its use.

12.
In this paper, we present a Bayesian approach for inference from accelerated life tests when the underlying life model is Weibull. Our approach is based on the General Linear Models framework of West, Harrison and Migon (1985). We discuss inference for the model and show that computable results can be obtained using linear Bayesian methods. We illustrate the usefulness of our approach by applying it to some actual data from accelerated life tests.

13.
In this article, static light scattering (SLS) measurements are processed to estimate the particle size distribution of particle systems, incorporating prior information obtained from an alternative experimental technique: scanning electron microscopy (SEM). For this purpose we propose two Bayesian schemes (one parametric and another non-parametric) to solve the stated light scattering problem, and we take advantage of the obtained results to summarize some features of the Bayesian approach within the context of inverse problems. The features presented in this article include the improvement of the results when useful prior information from an alternative experiment is considered instead of a non-informative prior, as occurs in a deterministic maximum likelihood estimation. This improvement is shown in terms of accuracy and precision of the corresponding results, and also in terms of minimizing the effect of multiple minima by including significant information in the optimization. Both Bayesian schemes are implemented using Markov chain Monte Carlo methods. They have been developed on the basis of the Metropolis–Hastings (MH) algorithm using Matlab® and are tested with the analysis of simulated and experimental examples of concentrated and semi-concentrated particles. In the simulated examples, SLS measurements were generated using a rigorous model, while the inversion stage was solved using an approximate model in both schemes and also using the rigorous model in the parametric scheme. Priors from SEM micrographs were obtained both experimentally and by simulation, with the simulated priors generated using a Monte Carlo routine. In addition to the presentation of these features of the Bayesian approach, some other topics are discussed, such as regularization and some implementation issues of the proposed schemes, among which we highlight the selection of the parameters used in the MH algorithm.

14.
We propose quantile regression (QR) in the Bayesian framework for a class of nonlinear mixed effects models with a known, parametric model form for longitudinal data. Estimation of the regression quantiles is based on a likelihood-based approach using the asymmetric Laplace density. Posterior computations are carried out via Gibbs sampling and the adaptive rejection Metropolis algorithm. To assess the performance of the Bayesian QR estimator, we compare it with the mean regression estimator using real and simulated data. Results show that the Bayesian QR estimator provides a fuller examination of the shape of the conditional distribution of the response variable. Our approach is proposed for parametric nonlinear mixed effects models, and therefore may not be generalized to models without a given model form.
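
The role of the asymmetric Laplace density is easiest to see in a stripped-down setting. The Python sketch below fits a single straight-line quantile with a random-walk Metropolis sampler and flat priors; the data, priors and tuning are invented, and the paper's actual setting (nonlinear mixed effects models fitted by Gibbs sampling with adaptive rejection Metropolis steps) is not reproduced here.

import numpy as np

rng = np.random.default_rng(5)

def ald_loglik(y, mu, tau, sigma=1.0):
    # Asymmetric Laplace working log-likelihood built from the check (pinball) loss
    # rho_tau(u) = u * (tau - I(u < 0)): maximising it targets the tau-th quantile.
    u = y - mu
    return -np.sum(u * (tau - (u < 0))) / sigma

# Hypothetical heteroscedastic data whose tau-quantile is linear in x.
x = rng.uniform(0.0, 1.0, 200)
y = 1.0 + 2.0 * x + rng.normal(0.0, 0.3 + 0.5 * x)
tau = 0.9

# Random-walk Metropolis over (intercept, slope) with flat priors.
beta = np.zeros(2)
cur = ald_loglik(y, beta[0] + beta[1] * x, tau)
draws = []
for _ in range(20_000):
    prop = beta + rng.normal(0.0, 0.05, size=2)
    new = ald_loglik(y, prop[0] + prop[1] * x, tau)
    if np.log(rng.uniform()) < new - cur:
        beta, cur = prop, new
    draws.append(beta)
draws = np.array(draws[5_000:])
print("posterior means for the 0.9-quantile line:", draws.mean(axis=0))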

15.
Based on an ordered ranked set sample, Bayesian estimation of the model parameter as well as prediction of the unobserved data from the Rayleigh distribution are studied. The Bayes estimates of the parameter involved are obtained using both squared error and asymmetric loss functions. The Bayesian prediction approach is considered for predicting the unobserved lifetimes based on a two-sample prediction problem. A real-life dataset and a simulation study are used to illustrate our procedures.
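
To make the two loss functions concrete, here is a Python sketch for an ordinary (simple random) Rayleigh sample. The ordered ranked set sampling design and the two-sample prediction step are not reproduced; the vague prior and the LINEX form used for the asymmetric loss are assumptions chosen only for illustration.

import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

# Hypothetical Rayleigh lifetimes with true scale 1.5; work with theta = scale^2.
x = rng.rayleigh(1.5, size=40)
n, b = x.size, np.sum(x**2) / 2.0

# With a vague prior p(theta) proportional to 1/theta, the posterior is Inverse-Gamma(n, b).
theta = stats.invgamma.rvs(n, scale=b, size=20_000, random_state=rng)

# Squared-error loss: the Bayes estimate is the posterior mean.
theta_se = theta.mean()

# Asymmetric LINEX loss exp(a*(d - theta)) - a*(d - theta) - 1: the Bayes estimate
# is -(1/a) * log E[exp(-a * theta)], penalising over- and under-estimation unequally.
a = 1.0
theta_linex = -np.log(np.mean(np.exp(-a * theta))) / a

print("squared-error estimate of theta:", theta_se)
print("LINEX estimate of theta:        ", theta_linex)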

16.
The Bayesian analysis of outliers using a non-informative prior for the parameters is non-trivial because models with different numbers of outliers have different dimensions. A quasi-Bayesian approach based on Akaike's predictive likelihood is proposed for the analysis of regression outliers. It overcomes the dimensionality problem in Bayesian outlier analysis by compensating the likelihood of the outlier model with a correction factor adjusted for the number of outliers. The stack loss data set is analysed with satisfactory results.

17.
This paper presents a Bayesian analysis of partially linear additive models for quantile regression. We develop a semiparametric Bayesian approach to quantile regression models using a spectral representation of the nonparametric regression functions and the Dirichlet process (DP) mixture for the error distribution. We also consider Bayesian variable selection procedures for both parametric and nonparametric components in a partially linear additive model structure, based on Bayesian shrinkage priors via a stochastic search algorithm. Based on the proposed Bayesian semiparametric additive quantile regression model, referred to as BSAQ, Bayesian inference is considered for estimation and model selection. For the posterior computation, we design a simple and efficient Gibbs sampler based on a location-scale mixture of exponential and normal distributions for an asymmetric Laplace distribution, which facilitates the commonly used collapsed Gibbs sampling algorithms for DP mixture models. Additionally, we discuss the asymptotic properties of the semiparametric quantile regression model in terms of consistency of the posterior distribution. Simulation studies and real data application examples illustrate the proposed method and compare it with Bayesian quantile regression methods in the literature.

18.
The Finnish common toad data of Heikkinen and Hogmander are reanalysed using an alternative fully Bayesian model that does not require a pseudolikelihood approximation and an alternative prior distribution for the true presence or absence status of toads in each 10 km×10 km square. Markov chain Monte Carlo methods are used to obtain posterior probability estimates of the square-specific presences of the common toad and these are presented as a map. The results are different from those of Heikkinen and Hogmander and we offer an explanation in terms of the prior used for square-specific presence of the toads. We suggest that our approach is more faithful to the data and avoids unnecessary confounding of effects. We demonstrate how to extend our model efficiently with square-specific covariates and illustrate this by introducing deterministic spatial changes.

19.
The usual practice when using a Bayesian control chart to monitor a process is to take samples from the process at fixed sampling intervals. Recent studies on traditional control charts have shown that a variable sampling interval (VSI) scheme helps practitioners detect process shifts more quickly than the classical fixed ratio sampling (FRS) scheme. In this paper, the effect of the VSI scheme on the performance of the Bayesian control chart is studied under both economic design (ED) and economic-statistical design (ESD). A Monte Carlo method and an artificial bee colony algorithm are used to obtain the optimal design parameters of the Bayesian control chart (sample size, sampling intervals, warning limit and control limit), since the chart statistic of this approach does not have any specified distribution. Finally, the VSI Bayesian control chart is compared with the FRS Bayesian and VSI X-bar approaches under ED and ESD separately. The results show that the VSI Bayesian scheme performs better than the FRS Bayesian and VSI X-bar approaches.

20.
In recent years, there has been considerable interest in regression models based on zero-inflated distributions. These models are commonly encountered in many disciplines, such as medicine, public health, and environmental sciences, among others. The zero-inflated Poisson (ZIP) model has typically been considered for these types of problems. However, the ZIP model can fail if the non-zero counts are overdispersed in relation to the Poisson distribution, in which case the zero-inflated negative binomial (ZINB) model may be more appropriate. In this paper, we present a Bayesian approach for fitting the ZINB regression model. This model considers that an observed zero may come from a point mass distribution at zero or from the negative binomial model. The likelihood function is used not only to compute several Bayesian model selection measures, but also to develop Bayesian case-deletion influence diagnostics based on q-divergence measures. The approach can be easily implemented using standard Bayesian software, such as WinBUGS. The performance of the proposed method is evaluated with a simulation study. Further, a real data set is analyzed, where we show that the ZINB regression model seems to fit the data better than its Poisson counterpart.
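
The mixture structure described above translates directly into a likelihood: an observed zero arises either from the point mass at zero, with probability pi, or from the negative binomial component. The Python sketch below writes out the ZINB log-likelihood in a mean-dispersion parameterisation and evaluates it on invented counts; the regression structure, priors and WinBUGS implementation of the paper are not reproduced, so everything here is illustrative only.

import numpy as np
from scipy import stats

def zinb_loglik(y, pi, mu, k):
    # Zero-inflated negative binomial: a zero is structural with probability pi,
    # otherwise the count follows a negative binomial with mean mu and dispersion
    # (size) k, i.e. variance mu + mu**2 / k.
    p = k / (k + mu)                        # scipy's success-probability parameter
    nb_zero = stats.nbinom.pmf(0, k, p)     # probability of an incidental zero
    ll = np.where(y == 0,
                  np.log(pi + (1.0 - pi) * nb_zero),
                  np.log(1.0 - pi) + stats.nbinom.logpmf(y, k, p))
    return ll.sum()

# Toy evaluation on hypothetical counts with an excess of zeros.
y = np.array([0, 0, 0, 0, 1, 2, 0, 5, 0, 3, 0, 7])
print(zinb_loglik(y, pi=0.3, mu=2.5, k=1.5))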
