Similar Documents
20 similar documents found (search time: 843 ms)
1.
A method is suggested to estimate posterior model probabilities and model-averaged parameters via MCMC sampling under a Bayesian approach. The estimates use pooled output for J models (J > 1), whereby all models are updated at each iteration. Posterior probabilities are based on averages of continuous weights obtained for each model at each iteration, while samples of averaged parameters are obtained from iteration-specific averages based on these weights. Parallel sampling of models assists in deriving posterior densities for parameter contrasts between models and in assessing hypotheses regarding model-averaged parameters. Four worked examples illustrate application of the approach: two involving fixed-effect regression and two involving random effects.
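A minimal sketch of the parallel-sampling idea with per-iteration continuous weights, under simplifying assumptions of my own (two conjugate normal models, weights proportional to likelihood × prior at each model's current draw); this is an illustration, not necessarily the paper's exact scheme:

```python
import math
import random

random.seed(1)

y = [0.1, -0.3, 0.2, 0.05, -0.1]          # toy data, close to mean 0
n = len(y)
ybar = sum(y) / n

def log_norm(x, m, v):
    # log density of N(m, v) at x
    return -0.5 * math.log(2 * math.pi * v) - (x - m) ** 2 / (2 * v)

T = 2000
w_sum = [0.0, 0.0]
for _ in range(T):
    # Model 1: y ~ N(0, 1), no free parameters
    lw1 = sum(log_norm(v, 0.0, 1.0) for v in y)
    # Model 2: y ~ N(mu, 1), mu ~ N(0, 1); update mu from its conjugate posterior
    post_var = 1.0 / (n + 1)
    mu = random.gauss(n * ybar * post_var, math.sqrt(post_var))
    lw2 = sum(log_norm(v, mu, 1.0) for v in y) + log_norm(mu, 0.0, 1.0)
    # Continuous weights for this iteration, normalized across models
    m = max(lw1, lw2)
    w1, w2 = math.exp(lw1 - m), math.exp(lw2 - m)
    s = w1 + w2
    w_sum[0] += w1 / s
    w_sum[1] += w2 / s

# Posterior model probability estimates: averages of the per-iteration weights
probs = [w / T for w in w_sum]
```

Model-averaged parameter samples would analogously be formed as per-iteration weighted averages of the models' current parameter draws.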

2.
Bayesian Semiparametric Regression for Median Residual Life
Abstract.  With survival data there is often interest not only in the survival time distribution but also in the residual survival time distribution. In fact, regression models to explain residual survival time might be desired. Building upon recent work of Kottas & Gelfand [J. Amer. Statist. Assoc. 96 (2001) 1458], we formulate a semiparametric median residual life regression model induced by a semiparametric accelerated failure time regression model. We utilize a Bayesian approach which allows full and exact inference. Classical work essentially ignores covariates and is either based upon parametric assumptions or is limited to asymptotic inference in non-parametric settings. No regression modelling of median residual life appears to exist. The Bayesian modelling is developed through Dirichlet process mixing. The models are fitted using Gibbs sampling. Residual life inference is implemented extending the approach of Gelfand & Kottas [J. Comput. Graph. Statist. 11 (2002) 289]. Finally, we present a fairly detailed analysis of a set of survival times with moderate censoring for patients with small cell lung cancer.

3.
The Gibbs sampler has been proposed as a general method for Bayesian calculation in Gelfand and Smith (1990). However, most experience to date has been with applications assuming conjugacy, where implementation is reasonably straightforward. This paper describes a tailored approximate rejection method for implementing the Gibbs sampler when nonconjugate structure is present. Several challenging applications are presented for illustration.

4.
The problem of interest is to estimate the home run ability of 12 great major league players. The usual career home run statistics are the total number of home runs hit and the overall rate at which the players hit them. The observed rate provides a point estimate for a player's “true” rate of hitting a home run. However, this point estimate is incomplete in that it ignores sampling errors, it includes seasons where the player has unusually good or poor performances, and it ignores the general pattern of performance of a player over his career. The observed rate statistic also does not distinguish between the peak and career performance of a given player. Given the random effects model of West (1985), one can detect aberrant seasons and estimate parameters of interest by the inspection of various posterior distributions. Posterior moments of interest are easily computed by the application of the Gibbs sampling algorithm (Gelfand and Smith 1990). A player's career performance is modeled using a log-linear model, and peak and career home run measures for the 12 players are estimated.

5.
The multivariate regression model is considered with p regressors. A latent vector with p binary entries serves to identify one of two types of regression coefficients: those close to 0 and those not. Specializing our general distributional setting to the linear model with Gaussian errors and using natural conjugate prior distributions, we derive the marginal posterior distribution of the binary latent vector. Fast algorithms aid its direct computation, and in high dimensions these are supplemented by a Markov chain Monte Carlo approach to sampling from the known posterior distribution. Problems with hundreds of regressor variables become quite feasible. We give a simple method of assigning the hyperparameters of the prior distribution. The posterior predictive distribution is derived and the approach illustrated on compositional analysis of data involving three sugars with 160 near infrared absorbances as regressors.
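Direct computation of the marginal posterior over the binary latent vector can be sketched for small p by exhaustive enumeration. The sketch below swaps in a Zellner g-prior (which gives a closed-form marginal likelihood) for the paper's natural conjugate setup, and uses toy data in which only the first and third regressors matter:

```python
import itertools
import math

import numpy as np

rng = np.random.default_rng(0)

# Toy data: only x0 and x2 carry signal.
n, p = 60, 4
X = rng.standard_normal((n, p))
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.standard_normal(n)
y = y - y.mean()
g = float(n)  # unit-information g-prior

def log_marginal(gamma):
    # Log marginal likelihood (up to a constant) under a Zellner g-prior
    # on the selected coefficients and a flat prior on log sigma^2.
    k = sum(gamma)
    if k == 0:
        return -0.5 * n * math.log(y @ y)
    Xg = X[:, [j for j, b in enumerate(gamma) if b]]
    beta_hat, *_ = np.linalg.lstsq(Xg, y, rcond=None)
    fit = y @ Xg @ beta_hat                     # y' X_g beta_hat = fitted sum of squares
    return (-0.5 * k * math.log(1 + g)
            - 0.5 * n * math.log(y @ y - g / (1 + g) * fit))

# Enumerate all 2^p binary latent vectors and normalize.
models = list(itertools.product([0, 1], repeat=p))
lm = np.array([log_marginal(gm) for gm in models])
post = np.exp(lm - lm.max())
post /= post.sum()
best = models[int(post.argmax())]
```

For hundreds of regressors, as in the paper, this enumeration is infeasible and MCMC over the latent vector takes its place.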

6.
Empirical Bayes approaches have often been applied to the problem of estimating small-area parameters. As a compromise between synthetic and direct survey estimators, an estimator based on an empirical Bayes procedure is not subject to the large bias that is sometimes associated with a synthetic estimator, nor is it as variable as a direct survey estimator. Although the point estimates perform very well, naïve empirical Bayes confidence intervals tend to be too short to attain the desired coverage probability, since they fail to incorporate the uncertainty which results from having to estimate the prior distribution. Several alternative methodologies for interval estimation which correct for the deficiencies associated with the naïve approach have been suggested. Laird and Louis (1987) proposed three types of bootstrap for correcting naïve empirical Bayes confidence intervals. Calling the methodology of Laird and Louis (1987) an unconditional bias-corrected naïve approach, Carlin and Gelfand (1991) suggested a modification to the Type III parametric bootstrap which corrects for bias in the naïve intervals by conditioning on the data. Here we empirically evaluate the Type II and Type III bootstrap proposed by Laird and Louis, as well as the modification suggested by Carlin and Gelfand (1991), with the objective of examining coverage properties of empirical Bayes confidence intervals for small-area proportions.

7.
A Bayesian model consists of two elements: a sampling model and a prior density. The problem of selecting a prior density is nothing but the problem of selecting a Bayesian model where the sampling model is fixed. A predictive approach is used through a decision problem where the loss function is the squared L2 distance between the sampling density and the posterior predictive density, because the aim of the method is to choose the prior that provides a posterior predictive density as good as possible. An algorithm is developed for solving the problem; this algorithm is based on Lavine's linearization technique.

8.
Empirical Bayes (EB) methodology is now widely used in statistics. However, construction of EB confidence intervals is still very limited. Following Cox (1975), Hill (1990) and Carlin & Gelfand (1990, 1991), we consider EB confidence intervals which are adjusted so that the actual coverage probabilities asymptotically meet the target coverage probabilities up to the second order. We consider both unconditional and conditional coverage, conditioning being done with respect to an ancillary statistic.

9.
The term ‘small area’ or ‘small domain’ commonly denotes a small geographical area with a small subpopulation within a larger area. Small area estimation is an important topic in survey sampling because of the growing demand for better statistical inference for small areas in public and private surveys. In small area estimation problems the focus is on how to borrow strength across areas in order to develop a reliable estimator that makes use of available auxiliary information. Some traditional methods for small area problems, such as empirical best linear unbiased prediction, borrow strength through linear models that provide links to related areas, which may not be appropriate for some survey data. In this article, we propose a stepwise Bayes approach which borrows strength through an objective posterior distribution. This approach results in a generalized constrained Dirichlet posterior estimator when auxiliary information is available for small areas. The objective posterior distribution is based only on the assumption of exchangeability across related areas and does not make any explicit model assumptions. The form of our posterior distribution allows us to assign a weight to each member of the sample. These weights can then be used in a straightforward fashion to make inferences about the small area means. Theoretically, the stepwise Bayes character of the posterior allows one to prove the admissibility of the point estimators, suggesting that inferential procedures based on this approach will tend to have good frequentist properties. Numerically, we demonstrate in simulations that the proposed stepwise Bayes approach can have substantial strengths compared to traditional methods.

10.

Bayesian analysis often concerns an evaluation of models with different dimensionality, as is necessary in, for example, model selection or mixture models. To facilitate this evaluation, transdimensional Markov chain Monte Carlo (MCMC) relies on sampling a discrete indexing variable to estimate the posterior model probabilities. However, little attention has been paid to the precision of these estimates. If only a few switches occur between the models in the transdimensional MCMC output, precision may be low and assessment based on the assumption of independent samples misleading. Here, we propose a new method to estimate the precision based on the observed transition matrix of the model-indexing variable. Assuming a first-order Markov model, the method samples from the posterior of the stationary distribution. This allows assessment of the uncertainty in the estimated posterior model probabilities, model ranks, and Bayes factors. Moreover, the method provides an estimate for the effective sample size of the MCMC output. In two model selection examples, we show that the proposed approach provides a good assessment of the uncertainty associated with the estimated posterior model probabilities.
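Under the stated first-order Markov assumption, the idea can be sketched as follows: place a Dirichlet posterior on each row of the observed transition matrix of the model-indexing variable, then sample stationary distributions to quantify uncertainty in the posterior model probabilities. The helper names and toy chain below are illustrative, not the authors' code:

```python
import numpy as np

rng = np.random.default_rng(42)

def stationary(P):
    # Stationary distribution: left eigenvector of P for eigenvalue 1,
    # normalized to a probability vector.
    vals, vecs = np.linalg.eig(P.T)
    v = np.abs(np.real(vecs[:, np.argmax(np.real(vals))]))
    return v / v.sum()

def posterior_model_probs(index_chain, n_models, draws=1000):
    # Count observed transitions of the model-indexing variable.
    C = np.zeros((n_models, n_models))
    for a, b in zip(index_chain[:-1], index_chain[1:]):
        C[a, b] += 1
    samples = np.empty((draws, n_models))
    for d in range(draws):
        # Dirichlet posterior row by row (uniform prior on each row).
        P = np.vstack([rng.dirichlet(C[i] + 1.0) for i in range(n_models)])
        samples[d] = stationary(P)
    return samples

# Toy transdimensional output: mostly in model 0, few switches to model 1.
chain = [0] * 50 + [1] * 10 + [0] * 30 + [1] * 10
s = posterior_model_probs(chain, 2)
est, sd = s.mean(axis=0), s.std(axis=0)   # model probabilities with uncertainty
```

The spread `sd` reflects how few switches the chain made, which a naive independent-samples calculation would understate.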


11.
For noninformative nonparametric estimation of finite population quantiles under simple random sampling, estimation based on the Polya posterior is similar to estimation based on the Bayesian approach developed by Ericson (J. Roy. Statist. Soc. Ser. B 31 (1969) 195) in that the Polya posterior distribution is the limit of Ericson's posterior distributions as the weight placed on the prior distribution diminishes. Furthermore, Polya posterior quantile estimates can be shown to be admissible under certain conditions. We demonstrate the admissibility of the sample median as an estimate of the population median under such a set of conditions. As with Ericson's Bayesian approach, Polya posterior-based interval estimates for population quantiles are asymptotically equivalent to the interval estimates obtained from standard frequentist approaches. In addition, for small to moderate sized populations, Polya posterior-based interval estimates for quantiles of a continuous characteristic of interest tend to agree with the standard frequentist interval estimates.
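The Polya posterior can be simulated directly: repeatedly complete the finite population by Polya urn resampling from the observed sample and record the median of each completed population. A sketch with an illustrative sample and population size:

```python
import numpy as np

rng = np.random.default_rng(0)

def polya_posterior_median(sample, N, draws=2000):
    # For each draw, complete a population of size N via the Polya urn:
    # each unobserved unit is drawn uniformly from the current urn and a
    # copy is added back, reinforcing values already drawn.
    medians = np.empty(draws)
    for d in range(draws):
        urn = list(sample)
        for _ in range(N - len(sample)):
            urn.append(urn[rng.integers(len(urn))])
        medians[d] = np.median(urn)
    return medians

sample = [1, 2, 3, 4, 5]            # observed simple random sample
medians = polya_posterior_median(sample, N=20)
# The empirical distribution of `medians` is the Polya posterior of the
# population median; its quantiles give interval estimates.
```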

12.
Lin, Tsung I., Lee, Jack C. and Ni, Huey F. Statistics and Computing (2004) 14(2), 119–130
A finite mixture model using the multivariate t distribution has been shown to be a robust extension of normal mixtures. In this paper, we present a Bayesian approach for inference about the parameters of t-mixture models. The prior distributions are specified to be weakly informative so as to avoid nonintegrable posterior distributions. We present two efficient EM-type algorithms for computing the joint posterior mode with the observed data and an incomplete future vector as the sample. Markov chain Monte Carlo sampling schemes are also developed to obtain the target posterior distribution of the parameters. The advantages of the Bayesian approach over the maximum likelihood method are demonstrated on a set of real data.

13.
This paper presents a Bayesian analysis of partially linear additive models for quantile regression. We develop a semiparametric Bayesian approach to quantile regression models using a spectral representation of the nonparametric regression functions and the Dirichlet process (DP) mixture for the error distribution. We also consider Bayesian variable selection procedures for both parametric and nonparametric components in a partially linear additive model structure based on Bayesian shrinkage priors via a stochastic search algorithm. Based on the proposed Bayesian semiparametric additive quantile regression model, referred to as BSAQ, Bayesian inference is considered for estimation and model selection. For the posterior computation, we design a simple and efficient Gibbs sampler based on a location-scale mixture of exponential and normal distributions for an asymmetric Laplace distribution, which facilitates the commonly used collapsed Gibbs sampling algorithms for DP mixture models. Additionally, we discuss the asymptotic properties of the semiparametric quantile regression model in terms of consistency of the posterior distribution. Simulation studies and real data application examples illustrate the proposed method and compare it with Bayesian quantile regression methods in the literature.
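The location-scale mixture representation underlying such Gibbs samplers can be sketched directly: a standard asymmetric Laplace draw at quantile level p is y = θz + τ√z·u with z ~ Exp(1), u ~ N(0,1), θ = (1−2p)/(p(1−p)) and τ² = 2/(p(1−p)), so that P(Y ≤ 0) = p. A minimal sketch (my own helper name):

```python
import math
import random

random.seed(0)

def ald_draw(mu=0.0, sigma=1.0, p=0.5):
    # Asymmetric Laplace via its exponential-normal mixture:
    # y = mu + sigma * (theta*z + tau*sqrt(z)*u), z ~ Exp(1), u ~ N(0, 1).
    theta = (1 - 2 * p) / (p * (1 - p))
    tau = math.sqrt(2.0 / (p * (1 - p)))
    z = random.expovariate(1.0)
    u = random.gauss(0.0, 1.0)
    return mu + sigma * (theta * z + tau * math.sqrt(z) * u)

# Check the defining property: mu is the p-th quantile of the distribution.
draws = [ald_draw(p=0.25) for _ in range(20000)]
frac_below = sum(d < 0 for d in draws) / len(draws)   # should be near 0.25
```

In the Gibbs sampler, conditioning on the latent z turns the asymmetric Laplace likelihood into a normal one, which is what makes the conditional updates conjugate.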

14.
For small area estimation of area‐level data, the Fay–Herriot model is extensively used as a model‐based method. In the Fay–Herriot model, it is conventionally assumed that the sampling variances are known, whereas estimators of the sampling variances are used in practice. Thus, the assumption of known sampling variances is unrealistic, and several methods have been proposed to overcome this problem. In this paper, we assume the situation where direct estimators of the sampling variances are available as well as the sample means. Using this information, we propose a Bayesian yet objective method producing shrinkage estimation of both means and variances in the Fay–Herriot model. We consider a hierarchical structure for the sampling variances, and we set uniform priors on the model parameters to keep the proposed model objective. For validity of the posterior inference, we show under mild conditions that the posterior distribution is proper and has finite variances. We investigate the numerical performance through simulation and empirical studies.

15.
We consider continuous-time Markovian processes where populations of individual agents interact stochastically according to kinetic rules. Despite the increasing prominence of such models in fields ranging from biology to smart cities, Bayesian inference for such systems remains challenging, as these are continuous-time, discrete-state systems with potentially infinite state space. Here we propose a novel efficient algorithm for joint state/parameter posterior sampling in population Markov jump processes. We introduce a class of pseudo-marginal sampling algorithms based on a random truncation method which enables a principled treatment of infinite state spaces. Extensive evaluation on a number of benchmark models shows that this approach achieves considerable savings compared to state-of-the-art methods, retaining accuracy and fast convergence. We also present results on a synthetic biology data set showing the potential practical usefulness of our work.

16.
Abstract

In this paper we develop a Bayesian analysis for the nonlinear regression model with errors that follow a continuous autoregressive process. In this way, unequally spaced observations do not present a problem in the analysis. We employ the Gibbs sampler (see Gelfand, A. and Smith, A. (1990). Sampling based approaches to calculating marginal densities. J. Amer. Statist. Assoc. 85:398–409) as the foundation for making Bayesian inferences. We illustrate these Bayesian inferences with an analysis of a real data set. Using these same data, we contrast the Bayesian approach with a generalized least squares technique.
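A minimal two-block Gibbs sampler in the spirit of Gelfand & Smith (1990), for a normal model with unknown mean and variance rather than the paper's continuous autoregressive error model (data and priors are illustrative):

```python
import math
import random

random.seed(2)

# Model: y_i ~ N(mu, v), flat prior on mu, v ~ Inverse-Gamma(a, b).
y = [1.2, 0.8, 1.5, 0.9, 1.1, 1.3, 0.7, 1.0]
n = len(y)
ybar = sum(y) / n
a, b = 2.0, 1.0

mu, v = ybar, 1.0
mus, vs = [], []
for t in range(3000):
    # Full conditional: mu | v, y ~ N(ybar, v/n)
    mu = random.gauss(ybar, math.sqrt(v / n))
    # Full conditional: v | mu, y ~ Inv-Gamma(a + n/2, b + 0.5 * sum (y_i - mu)^2)
    shape = a + n / 2
    rate = b + 0.5 * sum((yi - mu) ** 2 for yi in y)
    v = rate / random.gammavariate(shape, 1.0)
    if t >= 500:                      # discard burn-in
        mus.append(mu)
        vs.append(v)

post_mean = sum(mus) / len(mus)       # approximates E[mu | y]
```

Alternating draws from the two full conditionals yields, after burn-in, samples from the joint posterior; marginal densities then come from the retained draws, exactly the calculation the Gibbs sampler is used for here.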

17.

Item response models are essential tools for analyzing results from many educational and psychological tests. Such models are used to quantify the probability of a correct response as a function of unobserved examinee ability and other parameters explaining the difficulty and the discriminatory power of the questions in the test. Some of these models also incorporate a threshold parameter for the probability of a correct response to account for the effect of guessing in multiple-choice tests. In this article we consider fitting such models using the Gibbs sampler. A data augmentation method to analyze a normal-ogive model incorporating a threshold guessing parameter is introduced and compared with a Metropolis–Hastings sampling method. The proposed method is an order of magnitude more efficient than the existing method. Another objective of this paper is to develop Bayesian model choice techniques for model discrimination. A predictive approach based on a variant of the Bayes factor is used and compared with another decision-theoretic method which minimizes an expected loss function on the predictive space. A classical model choice technique based on a modified likelihood ratio test statistic is shown to be one component of the second criterion. As a consequence, the Bayesian methods proposed in this paper are contrasted with the classical approach based on the likelihood ratio test. Several examples are given to illustrate the methods.

18.
We describe a novel stochastic search algorithm for rapidly identifying regions of high posterior probability in the space of decomposable, graphical and hierarchical log-linear models. Our approach is based on the Diaconis–Ylvisaker conjugate prior for log-linear parameters. We discuss the computation of Bayes factors through Laplace approximations and the Bayesian iterative proportional fitting algorithm for sampling model parameters. We use our model determination approach in a sparse eight-way contingency table.

19.
Area‐level unmatched sampling and linking models have been widely used as a model‐based method for producing reliable estimates of small‐area means. However, one practical difficulty is the specification of a link function. In this paper, we relax the assumption of a known link function by not specifying its form and estimating it from the data. A penalized‐spline method is adopted for estimating the link function, and a hierarchical Bayes method of estimating area means is developed using a Markov chain Monte Carlo method for posterior computations. Results of simulation studies comparing the proposed method with a conventional approach based on a known link function are presented. In addition, the proposed method is applied to data from the Survey of Family Income and Expenditure in Japan and poverty rates in Spanish provinces.

20.
Many study designs yield a variety of outcomes from each subject clustered within an experimental unit. When these outcomes are of mixed data types, it is challenging to jointly model the effects of covariates on the responses using traditional methods. In this paper, we develop a Bayesian approach for a joint regression model of the different outcome variables and show that the fully conditional posterior distributions obtained under the model assumptions allow estimation of the posterior distributions using a Gibbs sampling algorithm.
