Similar articles
20 similar articles found (search took 14 ms)
1.
Abstract. In this paper we propose fast approximate methods for computing posterior marginals in spatial generalized linear mixed models. We consider the common geostatistical case with a high dimensional latent spatial variable and observations at known registration sites. The methods of inference are deterministic, using no simulation-based inference. The first proposed approximation is fast to compute and is 'practically sufficient', meaning that results do not show any bias or dispersion effects that might affect decision making. Our second approximation, an improvement of the first version, is 'practically exact', meaning that one would have to run MCMC simulations for very much longer than is typically done to detect any indication of error in the approximate results. For small-count data the approximations are slightly worse, but still very accurate. Our methods are limited to likelihood functions that give unimodal full conditionals for the latent variable. The methods help to expand the future scope of non-Gaussian geostatistical models as illustrated by applications of model choice, outlier detection and sampling design. The approximations take seconds or minutes of CPU time, in sharp contrast to overnight MCMC runs for solving such problems.

2.
Summary. Structured additive regression models are perhaps the most commonly used class of models in statistical applications. This class includes, among others, (generalized) linear models, (generalized) additive models, smoothing spline models, state space models, semiparametric regression, spatial and spatiotemporal models, log-Gaussian Cox processes and geostatistical and geoadditive models. We consider approximate Bayesian inference in a popular subset of structured additive regression models, latent Gaussian models, where the latent field is Gaussian, controlled by a few hyperparameters and with non-Gaussian response variables. The posterior marginals are not available in closed form owing to the non-Gaussian response variables. For such models, Markov chain Monte Carlo methods can be implemented, but they are not without problems, in terms of both convergence and computational time. In some practical applications, the extent of these problems is such that Markov chain Monte Carlo sampling is simply not an appropriate tool for routine analysis. We show that, by using an integrated nested Laplace approximation and its simplified version, we can directly compute very accurate approximations to the posterior marginals. The main benefit of these approximations is computational: where Markov chain Monte Carlo algorithms need hours or days to run, our approximations provide more precise estimates in seconds or minutes. Another advantage of our approach is its generality, which makes it possible to perform Bayesian analysis in an automatic, streamlined way, and to compute model comparison criteria and various predictive measures so that models can be compared and the model under study can be challenged.
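The core computational device behind INLA-type methods is the Laplace approximation: replace the posterior of a latent Gaussian variable by a Gaussian centred at the posterior mode, with variance given by the curvature there. Below is a minimal, hedged sketch of that building block on a toy model of my own choosing (a single latent Gaussian with a Poisson observation); it is not the INLA algorithm of the paper, only an illustration of the idea.

```python
# Toy model (illustrative assumption, not from the paper):
#   x ~ N(0, tau^2),   y | x ~ Poisson(exp(x))
# Approximate p(x | y) by a Gaussian at the mode (Laplace approximation).
import numpy as np

def laplace_posterior(y, tau=1.0, iters=20):
    """Return (mode, sd) of the Gaussian approximation to p(x | y)."""
    x = 0.0
    for _ in range(iters):                       # Newton iterations on the log-posterior
        grad = y - np.exp(x) - x / tau**2        # d/dx [y*x - exp(x) - x^2/(2 tau^2)]
        hess = -np.exp(x) - 1.0 / tau**2
        x -= grad / hess
    return x, np.sqrt(-1.0 / hess)

mode, sd = laplace_posterior(y=3)
# Compare with a brute-force grid evaluation of the exact posterior (y=3, tau=1).
grid = np.linspace(-5, 5, 2001)
logpost = 3 * grid - np.exp(grid) - grid**2 / 2
post = np.exp(logpost - logpost.max())
post /= np.trapz(post, grid)
print(mode, sd, grid[np.argmax(post)])
```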

3.
This work extends the integrated nested Laplace approximation (INLA) method to latent models outside the scope of latent Gaussian models, where independent components of the latent field can have a near-Gaussian distribution. The proposed methodology is an essential component of a bigger project that aims to extend the R package INLA in order to allow the user to add flexibility and challenge the Gaussian assumptions of some of the model components in a straightforward and intuitive way. Our approach is applied to two examples, and the results are compared with those obtained by Markov chain Monte Carlo, showing similar accuracy with only a small fraction of the computational time. Implementation of the proposed extension is available in the R-INLA package.

4.
Linear mixed models (LMM) are frequently used to analyze repeated measures data because they are flexible in modelling the within-subject correlation often present in this type of data. The most popular LMM for continuous responses assumes that both the random effects and the within-subject errors are normally distributed, which can be an unrealistic assumption, obscuring important features of the variation present within and among the units (or groups). This work presents skew-normal linear mixed models (SNLMM) that relax the normality assumption by using a multivariate skew-normal distribution, which includes the normal distribution as a special case and provides robust estimation in mixed models. The MCMC scheme is derived, and the results of a simulation study are provided demonstrating that standard information criteria may be used to detect departures from normality. The procedures are illustrated using a real data set from a cholesterol study.
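As a small aside, the univariate skew-normal distribution that underlies such random effects has a simple stochastic representation due to Azzalini, which is sketched below. This is a generic illustration of how skew-normal variates can be generated, not the paper's multivariate construction or its MCMC scheme.

```python
# Draw from SN(loc, scale, alpha) via Z = delta*|U0| + sqrt(1-delta^2)*U1,
# with U0, U1 iid N(0,1) and delta = alpha / sqrt(1 + alpha^2).
import numpy as np

def rskewnorm(n, alpha, loc=0.0, scale=1.0, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    delta = alpha / np.sqrt(1.0 + alpha**2)
    u0, u1 = rng.standard_normal(n), rng.standard_normal(n)
    z = delta * np.abs(u0) + np.sqrt(1.0 - delta**2) * u1
    return loc + scale * z

x = rskewnorm(10_000, alpha=4.0)
print(x.mean(), x.std())   # right-skewed for alpha > 0
```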

5.
We consider the issue of sampling from the posterior distribution of exponential random graph (ERG) models and other statistical models with intractable normalizing constants. Existing methods based on exact sampling are either infeasible or require very long computing time. We study a class of approximate Markov chain Monte Carlo (MCMC) sampling schemes that deal with this issue. We also develop a new Metropolis–Hastings kernel to sample sparse large networks from ERG models. We illustrate the proposed methods on several examples.

6.
This study takes up inference in linear models with generalized error and generalized t distributions. For the generalized error distribution, two computational algorithms are proposed. The first is based on indirect Bayesian inference using an approximating finite scale mixture of normal distributions. The second is based on Gibbs sampling. The Gibbs sampler involves only drawing random numbers from standard distributions. This is important because previously the impression has been that an exact analysis of the generalized error regression model using Gibbs sampling is not possible. Next, we describe computational Bayesian inference for linear models with generalized t disturbances based on Gibbs sampling, and exploiting the fact that the model is a mixture of generalized error distributions with inverse generalized gamma distributions for the scale parameter. The linear model with this specification has also been thought not to be amenable to exact Bayesian analysis. All computational methods are applied to actual data involving the exchange rates of the British pound, the French franc, and the German mark relative to the U.S. dollar.
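The Gibbs constructions described here rest on scale-mixture representations. A minimal sketch of the most familiar special case is given below (my own illustration, not the paper's generalized-error or generalized-t setup): a Student-t draw is a normal draw whose variance is scaled by an inverse-gamma mixing variable.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
nu, n = 5.0, 200_000
# lam ~ Inverse-Gamma(nu/2, nu/2): reciprocal of a Gamma(nu/2, rate=nu/2) draw.
lam = 1.0 / rng.gamma(shape=nu / 2, scale=2.0 / nu, size=n)
x = rng.standard_normal(n) * np.sqrt(lam)      # x | lam ~ N(0, lam)
# Marginally x ~ t_nu; check a tail quantile against scipy's Student-t.
print(np.quantile(x, 0.95), stats.t.ppf(0.95, df=nu))
```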

7.
Regularization methods for simultaneous variable selection and coefficient estimation have been shown to improve prediction accuracy in quantile regression. In this article, we propose the Bayesian bridge for variable selection and coefficient estimation in quantile regression. A simple and efficient Gibbs sampling algorithm is developed for posterior inference using a scale mixture of uniforms representation of the Bayesian bridge prior. This is the first work to discuss regularized quantile regression with the bridge penalty. Both simulated and real data examples show that the proposed method often outperforms quantile regression without regularization, lasso quantile regression, and Bayesian lasso quantile regression.
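For intuition, the objective that the Bayesian bridge prior corresponds to is the quantile check loss plus an l_alpha "bridge" penalty. The paper works with a Gibbs sampler for the full posterior; the sketch below only minimises this penalised loss as the point-estimate analogue, with arbitrary illustrative values of the tuning parameters.

```python
import numpy as np
from scipy.optimize import minimize

def check_loss(u, tau):
    """Quantile regression check loss rho_tau(u) = u * (tau - 1{u < 0})."""
    return u * (tau - (u < 0))

def bridge_qr_objective(beta, X, y, tau=0.5, lam=1.0, alpha=0.5):
    resid = y - X @ beta
    return check_loss(resid, tau).sum() + lam * np.sum(np.abs(beta) ** alpha)

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 3))
y = X @ np.array([1.5, 0.0, -2.0]) + rng.standard_normal(200)
# Nonsmooth, nonconvex objective for alpha < 1 -- Nelder-Mead is only a rough sketch.
fit = minimize(bridge_qr_objective, x0=np.zeros(3), args=(X, y, 0.5, 2.0, 0.5),
               method="Nelder-Mead")
print(fit.x)
```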

8.
Abstract. Much recent methodological progress in the analysis of infectious disease data has been due to Markov chain Monte Carlo (MCMC) methodology. In this paper, it is illustrated that rejection sampling can also be applied to a family of inference problems in the context of epidemic models, avoiding the issues of convergence associated with MCMC methods. Specifically, we consider models for epidemic data arising from a population divided into households. The models allow individuals to be potentially infected both from outside and from within the household. We develop methodology for selection between competing models via the computation of Bayes factors. We also demonstrate how an initial sample can be used to adjust the algorithm and improve efficiency. The data are assumed to consist of the final numbers ultimately infected within a sample of households in some community. The methods are applied to data taken from outbreaks of influenza.
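A generic rejection-sampling sketch in the spirit described here: draw from the prior and accept with probability likelihood / maximum likelihood, giving exact posterior samples without any MCMC convergence concerns. The toy target below (a binomial count of infected households with a uniform prior on the infection probability) is my own simplification, not the paper's household epidemic model.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n_households, n_infected = 50, 18          # hypothetical final-size data

def loglik(p):
    return stats.binom.logpmf(n_infected, n_households, p)

log_m = loglik(n_infected / n_households)  # likelihood maximised at the MLE
samples = []
while len(samples) < 5_000:
    p = rng.uniform()                      # draw from the Uniform(0,1) prior
    if np.log(rng.uniform()) < loglik(p) - log_m:
        samples.append(p)                  # accept -> exact posterior draw

print(np.mean(samples))                    # posterior mean, ~ (18+1)/(50+2) here
```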

9.
In this article, we propose Bayesian methodology to obtain parameter estimates of the mixture of distributions belonging to the normal and biparametric Weibull families, modeling the mean and the variance parameters. Simulation studies and applications show the performance of the proposed models.

10.
This paper discusses recovery of information regarding logistic regression parameters in cases when maximum likelihood estimates of some parameters are infinite. An algorithm for detecting such cases and characterizing the divergence of the parameter estimates is presented. A method for fitting the remaining parameters is also presented. All of these methods rely only on sufficient statistics rather than less aggregated quantities, as required for inference according to the method of Kolassa & Tanner (1994). These results are applied to approximate conditional inference via saddlepoint methods. Specifically, the double saddlepoint method of Skovgaard (1987) is adapted to the case when the solution to the saddlepoint equations exists as a point at infinity.
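A toy illustration of the phenomenon addressed here: with completely separated data the logistic-regression slope estimate diverges to infinity. The simple one-covariate overlap check below is only an illustration of the problem, not the sufficient-statistic-based algorithm of the paper.

```python
import numpy as np

def completely_separated(x, y):
    """True if a threshold on x perfectly splits y=0 from y=1 (1-D case)."""
    x0, x1 = x[y == 0], x[y == 1]
    return x0.max() < x1.min() or x1.max() < x0.min()

x = np.array([-2.0, -1.3, -0.4, 0.6, 1.1, 2.5])
y = np.array([0, 0, 0, 1, 1, 1])
print(completely_separated(x, y))   # True -> the MLE of the slope is +infinity
```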

11.
There has been much recent work on Bayesian approaches to survival analysis, incorporating features such as flexible baseline hazards, time-dependent covariate effects, and random effects. Some of the proposed methods are quite complicated to implement, and we argue that as good or better results can be obtained via simpler methods. In particular, the normal approximation to the log-gamma distribution yields easy and efficient computational methods in the face of simple multivariate normal priors for baseline log-hazards and time-dependent covariate effects. While the basic method applies to piecewise-constant hazards and covariate effects, it is easy to apply importance sampling to consider smoother functions.
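The computational trick mentioned here is the normal approximation to the log-gamma distribution: if G ~ Gamma(a, rate b), then log G has mean psi(a) - log(b) and variance psi'(a), and is close to normal for moderate a. Below is a quick numerical check of those two moments; this is a generic fact, not the paper's full survival-model machinery.

```python
import numpy as np
from scipy.special import digamma, polygamma

rng = np.random.default_rng(3)
a, b = 4.0, 2.0                                   # shape a, rate b
logg = np.log(rng.gamma(shape=a, scale=1.0 / b, size=1_000_000))
print(logg.mean(), digamma(a) - np.log(b))        # ~ psi(a) - log b
print(logg.var(), polygamma(1, a))                # ~ psi'(a)
```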

12.
A Bayesian approach to modeling a rich class of nonconjugate problems is presented. An adaptive Monte Carlo integration technique known as the Gibbs sampler is proposed as a mechanism for implementing a conceptually and computationally simple solution in such a framework. The result is a general strategy for obtaining marginal posterior densities under changing specification of the model error densities and related prior densities. We illustrate the approach in a nonlinear regression setting, comparing the merits of three candidate error distributions.
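The classic textbook illustration of the Gibbs mechanism described here is to alternate draws from the two full conditionals of (mu, sigma^2) for normal data under semi-conjugate priors. The sketch below is generic pedagogy, not the paper's nonlinear-regression application; the data and hyperparameters are made up.

```python
import numpy as np

rng = np.random.default_rng(4)
y = rng.normal(loc=2.0, scale=1.5, size=100)
n, ybar = y.size, y.mean()
m0, s0sq, a0, b0 = 0.0, 100.0, 2.0, 2.0       # prior hyperparameters

mu, sig2 = 0.0, 1.0
draws = []
for it in range(5_000):
    # mu | sigma^2, y  ~  Normal (precision-weighted combination)
    prec = 1.0 / s0sq + n / sig2
    mean = (m0 / s0sq + n * ybar / sig2) / prec
    mu = rng.normal(mean, np.sqrt(1.0 / prec))
    # sigma^2 | mu, y  ~  Inverse-Gamma(a0 + n/2, b0 + sum((y - mu)^2)/2)
    sig2 = 1.0 / rng.gamma(a0 + n / 2, 1.0 / (b0 + 0.5 * np.sum((y - mu) ** 2)))
    draws.append((mu, sig2))

print(np.mean([d[0] for d in draws[1_000:]]))  # posterior mean of mu, near 2
```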

13.
Abstract. The traditional Cox proportional hazards regression model uses an exponential relative risk function. We argue that under various plausible scenarios, the relative risk part of the model should be bounded, suggesting also that the traditional model often might overdramatize the hazard rate assessment for individuals with unusual covariates. This motivates our working with proportional hazards models where the relative risk function takes a logistic form. We provide frequentist methods, based on the partial likelihood, and then go on to semiparametric Bayesian constructions. These involve a Beta process for the cumulative baseline hazard function and any prior with a density, for example that dictated by a Jeffreys-type argument, for the regression coefficients. The posterior is derived using machinery for Lévy processes, and a simulation recipe is devised for sampling from the posterior distribution of any quantity. Our methods are illustrated on real data. A Bernshteĭn–von Mises theorem is reached for our class of semiparametric priors, guaranteeing asymptotic normality of the posterior processes.
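A sketch of the modelling point made here: an exponential relative risk grows without bound in the covariate score, whereas a logistic-form relative risk is bounded. The particular normalisation r(0) = 1 below is my own illustrative choice; the paper's exact parameterisation may differ.

```python
import numpy as np

def rr_exponential(z):
    return np.exp(z)                              # traditional Cox relative risk

def rr_logistic(z):
    return 2.0 * np.exp(z) / (1.0 + np.exp(z))    # bounded above by 2, with r(0) = 1

for z in (0.0, 1.0, 3.0, 6.0):
    print(z, rr_exponential(z), rr_logistic(z))
```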

14.
We propose a Bayesian approach for inference in a dynamic disequilibrium model. To circumvent the difficulties raised by the Maddala and Nelson (1974) specification in the dynamic case, we analyze a dynamic extended version of the disequilibrium model of Ginsburgh et al. (1980). We develop a Gibbs sampler based on the simulation of the missing observations. The feasibility of the approach is illustrated by an empirical analysis of the Polish credit market, for which we conduct a specification search using the posterior deviance criterion of Spiegelhalter et al. (2002).

15.
We propose a Bayesian approach for inference in a dynamic disequilibrium model. To circumvent the difficulties raised by the Maddala and Nelson (1974, Maximum likelihood methods for models of markets in disequilibrium, Econometrica 42(6): 1013–1030) specification in the dynamic case, we analyze a dynamic extended version of the disequilibrium model of Ginsburgh et al. (1980, Alternative estimation methods for two regime models, European Economic Review 13: 207–228). We develop a Gibbs sampler based on the simulation of the missing observations. The feasibility of the approach is illustrated by an empirical analysis of the Polish credit market, for which we conduct a specification search using the posterior deviance criterion of Spiegelhalter et al. (2002, Bayesian measures of complexity and fit, Journal of the Royal Statistical Society, Ser. B 64: 583–639).

16.
Quantile regression (QR) models have received increasing attention recently for longitudinal data analysis. When continuous responses are subject to outliers and/or heavy tails, commonly used mean regression models may fail to produce efficient estimators, whereas QR models may perform satisfactorily. In addition, longitudinal outcomes are often subject to non-normality, substantial measurement error and non-ignorable missing values. When carrying out statistical inference in such data settings, it is important to account for these data features simultaneously; otherwise, erroneous or even misleading results may be produced. In the literature, there has been considerable interest in accommodating one or some of these data features, but relatively little work addresses all of them simultaneously. There is a need to fill this gap, as longitudinal data often have these characteristics. Inferential procedures can become dramatically more complicated when these data features arise in both longitudinal responses and covariates. In this article, our objective is to develop QR-based Bayesian semiparametric mixed-effects models that address the simultaneous impact of these multiple data features. The proposed models and method are applied to analyse a longitudinal data set arising from an AIDS clinical study. Simulation studies are conducted to assess the performance of the proposed method under various scenarios.
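Bayesian quantile regression is commonly built on the asymmetric Laplace working likelihood, whose kernel is exactly the check loss; whether this particular device is used in the paper's semiparametric mixed-effects models is an assumption on my part. The sketch below states the generic connection and numerically checks that the density integrates to one.

```python
import numpy as np

def check_loss(u, tau):
    return u * (tau - (u < 0))

def ald_pdf(u, tau=0.25, sigma=1.0):
    """Asymmetric Laplace density f(u) = tau*(1-tau)/sigma * exp(-rho_tau(u/sigma))."""
    return tau * (1 - tau) / sigma * np.exp(-check_loss(u / sigma, tau))

grid = np.linspace(-60, 60, 200_001)
print(np.trapz(ald_pdf(grid), grid))   # ~ 1.0
```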

17.
18.
ABSTRACT

Log-linear models for the distribution on a contingency table are represented as the intersection of only two kinds of log-linear models: one assumes that a certain group of the variables, conditioned on all other variables, has a jointly independent distribution, and the other assumes that a certain group of the variables, conditioned on all other variables, has no highest-order interaction. The subsets entering into these models are uniquely determined by the original log-linear model. This canonical representation suggests considering joint conditional independence and the conditional absence of highest-order association as the elementary building blocks of log-linear models.

19.
In this paper, we propose a general class of Gamma frailty transformation models for multivariate survival data. The transformation class includes the commonly used proportional hazards and proportional odds models. The proposed class also includes a family of cure rate models. Under an improper prior for the parameters, we establish propriety of the posterior distribution. A novel Gibbs sampling algorithm is developed for sampling from the observed data posterior distribution. A simulation study is conducted to examine the properties of the proposed methodology. An application to a data set from a cord blood transplantation study is also reported.

20.
We consider approximate Bayesian inference about scalar parameters of linear regression models with possible censoring. A second-order expansion of their Laplace posterior is seen to have a simple and intuitive form for log-concave error densities with nondecreasing hazard functions. The accuracy of the approximations is assessed for normal and Gumbel errors when the number of regressors increases with sample size. Perturbations of the prior and the likelihood are seen to be easily accommodated within our framework. Links with the work of DiCiccio et al. (1990) and Viveros and Sprott (1987) extend the applicability of our results to conditional frequentist inference based on likelihood-ratio statistics.
