Similar Articles
1.
2.
This paper develops a novel and efficient algorithm for Bayesian inference in inverse gamma stochastic volatility models. It is shown that, by conditioning on auxiliary variables, it is possible to sample all the volatilities jointly, directly from their posterior conditional density, using simple and easy-to-draw-from distributions. Furthermore, the paper develops a generalized inverse gamma process with more flexible tails in the distribution of volatilities, which still allows for simple and efficient calculations. Using several macroeconomic and financial datasets, it is shown that the inverse gamma and generalized inverse gamma processes can greatly outperform the commonly used log-normal volatility processes with Student's t errors or jumps in the mean equation.
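
Below is a minimal, hedged sketch of the modelling idea: returns whose conditional variances are drawn from an inverse-gamma distribution, which produces the heavy-tailed behaviour the paper exploits. It is only an illustration; the hyperparameters are arbitrary and the code does not reproduce the authors' inverse gamma process or their auxiliary-variable sampler.

```python
# Illustrative only: returns with inverse-gamma distributed conditional variances.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
T = 1000
shape, scale = 5.0, 4.0                          # arbitrary inverse-gamma hyperparameters
sigma2 = stats.invgamma.rvs(shape, scale=scale, size=T, random_state=rng)
returns = rng.normal(0.0, np.sqrt(sigma2))       # mean equation: r_t ~ N(0, sigma2_t)

# Excess kurtosis relative to a Gaussian signals the heavier tails.
print("sample kurtosis:", round(stats.kurtosis(returns, fisher=False), 2))
```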

3.
Often the dependence in multivariate survival data is modeled through an individual-level effect called the frailty. Due to its mathematical simplicity, the gamma distribution is often used as the frailty distribution for hazard modeling. However, it is well known that the gamma frailty distribution has many drawbacks; for example, it weakens the effect of covariates. In addition, in the presence of a multilevel model, the overall frailty comes from several levels. To overcome such drawbacks, heavier-tailed distributions are needed for the frailty in order to incorporate extra variability. In this article, we develop a class of log-skew-t distributions for the frailty. This class includes many heavy-tailed distributions, e.g., the log-Cauchy, log-normal, and log-t, as special cases.

Conditional on the frailty, the survival times are assumed to be independent with a proportional hazards structure. The modeling process is then completed by assuming multilevel frailty effects. Instead of imposing a strict parameterization of the baseline hazard function, we consider the partial likelihood approach and thus leave the baseline function unspecified. By eliminating the baseline hazard, both specification and computation are simplified considerably.
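
As a point of reference for the simple benchmark the paper improves upon, here is a minimal sketch of clustered survival times generated under a shared gamma frailty with a proportional hazards structure and an exponential baseline. All parameter values are illustrative, and the paper's log-skew-t multilevel frailty is not implemented here.

```python
# Illustrative gamma-frailty benchmark: shared frailty, proportional hazards,
# exponential baseline hazard.
import numpy as np

rng = np.random.default_rng(1)
n_clusters, cluster_size = 200, 4
beta = 0.5                   # covariate effect
theta = 1.0                  # frailty variance; Gamma(1/theta, scale=theta) has mean 1
lam0 = 0.1                   # baseline hazard

w = rng.gamma(shape=1 / theta, scale=theta, size=n_clusters)   # shared frailties
x = rng.binomial(1, 0.5, size=(n_clusters, cluster_size))      # binary covariate
rate = lam0 * w[:, None] * np.exp(beta * x)                    # conditional hazard
t = rng.exponential(1.0 / rate)                                # survival times

# For the shared gamma frailty model, within-cluster Kendall's tau is theta/(theta + 2).
print("implied Kendall's tau:", theta / (theta + 2))
```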

4.
This study considers inference in linear models with generalized error and generalized t distributions. For the generalized error distribution, two computational algorithms are proposed. The first is based on indirect Bayesian inference using an approximating finite scale mixture of normal distributions. The second is based on Gibbs sampling and involves only drawing random numbers from standard distributions. This is important because the impression had previously been that an exact analysis of the generalized error regression model using Gibbs sampling is not possible. Next, we describe computational Bayesian inference for linear models with generalized t disturbances based on Gibbs sampling, exploiting the fact that the model is a mixture of generalized error distributions with inverse generalized gamma distributions for the scale parameter. The linear model with this specification has also been thought not to be amenable to exact Bayesian analysis. All computational methods are applied to actual data on the exchange rates of the British pound, the French franc, and the German mark relative to the U.S. dollar.
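
For concreteness, the generalized error (exponential power) density that both algorithms target is available in SciPy as the generalized normal distribution gennorm; the sketch below simply evaluates it for a few shape parameters and is not the paper's sampler.

```python
# The generalized error density via scipy.stats.gennorm:
#   f(x; beta) = beta / (2 * Gamma(1/beta)) * exp(-|x|**beta)
# beta = 2 recovers the normal, beta = 1 the Laplace; smaller beta means heavier tails.
import numpy as np
from scipy import stats

x = np.linspace(-4, 4, 9)
for beta in (1.0, 1.5, 2.0):
    pdf = stats.gennorm.pdf(x, beta)
    print(f"beta={beta}: f(0) = {pdf[4]:.4f}")
```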

5.
Lin, Tsung I., Lee, Jack C., Ni, Huey F. Statistics and Computing, 2004, 14(2): 119–130
A finite mixture model using the multivariate t distribution has been shown to be a robust extension of normal mixtures. In this paper, we present a Bayesian approach to inference about the parameters of t-mixture models. The prior distributions are specified to be weakly informative in order to avoid nonintegrable posterior distributions. We present two efficient EM-type algorithms for computing the joint posterior mode with the observed data and an incomplete future vector as the sample. Markov chain Monte Carlo sampling schemes are also developed to obtain the target posterior distribution of the parameters. The advantages of the Bayesian approach over the maximum likelihood method are demonstrated via a set of real data.

6.
As is the case in many studies, the data collected are limited and an exact value is recorded only if it falls within an interval range; hence, the responses can be left-, interval-, or right-censored. Linear (and nonlinear) regression models are routinely used to analyze these types of data and are based on normality assumptions for the error terms. However, such analyses might not provide robust inference when the normality assumptions are questionable. In this article, we develop a Bayesian framework for censored linear regression models by replacing the Gaussian assumption for the random errors with scale mixtures of normal (SMN) distributions. The SMN is an attractive class of symmetric heavy-tailed densities that includes the normal, Student-t, Pearson type VII, slash, and contaminated normal distributions as special cases. Using a Bayesian paradigm, an efficient Markov chain Monte Carlo algorithm is introduced to carry out posterior inference. A new hierarchical prior distribution is suggested for the degrees-of-freedom parameter in the Student-t distribution. The likelihood function is used not only to compute some Bayesian model selection measures but also to develop Bayesian case-deletion influence diagnostics based on the q-divergence measure. The proposed Bayesian methods are implemented in the R package BayesCR. The newly developed procedures are illustrated with applications using real and simulated data.
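
The key data augmentation behind SMN error models is that a heavy-tailed error can be written as a normal with a latent scale. The sketch below illustrates this for the Student-t case only (it is not the BayesCR censored-regression sampler, which is an R package): a gamma-distributed latent precision turns conditionally normal errors into marginally t errors.

```python
# Scale-mixture-of-normals representation of the Student-t error:
# u ~ Gamma(nu/2, rate=nu/2) and e | u ~ N(0, sigma^2 / u)  =>  e/sigma ~ t(nu).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
nu, sigma, n = 4.0, 1.0, 100_000
u = rng.gamma(shape=nu / 2, scale=2 / nu, size=n)    # rate nu/2 corresponds to scale 2/nu
e = rng.normal(0.0, sigma / np.sqrt(u))              # conditionally normal given u

# Kolmogorov-Smirnov comparison with the exact t(nu) distribution.
ks = stats.kstest(e / sigma, stats.t(df=nu).cdf)
print("KS statistic vs. t(nu):", round(ks.statistic, 4))
```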

7.
Markov Beta and Gamma Processes for Modelling Hazard Rates
This paper generalizes the discrete-time independent increment beta process of Hjort (1990) for modelling discrete failure times, and also generalizes the independent gamma process for modelling piecewise constant hazard rates (Walker and Mallick, 1997). The generalizations are from independent increment to Markov increment prior processes, allowing the modelling of smoothness. We derive posterior distributions and undertake a full Bayesian analysis.
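
To make the "Markov increment" idea concrete, here is an illustrative sketch (not necessarily the authors' exact construction) of a gamma Markov chain for piecewise-constant hazard levels, in which each level is centred at the previous one so that neighbouring intervals have similar hazards.

```python
# Illustrative Markov gamma chain for piecewise-constant hazard levels.
import numpy as np

rng = np.random.default_rng(3)
K = 20                    # number of time intervals
c = 50.0                  # dependence parameter: larger c gives a smoother hazard path
lam = np.empty(K)
lam[0] = rng.gamma(shape=2.0, scale=0.05)               # initial hazard level
for k in range(1, K):
    # Gamma with mean lam[k-1] and variance lam[k-1] / c: a simple Markov increment.
    lam[k] = rng.gamma(shape=c * lam[k - 1], scale=1.0 / c)

print(np.round(lam, 3))
```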

8.
In this paper, we propose a new family of distributions, namely the exponentiated exponential–geometric (E2G) distribution. The E2G distribution is a straightforward generalization of the exponential–geometric (EG) distribution proposed by Adamidis and Loukas [A lifetime distribution with decreasing failure rate, Statist. Probab. Lett. 39 (1998), pp. 35–42], and it accommodates increasing, decreasing, and unimodal hazard functions. It arises in a latent competing risks scenario, in which the lifetime associated with a particular risk is not observable and only the minimum lifetime among all risks is recorded. The properties of the proposed distribution are discussed, including a formal proof of its probability density function and explicit algebraic formulas for its survival and hazard functions, moments, rth moment of the ith order statistic, mean residual lifetime, and modal value. Maximum-likelihood inference is implemented straightforwardly. A mis-specification simulation study, performed to assess the extent of mis-specification errors when testing the EG distribution against the E2G, shows that it is usually possible to discriminate between the two distributions even for moderate samples in the presence of censoring. The practical importance of the new distribution is demonstrated in three applications in which we compare the E2G distribution with several lifetime distributions.

9.
We consider a Bayesian method for the analysis of paired survival data using a bivariate exponential model proposed by Moran (1967, Biometrika 54:385–394). Important features of Moran's model are that the marginal distributions are exponential and that the correlation coefficient ranges between 0 and 1; these properties contrast with those of the popular exponential model with gamma frailty. Despite these nice properties, statistical analysis with Moran's model has been hampered by the lack of a closed-form likelihood function. In this paper, we introduce a latent variable to circumvent this difficulty in the Bayesian computation. We also consider a model-checking procedure using the predictive Bayesian p-value.

10.
We investigate simulation methodology for Bayesian inference in Lévy-driven stochastic volatility (SV) models. Typically, Bayesian inference for such models is performed using Markov chain Monte Carlo (MCMC); this is often a challenging task. Sequential Monte Carlo (SMC) samplers are methods that can improve over MCMC; however, there are many user-set parameters to specify. We develop a fully automated SMC algorithm, which substantially improves over the standard MCMC methods in the literature. To illustrate our methodology, we consider a model comprising a Heston model with an independent, additive, variance gamma process in the returns equation. The driving gamma process can capture the stylized behaviour of many financial time series, and a discretized version, fitted in a Bayesian manner, has been found to be very useful for modelling equity data. We demonstrate that it is possible to draw exact inference, in the sense of no time-discretization error, from the Bayesian SV model.
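
The variance gamma ingredient has a standard simulation recipe: drifted Brownian motion evaluated at a gamma time change. The sketch below generates one such path with illustrative parameters; it is not the paper's SMC sampler for the full Heston-plus-VG model.

```python
# Variance gamma (VG) path: Brownian motion with drift, time-changed by a gamma subordinator.
import numpy as np

rng = np.random.default_rng(4)
T, n = 1.0, 365                        # horizon and number of steps
dt = T / n
theta, sigma, nu = -0.1, 0.2, 0.3      # VG drift, volatility, and variance rate of the clock

dG = rng.gamma(shape=dt / nu, scale=nu, size=n)              # gamma time increments, mean dt
dX = theta * dG + sigma * np.sqrt(dG) * rng.normal(size=n)   # VG increments
path = np.concatenate(([0.0], np.cumsum(dX)))

print("terminal value:", round(path[-1], 4))
```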

11.
The paper considers Bayesian analysis of the generalized four-parameter gamma distribution. Estimation of its parameters using classical techniques is associated with substantial technical problems, while Bayesian methods have so far been unavailable for such distributions. Posterior inference is performed using numerical methods organized around Gibbs sampling. Predictive distributions and reliability can be estimated routinely using the proposed methods.

12.
We propose a Bayesian approach for estimating hazard functions under the constraint of a monotone hazard ratio. We construct a model for the monotone hazard ratio using the Cox proportional hazards model with a monotone time-dependent coefficient. To reduce computational complexity, we use a signed gamma process prior for the time-dependent coefficient and the Bayesian bootstrap prior for the baseline hazard function. We develop an efficient MCMC algorithm and illustrate the proposed method on simulated and real data sets.

13.
Likelihood-free methods such as approximate Bayesian computation (ABC) have extended the reach of statistical inference to problems with computationally intractable likelihoods. Such approaches perform well for small-to-moderate dimensional problems but suffer a curse of dimensionality in the number of model parameters. We introduce a likelihood-free approximate Gibbs sampler that naturally circumvents the dimensionality issue by focusing on lower-dimensional conditional distributions. These distributions are estimated by flexible regression models, either before the sampler is run or adaptively during its implementation. As a result, and in comparison to Metropolis-Hastings-based approaches, we are able to fit substantially more challenging statistical models than would otherwise be possible. We demonstrate the sampler's performance via two simulated examples and a real analysis of Airbnb rental prices using an intractable, high-dimensional, multivariate nonlinear state-space model with a 36-dimensional latent state observed at 365 time points, which presents a real challenge to standard ABC techniques.
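
As a toy illustration of the likelihood-free Gibbs idea (heavily simplified: the paper estimates each conditional with flexible regression models, whereas this sketch just runs rejection ABC on one summary statistic per conditional), consider a normal model with unknown mean and standard deviation. All settings below are illustrative.

```python
# Toy likelihood-free (ABC-within-Gibbs) sampler for y ~ N(mu, sigma^2).
import numpy as np

rng = np.random.default_rng(5)
y_obs = rng.normal(2.0, 1.5, size=200)               # "observed" data

def abc_draw(simulate_summary, prior_draw, target, eps, max_tries=5000):
    """One rejection-ABC draw for a single conditional: propose from the prior and
    return the first proposal whose simulated summary lands within eps of the target."""
    theta = prior_draw()
    for _ in range(max_tries):
        theta = prior_draw()
        if abs(simulate_summary(theta) - target) < eps:
            break
    return theta

mu, sigma = 0.0, 1.0                                  # initial values
draws = []
for _ in range(200):                                  # approximate Gibbs sweeps
    mu = abc_draw(lambda m: rng.normal(m, sigma, 200).mean(),
                  lambda: rng.normal(0.0, 10.0), y_obs.mean(), eps=0.1)
    sigma = abc_draw(lambda s: rng.normal(mu, s, 200).std(),
                     lambda: rng.uniform(0.1, 10.0), y_obs.std(), eps=0.1)
    draws.append((mu, sigma))

print("approximate posterior means:", np.mean(draws[50:], axis=0))
```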

14.
This paper presents a Bayesian analysis of partially linear additive models for quantile regression. We develop a semiparametric Bayesian approach to quantile regression models using a spectral representation of the nonparametric regression functions and a Dirichlet process (DP) mixture for the error distribution. We also consider Bayesian variable selection procedures for both the parametric and nonparametric components of the partially linear additive model structure, based on Bayesian shrinkage priors and a stochastic search algorithm. Based on the proposed Bayesian semiparametric additive quantile regression model, referred to as BSAQ, Bayesian inference is considered for estimation and model selection. For posterior computation, we design a simple and efficient Gibbs sampler based on a location-scale mixture of exponential and normal distributions for the asymmetric Laplace distribution, which facilitates the commonly used collapsed Gibbs sampling algorithms for DP mixture models. Additionally, we discuss the asymptotic properties of the semiparametric quantile regression model in terms of posterior consistency. Simulation studies and real data examples illustrate the proposed method and compare it with Bayesian quantile regression methods in the literature.
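
The location-scale mixture mentioned here is the standard one for the asymmetric Laplace: with z ~ Exp(1) and u ~ N(0,1) independent, theta*z + tau*sqrt(z)*u is asymmetric Laplace with quantile level p when theta = (1-2p)/(p(1-p)) and tau^2 = 2/(p(1-p)). The sketch below only verifies this representation numerically; it is not the BSAQ sampler.

```python
# Numerical check of the exponential-normal mixture representation of the asymmetric Laplace.
import numpy as np

rng = np.random.default_rng(6)
p = 0.25                                     # quantile level of interest
theta = (1 - 2 * p) / (p * (1 - p))
tau = np.sqrt(2 / (p * (1 - p)))

z = rng.exponential(1.0, size=200_000)       # latent exponential scales
u = rng.normal(size=200_000)
y = theta * z + tau * np.sqrt(z) * u         # asymmetric-Laplace distributed draws

# The p-th quantile of this distribution is 0, so about a fraction p falls below zero.
print("P(Y < 0) approx:", round(np.mean(y < 0), 3))   # should be close to 0.25
```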

15.
Jaeyong Lee. Statistics, 2013, 47(6): 515–526
Clustered survival data are often modelled with frailty models, which incorporate frailties to capture cluster-specific heterogeneity and the dependence between observations in the same cluster. For the analysis of such frailty models, we propose Bayesian modelling with a beta process prior on the cumulative hazard function and describe the details of the posterior computation. We demonstrate the method on two data sets using three different frailty distributions: the gamma, log-normal, and log-logistic distributions. We also empirically demonstrate the difficulty of checking the assumed frailty distribution with the posterior sample of the frailties.

16.
Mixed-model-based approaches for semiparametric regression have gained much interest in recent years, both in theory and in application. They provide a unified and modular framework for penalized likelihood and closely related empirical Bayes inference. In this article, we develop mixed-model methodology for a broad class of Cox-type hazard regression models in which the usual linear predictor is generalized to a geoadditive predictor incorporating non-parametric terms for the (log-)baseline hazard rate, time-varying coefficients and non-linear effects of continuous covariates, a spatial component, and additional cluster-specific frailties. Non-linear and time-varying effects are modelled through penalized splines, while spatial components are treated as correlated random effects following either a Markov random field or a stationary Gaussian random field prior. Generalizing existing mixed-model methodology, inference is derived using penalized likelihood for the regression coefficients and (approximate) marginal likelihood for the smoothing parameters. In a simulation study we assess the performance of the proposed method, in particular comparing it with its fully Bayesian counterpart based on Markov chain Monte Carlo methodology, and complement the results with some asymptotic considerations. As an application, we analyse leukaemia survival data from northwest England.
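
A minimal sketch of the penalized-spline ingredient follows: a generic smoother with a ridge-type penalty on truncated-line basis coefficients, not the full geoadditive hazard model; knots and smoothing parameter are illustrative.

```python
# Penalized spline fit with a truncated-line basis and a ridge penalty on the spline part.
import numpy as np

rng = np.random.default_rng(7)
n = 200
x = np.sort(rng.uniform(0, 1, n))
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, n)       # noisy non-linear signal

knots = np.linspace(0.05, 0.95, 20)
X = np.column_stack([np.ones(n), x])                    # unpenalized intercept and slope
Z = np.maximum(x[:, None] - knots[None, :], 0.0)        # truncated-line spline basis
C = np.hstack([X, Z])

lam = 1.0                                               # smoothing parameter (illustrative)
P = np.diag([0.0, 0.0] + [lam] * len(knots))            # penalize only the spline coefficients
coef = np.linalg.solve(C.T @ C + P, C.T @ y)
fit = C @ coef

print("residual s.d.:", round(float(np.std(y - fit)), 3))
```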

17.
Bivariate count data arise in several disciplines (epidemiology, marketing, and sports statistics, to name a few), and the bivariate Poisson distribution, as a generalization of the Poisson distribution, plays an important role in modelling such data. In the present paper we develop a Bayesian estimation approach for the parameters of the bivariate Poisson model and provide the posterior distributions in closed form. It is shown that the joint posterior distributions are finite mixtures of conditionally independent gamma distributions, whose full form can easily be deduced by a recursive updating scheme. Thus, the need for computationally demanding MCMC schemes for Bayesian inference in such models is removed, since direct sampling from the posterior becomes available, even in cases where the posterior distribution of functions of the parameters is not available in closed form. In addition, we define a class of prior distributions that possess an interesting conjugacy property extending the typical notion of conjugacy, in the sense that both the prior and the posterior belong to the same family of finite mixture models but with different numbers of components. Extensions to certain other models, including multivariate models and models with other marginal distributions, are discussed.
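
The bivariate Poisson model in question is usually built by trivariate reduction: X = Y1 + Y0 and Y = Y2 + Y0 with independent Poisson components, so that Cov(X, Y) = lambda0. The sketch below just simulates that construction and checks the covariance; the paper's closed-form posterior mixture is not implemented.

```python
# Trivariate-reduction construction of the bivariate Poisson distribution.
import numpy as np

rng = np.random.default_rng(8)
lam1, lam2, lam0 = 2.0, 3.0, 1.5
n = 100_000

y0 = rng.poisson(lam0, n)                  # shared component inducing the dependence
x = rng.poisson(lam1, n) + y0
y = rng.poisson(lam2, n) + y0

print("sample covariance:", round(np.cov(x, y)[0, 1], 3), "(theory:", lam0, ")")
```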

18.
Recently, mixture distributions have become increasingly popular in many scientific fields. Statistical computation and analysis of mixture models, however, are extremely complex due to the large number of parameters involved. Both EM algorithms for likelihood inference and MCMC procedures for Bayesian analysis have various difficulties in dealing with mixtures with an unknown number of components. In this paper, we propose a direct sampling approach to the computation of Bayesian finite mixture models with a varying number of components. This approach requires only knowledge of the density function up to a multiplicative constant. It is easy to implement, numerically efficient, and very practical in real applications. A simulation study shows that it performs quite satisfactorily on relatively high-dimensional distributions. A well-known genetic data set is used to demonstrate the simplicity of this method and its power for the computation of high-dimensional Bayesian mixture models.

19.
Sinh-normal/independent distributions are a class of symmetric heavy-tailed distributions that include the sinh-normal distribution as a special case, which has been used extensively in Birnbaum–Saunders regression models. Here, we explore the use of Markov chain Monte Carlo methods to develop a Bayesian analysis of nonlinear regression models in which sinh-normal/independent distributions are assumed for the random error term, providing a robust alternative to the sinh-normal nonlinear regression model. Bayesian mechanisms for parameter estimation, residual analysis, and influence diagnostics are then developed, which extend the results of Farias and Lemonte [Bayesian inference for the Birnbaum-Saunders nonlinear regression model, Stat. Methods Appl. 20 (2011), pp. 423–438], who used sinh-normal/independent distributions with a known scale parameter. Some special cases, based on the sinh-Student-t (sinh-St), sinh-slash (sinh-SL), and sinh-contaminated normal (sinh-CN) distributions, are discussed in detail. Two real datasets are finally analyzed to illustrate the developed procedures.
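
For reference, the sinh-normal building block works through the transformation Z = (2/alpha) * sinh((Y - mu)/sigma) being standard normal, so Y can be simulated as Y = mu + sigma * arcsinh(alpha * Z / 2); the sinh-normal/independent extensions replace the normal Z by a heavier-tailed normal/independent variable. The sketch below evaluates the resulting density and simulates from it with illustrative parameter values; it is not the paper's nonlinear regression machinery.

```python
# Sinh-normal density and simulation via the defining transformation.
import numpy as np
from scipy import stats, integrate

alpha, mu, sigma = 0.8, 1.0, 2.0          # illustrative shape, location, scale

def sinh_normal_pdf(y):
    u = (y - mu) / sigma
    # Change-of-variables density: phi((2/alpha) sinh(u)) * (2/(alpha*sigma)) cosh(u).
    return (2.0 / (alpha * sigma)) * np.cosh(u) * stats.norm.pdf((2.0 / alpha) * np.sinh(u))

total, _ = integrate.quad(sinh_normal_pdf, -np.inf, np.inf)
print("integral of the density:", round(total, 6))      # should be 1

rng = np.random.default_rng(9)
y = mu + sigma * np.arcsinh(alpha * rng.normal(size=100_000) / 2)
print("sample mean (close to mu by symmetry):", round(float(y.mean()), 3))
```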

20.
This paper presents a new Bayesian clustering approach, based on an infinite mixture model, specifically designed for time-course microarray data. The problem is to group together genes that have "similar" expression profiles, given a set of noisy measurements of their expression levels over a specific time interval. In order to capture the temporal variation of each curve, a non-parametric regression approach is used: each expression profile is expanded over a set of basis functions, and the coefficient vectors of the curves are then modeled through a Bayesian infinite mixture of Gaussian distributions. The task of finding clusters of genes with similar expression profiles is thus reduced to the problem of grouping together genes whose coefficients are sampled from the same mixture component. A Dirichlet process prior is naturally employed in such models, since it automatically handles the uncertainty about the number of clusters. Posterior inference is carried out by a split-and-merge MCMC sampling scheme that integrates out the parameters of the component distributions and updates only the latent vector of cluster memberships. The final configuration is obtained via the maximum a posteriori estimator. The performance of the method is studied using synthetic and real microarray data and is compared with that of competing techniques.
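
A much-simplified stand-in for this pipeline (illustrative only: polynomial basis coefficients clustered with scikit-learn's truncated Dirichlet-process Gaussian mixture, rather than the paper's split-and-merge MCMC) might look as follows.

```python
# Expand noisy time courses over a polynomial basis, then cluster the coefficients
# with a truncated Dirichlet-process Gaussian mixture.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(10)
t = np.linspace(0, 1, 12)                             # 12 time points per profile
n_per_group = 50

# Two synthetic groups of expression profiles with different temporal shapes.
group1 = np.sin(2 * np.pi * t) + rng.normal(0, 0.2, (n_per_group, t.size))
group2 = 2 * t - 1 + rng.normal(0, 0.2, (n_per_group, t.size))
curves = np.vstack([group1, group2])

coefs = np.array([np.polyfit(t, c, deg=3) for c in curves])   # basis-expansion coefficients

dpgmm = BayesianGaussianMixture(
    n_components=10,                                  # truncation level for the DP mixture
    weight_concentration_prior_type="dirichlet_process",
    random_state=0,
).fit(coefs)

print("cluster sizes:", np.bincount(dpgmm.predict(coefs)))
```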
