Similar Documents
20 similar documents retrieved.
1.
This paper presents a Bayesian analysis of partially linear additive models for quantile regression. We develop a semiparametric Bayesian approach to quantile regression models using a spectral representation of the nonparametric regression functions and the Dirichlet process (DP) mixture for the error distribution. We also consider Bayesian variable selection procedures for both parametric and nonparametric components in a partially linear additive model structure based on Bayesian shrinkage priors via a stochastic search algorithm. Based on the proposed Bayesian semiparametric additive quantile regression model, referred to as BSAQ, Bayesian inference is considered for estimation and model selection. For the posterior computation, we design a simple and efficient Gibbs sampler based on a location-scale mixture of exponential and normal distributions for the asymmetric Laplace distribution, which facilitates the commonly used collapsed Gibbs sampling algorithms for DP mixture models. Additionally, we discuss the asymptotic properties of the semiparametric quantile regression model in terms of consistency of the posterior distribution. Simulation studies and real data examples illustrate the proposed method and compare it with Bayesian quantile regression methods in the literature.
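For reference, here is a minimal sketch of the location-scale mixture representation of the asymmetric Laplace distribution that underlies such Gibbs samplers (one standard parameterization; the paper's exact notation and priors may differ). For the p-th quantile,
\[
y_i = \mu_p(x_i) + \theta\, v_i + \tau \sqrt{\sigma v_i}\; u_i, \qquad v_i \sim \mathrm{Exp}(\sigma), \quad u_i \sim N(0,1),
\]
\[
\theta = \frac{1-2p}{p(1-p)}, \qquad \tau^2 = \frac{2}{p(1-p)},
\]
where Exp(σ) denotes the exponential distribution with mean σ; conditional on the latent \(v_i\) the model is Gaussian, which is what makes standard Gibbs updates available.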

2.
Shi, Yushu, Laud, Purushottam, and Neuner, Joan. Lifetime Data Analysis, 2021, 27(1): 156–176.

In this paper, we first propose a dependent Dirichlet process (DDP) model using a mixture of Weibull models with each mixture component resembling a Cox model for survival data. We then build a Dirichlet process mixture model for competing risks data without regression covariates. Next we extend this model to a DDP model for competing risks regression data by using a multiplicative covariate effect on subdistribution hazards in the mixture components. Though built on proportional hazards (or subdistribution hazards) models, the proposed nonparametric Bayesian regression models do not require the assumption of constant hazard (or subdistribution hazard) ratio. An external time-dependent covariate is also considered in the survival model. After describing the model, we discuss how both cause-specific and subdistribution hazard ratios can be estimated from the same nonparametric Bayesian model for competing risks regression. For use with the regression models proposed, we introduce an omnibus prior that is suitable when little external information is available about covariate effects. Finally we compare the models’ performance with existing methods through simulations. We also illustrate the proposed competing risks regression model with data from a breast cancer study. An R package “DPWeibull” implementing all of the proposed methods is available at CRAN.
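For orientation, a hedged sketch of the subdistribution hazard on which the multiplicative covariate effect acts (standard definitions; the Weibull-mixture construction itself is not reproduced here). Writing \(F_k\) for the cumulative incidence function of cause \(k\),
\[
F_k(t \mid x) = P(T \le t,\ \text{cause}=k \mid x), \qquad
\lambda_k(t \mid x) = -\frac{d}{dt}\log\{1 - F_k(t \mid x)\},
\]
and a multiplicative covariate effect within a mixture component takes the form \(\lambda_k(t \mid x) = \lambda_{k0}(t)\exp(x^\top \beta_k)\); averaged over mixture components, the overall subdistribution hazard ratio need not be constant, consistent with the abstract above.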


3.
ABSTRACT

Nowadays, generalized linear models have many applications. Among those with more applications in the real world are models with random effects, in which some of the unknown parameters are treated as random variables. In this article, this situation is considered for logistic regression models with a random intercept having an exponential distribution. The aim is to obtain the Bayesian D-optimal design by maximizing the Bayesian D-optimality criterion. For the model considered here, this criterion is a function of the quasi-information matrix, which depends on the unknown parameters of the model. In the Bayesian D-optimality criterion, the expectation is taken with respect to the prior distributions assumed for the unknown parameters, so the criterion is a function only of the experimental settings (support points) and their weights. Uniform and normal prior distributions are considered for the fixed parameters. The Bayesian D-optimal design is finally computed numerically in R (version 3.1.1).
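As a sketch of the objective being maximized (standard pseudo-Bayesian D-optimality; the exact quasi-information matrix for the random-intercept logistic model is given in the paper):
\[
\Psi(\xi) = \int \log \det M(\xi, \beta)\, \pi(\beta)\, d\beta, \qquad
\xi = \{(x_1, w_1), \ldots, (x_m, w_m)\}, \quad w_i \ge 0, \ \sum_{i} w_i = 1,
\]
where \(M(\xi,\beta)\) is the (quasi-)information matrix of the design \(\xi\) and \(\pi\) is the prior on the unknown parameters; the Bayesian D-optimal design maximizes \(\Psi\) over the support points and weights.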

4.
We pursue a Bayesian nonparametric approach to the hierarchical mixture modelling of lifetime data in two situations: density estimation, when the distribution is a mixture of parametric densities with a nonparametric mixing measure, and accelerated failure time (AFT) regression modelling, when the same type of mixture is used for the distribution of the error term. The Dirichlet process is a popular choice for the mixing measure, yielding a Dirichlet process mixture model for the error; as an alternative, we also allow the mixing measure to be a normalized inverse-Gaussian prior, built from normalized inverse-Gaussian finite-dimensional distributions, as recently proposed in the literature. Markov chain Monte Carlo techniques are used to estimate the predictive distribution of the survival time, along with the posterior distribution of the regression parameters. A comparison between the two models is carried out on the grounds of their predictive power and their ability to identify the number of components in a given mixture density.
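A minimal sketch of the AFT formulation with a nonparametric mixture on the error term (generic notation; the paper's choice of kernel and hyperpriors may differ):
\[
\log T_i = x_i^\top \beta + \varepsilon_i, \qquad
\varepsilon_i \mid G \ \overset{\mathrm{iid}}{\sim}\ \int k(\cdot \mid \theta)\, dG(\theta), \qquad
G \sim \mathrm{DP}(M, G_0) \ \ \text{or a normalized inverse-Gaussian process},
\]
where \(k(\cdot\mid\theta)\) is a parametric kernel density and \(G\) is the random mixing measure.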

5.
Bayesian nonparametric methods have been applied to survival analysis problems since the emergence of the area of Bayesian nonparametrics. However, the use of the flexible class of Dirichlet process mixture models has been rather limited in this context. This is arguably due, to a large extent, to the standard way of fitting such models, which precludes full posterior inference for many functionals of interest in survival analysis applications. To overcome this difficulty, we provide a computational approach to obtain the posterior distribution of general functionals of a Dirichlet process mixture. We model the survival distribution employing a flexible Dirichlet process mixture, with a Weibull kernel, that yields rich inference for several important functionals. In the process, a method for hazard function estimation emerges. Methods for simulation-based model fitting, in the presence of censoring, and for prior specification are provided. We illustrate the modeling approach with simulated and real data.
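For reference, a hedged sketch of the Weibull Dirichlet process mixture and the hazard functional it induces (generic parameterization):
\[
f(t \mid G) = \int f_{\mathrm{W}}(t \mid \alpha, \lambda)\, dG(\alpha, \lambda), \qquad
S(t \mid G) = \int S_{\mathrm{W}}(t \mid \alpha, \lambda)\, dG(\alpha, \lambda), \qquad
G \sim \mathrm{DP}(M, G_0),
\]
so that the hazard functional is \(h(t \mid G) = f(t \mid G)/S(t \mid G)\), and posterior inference for it follows from posterior draws of \(f(t\mid G)\) and \(S(t\mid G)\).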

6.
For the first time, a five-parameter distribution, called the Kumaraswamy Burr XII (KwBXII) distribution, is defined and studied. The new distribution contains as special models some well-known distributions discussed in the lifetime literature, such as the logistic, Weibull and Burr XII distributions, among several others. We obtain the complete moments, incomplete moments, generating and quantile functions, mean deviations, Bonferroni and Lorenz curves and reliability of the KwBXII distribution. We provide two representations for the moments of the order statistics. The method of maximum likelihood and a Bayesian procedure are adopted for estimating the model parameters. For different parameter settings and sample sizes, various simulation studies are performed to assess the performance of these estimation methods for the KwBXII distribution. Three applications to real data sets demonstrate the usefulness of the proposed distribution and suggest that it may attract wider applications in lifetime data analysis.
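As a hedged sketch of the construction via the Kumaraswamy-G generator (the paper's exact parameterization may differ):
\[
G(x) = 1 - \left[1 + (x/s)^{c}\right]^{-k} \ \ (\text{Burr XII}), \qquad
F_{\mathrm{KwBXII}}(x) = 1 - \left\{1 - G(x)^{a}\right\}^{b}, \quad x > 0,
\]
with shape parameters \(a, b, c, k > 0\) and scale \(s > 0\), giving the five-parameter family; special cases arise for particular choices of these parameters.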

7.
As treatments of cancer progress, a certain number of cancers are curable if diagnosed early. In population-based cancer survival studies, cure is said to occur when the mortality rate of the cancer patients returns to the same level as that expected for the general cancer-free population. Estimates of the cure fraction are of interest to both cancer patients and health policy makers. Mixture cure models have been widely used because they are easy to interpret, separating the patients into two distinct groups. Usually, parametric models are assumed for the latent survival distribution of the uncured patients. The estimate of the cure fraction from a mixture cure model may be sensitive to misspecification of this latent distribution. We propose a Bayesian approach to the mixture cure model for population-based cancer survival data, which can be extended to county-level cancer survival data. Instead of modeling the latent distribution by a fixed parametric distribution, we use a finite mixture of the union of the lognormal, loglogistic, and Weibull distributions. The parameters are estimated using the Markov chain Monte Carlo method. A simulation study shows that the Bayesian method using a finite mixture latent distribution provides robust inference for the parameter estimates. The proposed Bayesian method is applied to relative survival data for colon cancer patients from the Surveillance, Epidemiology, and End Results (SEER) Program to estimate the cure fractions. The Canadian Journal of Statistics 40: 40–54; 2012 © 2012 Statistical Society of Canada
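A minimal sketch of the mixture cure structure referred to above (generic notation; here π denotes the cure fraction and S_u the latent survival of the uncured, taken in the paper to be a finite mixture of lognormal, loglogistic, and Weibull components):
\[
S(t) = \pi + (1 - \pi)\, S_u(t), \qquad
S_u(t) = \sum_{j} w_j\, S_j(t \mid \theta_j), \quad w_j \ge 0, \ \sum_j w_j = 1,
\]
so that \(S(t) \to \pi\) as \(t \to \infty\); in the relative-survival setting, \(S\) is interpreted relative to the expected survival of the cancer-free population.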

8.
Abstract

We construct a new bivariate mixture of negative binomial distributions which represents over-dispersed data more efficiently. This is an extension of a univariate mixture of beta and negative binomial distributions. Characteristics of this joint distribution are studied, including the conditional distributions. Some properties of the correlation coefficient are explored. We demonstrate the applicability of our proposed model by fitting it to three real data sets with correlated count data. A comparison is made with some previously used models to show the effectiveness of the new model.

9.
Abstract

Frailty models are used in survival analysis to account for unobserved heterogeneity in individual risks of disease and death. To analyze bivariate data on related survival times (e.g., matched-pairs experiments, twin data, or family data), shared frailty models have been suggested. Shared frailty models are frequently used to model heterogeneity in survival analysis. The most common shared frailty model is one in which the hazard function is the product of a random factor (frailty) and a baseline hazard function common to all individuals. Certain assumptions are made about the baseline distribution and the distribution of the frailty. In this paper, we introduce shared gamma frailty models with reversed hazard rate. We introduce a Bayesian estimation procedure using the Markov chain Monte Carlo (MCMC) technique to estimate the parameters involved in the model. We present a simulation study to compare the true values of the parameters with the estimated values. We also apply the proposed model to the Australian twin data set.
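For orientation, a hedged sketch of the reversed hazard rate and a multiplicative shared frailty acting on it (generic bivariate notation):
\[
r(t) = \frac{f(t)}{F(t)} = \frac{d}{dt}\log F(t), \qquad
r_j(t \mid Z) = Z\, r_{0j}(t) \ \Longrightarrow\ F_j(t \mid Z) = \bigl[F_{0j}(t)\bigr]^{Z}, \quad j = 1, 2,
\]
with the frailty \(Z\) following a gamma distribution and shared by the two related lifetimes.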

10.
In modelling financial return time series and time-varying volatility, the Gaussian and the Student-t distributions are widely used in stochastic volatility (SV) models. However, other distributions such as the Laplace distribution and the generalized error distribution (GED) are also common in SV modelling. Therefore, this paper proposes the use of the generalized t (GT) distribution, whose special cases are the Gaussian distribution, the Student-t distribution, the Laplace distribution and the GED. Since the GT distribution is a member of the scale mixture of uniform (SMU) family of distributions, we handle the GT distribution via its SMU representation. We show that this SMU form can substantially simplify the Gibbs sampler for Bayesian simulation-based computation and can provide a means of identifying outliers. In an empirical study, we adopt a GT–SV model to fit the daily return of the exchange rate of the Australian dollar against three other currencies, using the exchange rate against the US dollar as a covariate. Model implementation relies on Bayesian Markov chain Monte Carlo algorithms using the WinBUGS package.
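As a reference, a minimal sketch of a canonical SV specification with GT observation errors (standard notation; the paper's covariate term for the US-dollar rate and the details of the SMU representation are omitted):
\[
y_t = \exp(h_t/2)\, \varepsilon_t, \qquad \varepsilon_t \sim \mathrm{GT}, \qquad
h_t = \mu + \phi\,(h_{t-1} - \mu) + \eta_t, \quad \eta_t \sim N(0, \sigma_\eta^2),
\]
where \(h_t\) is the latent log-volatility; expressing the GT density as a scale mixture of uniforms introduces latent mixing variables whose conditional distributions are standard, which is what simplifies the Gibbs sampler.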

11.
In this paper we consider a nonparametric regression model in which the conditional variance function is assumed to vary smoothly with the predictor. We offer an easily implemented and fully Bayesian approach that involves Markov chain Monte Carlo sampling of standard distributions. This method is based on a technique utilized by Kim, Shephard, and Chib (Rev. Econ. Stud. 65:361–393, 1998) for the stochastic volatility model. Although the (parametric or nonparametric) heteroscedastic regression and stochastic volatility models are quite different, they share the same structure as far as the estimation of the conditional variance function is concerned, a point that has been previously overlooked. Our method can be employed in the frequentist context and in Bayesian models more general than those considered in this paper. Illustrations of the method are provided.
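A hedged sketch of the connection exploited here, assuming Gaussian errors as in the standard construction (generic notation):
\[
y_i = m(x_i) + \sigma(x_i)\,\varepsilon_i, \quad \varepsilon_i \sim N(0,1)
\ \Longrightarrow\
\log\bigl\{(y_i - m(x_i))^2\bigr\} = \log \sigma^2(x_i) + \log \varepsilon_i^2,
\]
where \(\log \varepsilon_i^2\) has a \(\log\chi^2_1\) distribution that Kim, Shephard, and Chib approximate by a finite mixture of normals, so the log-variance function can be updated with standard Gaussian steps.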

12.
The design parameters of the economic and economic statistical designs of control charts depend on the distribution of the process failure mechanism, or shock model. So far, only a small number of failure distributions, such as the exponential, gamma, and Weibull with constant or increasing hazard rates, have been used as shock models in the economic and economic statistical designs of Hotelling T² control charts. For both theoretical and practical reasons, the lifetime of the process under study may not follow a distribution with a constant or increasing hazard rate. A suitable alternative in this situation is the Burr distribution, in which the hazard rate can be constant, increasing, decreasing, unimodal, or even U-shaped. In this article, economic and economic statistical designs of Hotelling T² control charts under Burr XII shock models are proposed, constructed, and compared under uniform and non-uniform sampling schemes. The obtained design models are illustrated with a numerical example, and a sensitivity analysis is conducted to evaluate the effect of changes in the shock-model distribution parameters on the optimum values of the proposed designs. The results show that, first, the proposed designs under the non-uniform sampling scheme perform better and, second, the optimum values of the designs are not significantly sensitive to changes in the Burr XII distribution parameters. We also show that the obtained design models hold for the beta Burr XII shock model.
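For reference, a hedged sketch of the Burr XII hazard underlying the shock model (unit-scale, two-parameter form; the paper may include a scale parameter):
\[
S(t) = \bigl(1 + t^{c}\bigr)^{-k}, \qquad
h(t) = \frac{f(t)}{S(t)} = \frac{c\,k\,t^{c-1}}{1 + t^{c}}, \quad t > 0,\ c, k > 0,
\]
which in this two-parameter form is decreasing for \(c \le 1\) and unimodal for \(c > 1\); the additional hazard shapes mentioned above arise from extensions such as the beta Burr XII.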

13.
This study takes up inference in linear models with generalized error and generalized t distributions. For the generalized error distribution, two computational algorithms are proposed. The first is based on indirect Bayesian inference using an approximating finite scale mixture of normal distributions. The second is based on Gibbs sampling. The Gibbs sampler involves only drawing random numbers from standard distributions. This is important because previously the impression has been that an exact analysis of the generalized error regression model using Gibbs sampling is not possible. Next, we describe computational Bayesian inference for linear models with generalized t disturbances based on Gibbs sampling, exploiting the fact that the model is a mixture of generalized error distributions with inverse generalized gamma distributions for the scale parameter. The linear model with this specification has also been thought not to be amenable to exact Bayesian analysis. All computational methods are applied to actual data involving the exchange rates of the British pound, the French franc, and the German mark relative to the U.S. dollar.

14.
Abstract

This paper deals with Bayesian estimation and prediction for the inverse Weibull distribution with shape parameter α and scale parameter λ under general progressive censoring. We prove that the posterior conditional density functions of α and λ are both log-concave, based on the assumption that λ has a gamma prior distribution and α follows a prior distribution with log-concave density. Then, we present a Gibbs sampling strategy to estimate, under squared-error loss, any function of the unknown parameter vector (α, λ) and to find credible intervals, as well as to obtain prediction intervals for future order statistics. Monte Carlo simulations are given to compare the performance of Bayesian estimators derived via Gibbs sampling with the corresponding maximum likelihood estimators, and a real data analysis is discussed in order to illustrate the proposed procedure. Finally, we extend the developed methodology to other two-parameter distributions, including the Weibull, Burr type XII, and flexible Weibull distributions, and also to general progressive hybrid censoring.
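As a reference, a hedged sketch of the inverse Weibull model and the prior structure described above (one common parameterization):
\[
F(x \mid \alpha, \lambda) = \exp\!\bigl(-\lambda x^{-\alpha}\bigr), \qquad
f(x \mid \alpha, \lambda) = \alpha\,\lambda\, x^{-(\alpha+1)} \exp\!\bigl(-\lambda x^{-\alpha}\bigr), \quad x > 0,
\]
with \(\lambda\) given a gamma prior and \(\alpha\) a prior with log-concave density, the combination under which the paper proves log-concavity of the posterior conditionals used in the Gibbs sampler.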

15.
Quantile regression, including median regression, as a more complete statistical model than mean regression, is now well known for its widespread applications. Bayesian inference on quantile regression, or Bayesian quantile regression, has attracted much interest recently. Most of the existing research in Bayesian quantile regression focuses on parametric quantile regression, though there are discussions of different ways of modeling the model error, either by a parametric distribution, namely the asymmetric Laplace distribution, or by a nonparametric alternative, namely the scale-mixture asymmetric Laplace distribution. This paper discusses Bayesian inference for nonparametric quantile regression. This general approach fits quantile regression curves using piecewise polynomial functions with an unknown number of knots at unknown locations, all treated as parameters to be inferred through the reversible jump Markov chain Monte Carlo (RJMCMC) method of Green (Biometrika 82:711–732, 1995). Instead of drawing samples from the posterior, we use regression quantiles to create Markov chains for the estimation of the quantile curves. We also use an approximate Bayes factor in the inference. This method extends work on automatic Bayesian mean curve fitting to quantile regression. Numerical results show that this Bayesian quantile smoothing technique is competitive with the quantile regression/smoothing splines of He and Ng (Comput. Stat. 14:315–337, 1999) and the P-splines (penalized splines) of Eilers and de Menezes (Bioinformatics 21(7):1146–1153, 2005).
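For orientation, a brief sketch of the objective underlying the quantile curve fitting (standard check-loss notation; the RJMCMC knot-selection details are in the paper):
\[
\rho_\tau(u) = u\,\bigl\{\tau - I(u < 0)\bigr\}, \qquad
\hat{q}_\tau = \arg\min_{g \in \mathcal{G}} \sum_{i=1}^{n} \rho_\tau\bigl\{y_i - g(x_i)\bigr\},
\]
where \(\mathcal{G}\) is a family of piecewise polynomial functions whose number and locations of knots are treated as unknown parameters to be inferred.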

16.
This article addresses the density estimation problem using a nonparametric Bayesian approach. Hierarchical mixture models are considered in which the uncertainty about the mixing measure is modeled using the Dirichlet process. The main goal is to build more flexible models for density estimation. Extensions of the Dirichlet process normal mixture model previously introduced in the literature are twofold. First, Dirichlet mixtures of skew-normal distributions are considered; that is, in the first stage of the hierarchical model, the normal distribution is replaced by the skew-normal one. Second, we assume a skew-normal distribution as the center measure in the Dirichlet mixture of normal distributions. Some important results related to Bayesian inference in the location-scale skew-normal family are introduced. In particular, we obtain stochastic representations for the full conditional distributions of the location and skewness parameters. The algorithm introduced by MacEachern and Müller (1998, J. Comput. Graph. Statist. 7(2):223–238) is used to sample from the posterior distributions. The models are compared using simulated data sets. Finally, the well-known Old Faithful Geyser data set is analyzed using the proposed models and the Dirichlet mixture of normal distributions. The model based on the Dirichlet mixture of skew-normal distributions captured the bimodality and skewness shown in the empirical distribution of the data.
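As a reference for the kernel being mixed, a hedged sketch of the location-scale skew-normal density in an Azzalini-type parameterization (the paper's notation may differ):
\[
f(y \mid \xi, \omega, \alpha) = \frac{2}{\omega}\,\phi\!\left(\frac{y-\xi}{\omega}\right)\Phi\!\left(\alpha\,\frac{y-\xi}{\omega}\right),
\]
with location \(\xi\), scale \(\omega > 0\) and skewness \(\alpha\); the Dirichlet process mixture then models the density as \(f(y \mid G) = \int f(y \mid \xi, \omega, \alpha)\, dG\) with \(G \sim \mathrm{DP}\).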

17.
ABSTRACT

In this article, the main characteristics of a generalized Gumbel (GG) distribution are derived. Parameter estimation with the method of moments, maximum likelihood, and Bayesian approaches is demonstrated. Due to the ranges of its skewness and kurtosis, the distribution is satisfactory for fitting a wide variety of datasets. Also, it can be used to model block maxima or minima data due to its close connection with the standard Gumbel distribution. It is demonstrated that, under specific conditions, the GG distribution fits block maxima data more accurately than both the standard Gumbel and generalized extreme value distributions.

18.
Lin, Tsung I., Lee, Jack C., and Ni, Huey F. Statistics and Computing, 2004, 14(2): 119–130.
A finite mixture model using the multivariate t distribution has been shown to be a robust extension of normal mixtures. In this paper, we present a Bayesian approach for inference about the parameters of t-mixture models. The prior distributions are specified to be weakly informative so as to avoid nonintegrable posterior distributions. We present two efficient EM-type algorithms for computing the joint posterior mode with the observed data and an incomplete future vector as the sample. Markov chain Monte Carlo sampling schemes are also developed to obtain the target posterior distribution of the parameters. The advantages of the Bayesian approach over the maximum likelihood method are demonstrated on a set of real data.
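A minimal sketch of the scale-mixture representation that underlies both the EM-type algorithms and the MCMC schemes for t-mixtures (standard result; mixture-component subscripts omitted):
\[
Y \mid u \sim N_p\!\bigl(\mu, \Sigma / u\bigr), \qquad u \sim \mathrm{Gamma}\!\left(\tfrac{\nu}{2}, \tfrac{\nu}{2}\right)
\ \Longrightarrow\ Y \sim t_p(\mu, \Sigma, \nu),
\]
so augmenting each observation with its latent weight \(u\) reduces the complete-data computations to weighted Gaussian ones.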

19.
ABSTRACT

Shared frailty models are often used to model heterogeneity in survival analysis. The most common shared frailty model is one in which the hazard function is the product of a random factor (frailty) and a baseline hazard function common to all individuals. Certain assumptions are made about the baseline distribution and the distribution of the frailty. In this paper, we consider the inverse Gaussian distribution as the frailty distribution and three different baseline distributions, namely the generalized Rayleigh, the weighted exponential, and the extended Weibull distributions. With these three baseline distributions, we propose three different inverse Gaussian shared frailty models. We also compare these models with the models in which the above-mentioned distributions are considered without frailty. We develop a Bayesian estimation procedure using the Markov chain Monte Carlo (MCMC) technique to estimate the parameters involved in these models. We present a simulation study to compare the true values of the parameters with the estimated values. A search of the literature suggests that no work has yet been done for these three baseline distributions with a shared inverse Gaussian frailty. We also apply the three models to the real-life bivariate survival data set of McGilchrist and Aisbett (1991, Biometrics 47:461–466) on kidney infection, and a better model is suggested for the data using Bayesian model selection criteria.
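For reference, a hedged sketch of the shared frailty hazard used in such models (generic notation; covariates enter only when regression terms are included, as in the kidney infection application):
\[
h_{ij}(t \mid Z_i) = Z_i\, h_{0}(t)\, \exp\!\bigl(x_{ij}^\top \beta\bigr), \qquad j\text{-th lifetime in the } i\text{-th pair},
\]
with \(Z_i\) following an inverse Gaussian distribution with unit mean and \(h_0\) taken to be the generalized Rayleigh, weighted exponential, or extended Weibull baseline hazard.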

20.
We introduce the Hausdorff α-entropy to study the strong Hellinger consistency of posterior distributions. We obtain general Bayesian consistency theorems which extend the well-known results of Barron et al. [1999. The consistency of posterior distributions in nonparametric problems. Ann. Statist. 27, 536–561], Ghosal et al. [1999. Posterior consistency of Dirichlet mixtures in density estimation. Ann. Statist. 27, 143–158], and Walker [2004. New approaches to Bayesian consistency. Ann. Statist. 32, 2028–2043]. As an application, we strengthen previous results on Bayesian consistency of (normal) mixture models.
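For orientation, the notion of strong Hellinger consistency being studied can be stated as follows (standard definition; the Hausdorff α-entropy condition itself is developed in the paper):
\[
\Pi\bigl(\{f : d_H(f, f_0) > \epsilon\} \mid X_1, \ldots, X_n\bigr) \longrightarrow 0 \quad \text{a.s.}\ [P_{f_0}^{\infty}] \ \ \text{for every } \epsilon > 0,
\]
where \(d_H\) is the Hellinger distance, \(f_0\) the true density, and \(\Pi(\cdot \mid X_1,\ldots,X_n)\) the posterior.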
