Similar articles
 20 similar articles found (search time: 38 ms)
1.
The aim of this paper is to explore variable selection approaches in the partially linear proportional hazards model for multivariate failure time data. A new penalised pseudo-partial likelihood method is proposed to select important covariates. Under certain regularity conditions, we establish the rate of convergence and asymptotic normality of the resulting estimates. We further show that the proposed procedure can correctly select the true submodel, as if it were known in advance. Both simulated and real data examples are presented to illustrate the proposed methodology.
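The penalized partial-likelihood idea can be sketched in code. Below is a toy single-event Cox lasso fitted by proximal gradient, not the paper's penalised pseudo-partial likelihood for multivariate failure times; the function names, the L1 penalty, and the simulated data are our own illustrative choices.

```python
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def grad_avg_neg_partial_lik(beta, X, time, event):
    """Gradient of the averaged negative Cox log partial likelihood,
    assuming no tied event times."""
    order = np.argsort(time)
    Xs, ds = X[order], event[order]
    w = np.exp(Xs @ beta)
    S0 = w[::-1].cumsum()[::-1]                       # sum of w over each risk set
    S1 = (w[:, None] * Xs)[::-1].cumsum(axis=0)[::-1]
    return -(ds[:, None] * (Xs - S1 / S0[:, None])).sum(axis=0) / len(time)

def lasso_cox(X, time, event, lam=0.05, lr=0.2, iters=2000):
    """Proximal-gradient (ISTA) minimizer of the L1-penalized partial likelihood."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        g = grad_avg_neg_partial_lik(beta, X, time, event)
        beta = soft_threshold(beta - lr * g, lr * lam)
    return beta

# toy data: only the first of three covariates affects the hazard
rng = np.random.default_rng(1)
n = 300
X = rng.standard_normal((n, 3))
time = rng.exponential(1.0 / np.exp(X[:, 0]))  # hazard rate = exp(x1)
event = np.ones(n)                              # no censoring, for simplicity
beta_hat = lasso_cox(X, time, event)
```

With this setup the irrelevant coefficients are shrunk to (near) zero while the active one survives, which is the selection behavior the paper formalizes.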

2.
Lifetime Data Analysis - Cox's proportional hazards regression model is the standard method for modelling censored lifetime data with covariates. In its standard form, this method relies on...

3.
We propose a penalized empirical likelihood method via bridge estimator in Cox's proportional hazards model for parameter estimation and variable selection. Under reasonable conditions, we show that penalized empirical likelihood in Cox's proportional hazards model has the oracle property. A penalized empirical likelihood ratio for the vector of regression coefficients is defined and its limiting distribution is a chi-square distribution. The advantage of penalized empirical likelihood as a nonparametric likelihood approach is illustrated in hypothesis testing and in constructing confidence sets. The method is illustrated by extensive simulation studies and a real example.

4.
A new test of the proportional hazards assumption in the Cox model is proposed. The idea is based on Neyman's smooth tests. The Cox model with proportional hazards (i.e. time-constant covariate effects) is embedded in a model with a smoothly time-varying covariate effect that is expressed as a combination of some basis functions (e.g., Legendre polynomials, cosines). The smooth test is then the score test for significance of these artificial covariates. Furthermore, we apply a modification of Schwarz's selection rule to choose the dimension of the smooth model (the number of basis functions). The score test is then used in the selected model. In a simulation study, we compare the proposed tests with standard tests based on the score process.
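The artificial covariates can be built directly. A sketch using Legendre polynomials evaluated at rescaled event times (the score test itself and the Schwarz-type dimension choice are omitted; function names are ours):

```python
import numpy as np
from numpy.polynomial.legendre import legval

def legendre_basis(times, degree, tmax=None):
    """Evaluate P_1, ..., P_degree at event times rescaled to [-1, 1]."""
    times = np.asarray(times, float)
    tmax = times.max() if tmax is None else tmax
    u = 2.0 * times / tmax - 1.0
    cols = []
    for k in range(1, degree + 1):
        coef = np.zeros(k + 1)
        coef[k] = 1.0               # select the k-th Legendre polynomial
        cols.append(legval(u, coef))
    return np.column_stack(cols)

def artificial_covariates(x, times, degree):
    """Products x * P_k(t): under proportional hazards their true
    coefficients are zero, so a score test for them tests the assumption."""
    return np.asarray(x, float)[:, None] * legendre_basis(times, degree)
```

Feeding these products into a Cox fit as time-dependent covariates and score-testing their coefficients reproduces the structure of the smooth test described above.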

5.
When variable selection with stepwise regression and model fitting are conducted on the same data set, competition for inclusion in the model induces a selection bias in coefficient estimators away from zero. In proportional hazards regression with right-censored data, selection bias inflates the absolute values of the parameter estimates of selected variables, while the omission of other variables may shrink coefficients toward zero. This paper explores the extent of the bias in parameter estimates from stepwise proportional hazards regression and proposes a bootstrap method, similar to those proposed by Miller (Subset Selection in Regression, 2nd edn. Chapman & Hall/CRC, 2002) for linear regression, to correct for selection bias. We also use bootstrap methods to estimate the standard error of the adjusted estimators. Simulation results show that substantial biases could be present in uncorrected stepwise estimators and, for binary covariates, could exceed 250% of the true parameter value. The simulations also show that the conditional mean of the proposed bootstrap bias-corrected parameter estimator, given that a variable is selected, is moved closer to the unconditional mean of the standard partial likelihood estimator in the chosen model, and to the population value of the parameter. We also explore the effect of the adjustment on estimates of log relative risk, given the values of the covariates in a selected model. The proposed method is illustrated with data sets in primary biliary cirrhosis and in multiple myeloma from the Eastern Cooperative Oncology Group.
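The Miller-style bootstrap bias correction can be sketched on a toy selection problem. The "stepwise" step is replaced here by a deliberately simple stand-in (picking the variable with the largest absolute sample mean), which exhibits the same competition-induced inflation; this is our own illustration, not the paper's stepwise Cox procedure.

```python
import numpy as np

def selected_estimate(data):
    """Toy stand-in for stepwise selection: keep the variable whose sample
    mean is largest in absolute value and report that mean.  Competition
    for selection biases the reported value away from zero."""
    means = data.mean(axis=0)
    return means[np.argmax(np.abs(means))]

def bootstrap_bias_corrected(data, rng, n_boot=200):
    """Estimate the selection bias by re-running the selection on
    bootstrap resamples, then subtract it from the raw estimate."""
    theta_hat = selected_estimate(data)
    n = len(data)
    boot = np.empty(n_boot)
    for b in range(n_boot):
        boot[b] = selected_estimate(data[rng.integers(0, n, n)])
    return theta_hat, theta_hat - (boot.mean() - theta_hat)

rng = np.random.default_rng(0)
true_mean = 0.2
raw, corrected = [], []
for _ in range(100):
    data = rng.normal(true_mean, 1.0, size=(50, 5))  # 5 competing variables
    r, c = bootstrap_bias_corrected(data, rng)
    raw.append(r)
    corrected.append(c)
```

Averaged over the replications, the raw selected estimate overshoots the true value while the bootstrap-corrected one lands closer, mirroring the behavior the abstract reports for stepwise Cox regression.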

6.
In this paper, we develop Bayesian methodology and computational algorithms for variable subset selection in Cox proportional hazards models with missing covariate data. A new joint semi-conjugate prior for the piecewise exponential model is proposed in the presence of missing covariates and its properties are examined. The covariates are assumed to be missing at random (MAR). Under this new prior, a version of the Deviance Information Criterion (DIC) is proposed for Bayesian variable subset selection in the presence of missing covariates. Monte Carlo methods are developed for computing the DICs for all possible subset models in the model space. A Bone Marrow Transplant (BMT) dataset is used to illustrate the proposed methodology.

7.
Case-cohort designs are commonly used in large epidemiological studies to reduce the cost associated with covariate measurement. In many such studies the number of covariates is very large, so an efficient variable selection method is needed for case-cohort studies where the covariates are only observed in a subset of the sample. Current literature on this topic has focused on the proportional hazards model. However, in many studies the additive hazards model is preferred, either because the proportional hazards assumption is violated or because the additive hazards model provides more relevant information for the research question. Motivated by one such study, the Atherosclerosis Risk in Communities (ARIC) study, we investigate the properties of a regularized variable selection procedure in a stratified case-cohort design under an additive hazards model with a diverging number of parameters. We establish the consistency and asymptotic normality of the penalized estimator and prove its oracle property. Simulation studies are conducted to assess the finite sample performance of the proposed method with a modified cross-validation tuning parameter selection method. We apply the variable selection procedure to the ARIC study to demonstrate its practical use.
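Because the standard (Lin-Ying) estimating function for the additive hazards model is linear in the coefficients, say U(β) = b − Aβ, an L1-penalized estimator can be computed by coordinate descent on the equivalent quadratic objective ½βᵀAβ − bᵀβ + λ‖β‖₁. The sketch below assumes A and b have already been assembled from the data; the stratified case-cohort weighting of the paper is omitted, and the names are ours.

```python
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def penalized_additive_hazards(A, b, lam, iters=200):
    """Coordinate descent for 0.5 * beta'A beta - b'beta + lam * ||beta||_1,
    where U(beta) = b - A @ beta is the (already assembled) linear
    estimating function.  A must be positive definite."""
    p = len(b)
    beta = np.zeros(p)
    for _ in range(iters):
        for j in range(p):
            # partial residual: b_j minus contributions of the other coords
            r = b[j] - A[j] @ beta + A[j, j] * beta[j]
            beta[j] = soft_threshold(r, lam) / A[j, j]
    return beta
```

When A is the identity this reduces to coordinate-wise soft thresholding of b, which gives a quick sanity check on the update.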

8.
We propose a new method for fitting proportional hazards models with error-prone covariates. Regression coefficients are estimated by solving an estimating equation that is the average of the partial likelihood scores based on imputed true covariates. For the purpose of imputation, a linear spline model is assumed on the baseline hazard. We discuss consistency and asymptotic normality of the resulting estimators, and propose a stochastic approximation scheme to obtain the estimates. The algorithm is easy to implement, and reduces to the ordinary Cox partial likelihood approach when the measurement error has a degenerate distribution. Simulations indicate high efficiency and robustness. We consider the special case where error-prone replicates are available on the unobserved true covariates. As expected, increasing the number of replicates for the unobserved covariates increases efficiency and reduces bias. We illustrate the practical utility of the proposed method with an Eastern Cooperative Oncology Group clinical trial where a genetic marker, c-myc expression level, is subject to measurement error.

9.
The concepts of relative risk and hazard ratio are generalized for ordinal and continuous response variables, respectively. Under the generalized concepts, the Cox proportional hazards model with Breslow's and Efron's methods can be regarded as a generalization of the Mantel–Haenszel estimator for dealing with broader types of covariates and responses. When ordinal responses can be regarded as discretized observations of a hypothetical continuous variable, the estimated relative risks from the Cox model reflect the associations between the responses and covariates. Examples are given to illustrate the generalized concepts and wider applications of the Cox model and the Kaplan–Meier estimator.

10.
We examine the asymptotic and small sample properties of model-based and robust tests of the null hypothesis of no randomized treatment effect based on the partial likelihood arising from an arbitrarily misspecified Cox proportional hazards model. When the distribution of the censoring variable is either conditionally independent of the treatment group given covariates or conditionally independent of covariates given the treatment group, the numerators of the partial likelihood treatment score and Wald tests have asymptotic mean equal to 0 under the null hypothesis, regardless of whether or how the Cox model is misspecified. We show that the model-based variance estimators used in the calculation of the model-based tests are not, in general, consistent under model misspecification, yet using analytic considerations and simulations we show that their true sizes can be as close to the nominal value as tests calculated with robust variance estimators. As a special case, we show that the model-based log-rank test is asymptotically valid. When the Cox model is misspecified and the distribution of censoring depends on both treatment group and covariates, the asymptotic distributions of the resulting partial likelihood treatment score statistic and maximum partial likelihood estimator do not, in general, have a zero mean under the null hypothesis. Here neither the fully model-based tests, including the log-rank test, nor the robust tests will be asymptotically valid, and we show through simulations that the distortion to test size can be substantial.
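The log-rank test discussed above can be computed directly without fitting a Cox model. A minimal two-sample sketch of the chi-square statistic (our own helper, handling tied event times and skipping risk sets of size one):

```python
import numpy as np

def logrank_stat(t1, d1, t2, d2):
    """Two-sample log-rank chi-square statistic.
    t1, t2: event/censoring times; d1, d2: event indicators (1 = event)."""
    t = np.concatenate([t1, t2])
    d = np.concatenate([d1, d2])
    g = np.concatenate([np.zeros(len(t1)), np.ones(len(t2))])  # 0 = group 1
    num, var = 0.0, 0.0
    for tj in np.unique(t[d == 1]):            # distinct event times
        at_risk = t >= tj
        n = at_risk.sum()
        n1 = (at_risk & (g == 0)).sum()        # group-1 subjects at risk
        dj = ((t == tj) & (d == 1)).sum()      # total events at tj
        d1j = ((t == tj) & (d == 1) & (g == 0)).sum()
        num += d1j - dj * n1 / n               # observed minus expected
        if n > 1:
            var += dj * (n1 / n) * (1 - n1 / n) * (n - dj) / (n - 1)
    return num ** 2 / var
```

The statistic is compared to a chi-square distribution with one degree of freedom; for the four-subject example in the test below it can be verified by hand to equal 49/17.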

11.
The linear regression model for right censored data, also known as the accelerated failure time model using the logarithm of survival time as the response variable, is a useful alternative to the Cox proportional hazards model. Empirical likelihood as a non-parametric approach has been demonstrated to have many desirable merits thanks to its robustness against model misspecification. However, the linear regression model with right censored data cannot directly benefit from the empirical likelihood for inferences mainly because of dependent elements in estimating equations of the conventional approach. In this paper, we propose an empirical likelihood approach with a new estimating equation for linear regression with right censored data. A nested coordinate algorithm with majorization is used for solving the optimization problems with non-differentiable objective function. We show that the Wilks' theorem holds for the new empirical likelihood. We also consider the variable selection problem with empirical likelihood when the number of predictors can be large. Because the new estimating equation is non-differentiable, a quadratic approximation is applied to study the asymptotic properties of penalized empirical likelihood. We prove the oracle properties and evaluate the properties with simulated data. We apply our method to a Surveillance, Epidemiology, and End Results small intestine cancer dataset.

12.
The proportional hazards regression model is commonly used to evaluate the relationship between survival and covariates. Covariates are frequently measured with error. Substituting mismeasured values for the true covariates leads to biased estimation. Hu et al. (Biometrics 88 (1998) 447) have proposed to base estimation in the proportional hazards model with covariate measurement error on a joint likelihood for survival and the covariate variable. Nonparametric maximum likelihood estimation (NPMLE) was used and simulations were conducted to assess the asymptotic validity of this approach. In this paper, we derive a rigorous proof of asymptotic normality of the NPML estimators.

13.
The most widely used model for multidimensional survival analysis is the Cox model. This model is semi-parametric, since its hazard function is the product of an unspecified baseline hazard and a parametric functional form relating the hazard and the covariates. We consider a more flexible and fully nonparametric proportional hazards model, where the functional form of the covariate effect is left unspecified. In this model, estimation is based on the maximum likelihood method. Results obtained from a Monte Carlo experiment and from real data are presented. Finally, the advantages and the limitations of the approach are discussed.

14.
This article develops a local partial likelihood technique to estimate the time-dependent coefficients in Cox's regression model. The basic idea is a simple extension of the local linear fitting technique used in scatterplot smoothing. The coefficients are estimated locally based on the partial likelihood in a window around each time point. Multiple time-dependent covariates are incorporated in the local partial likelihood procedure. The procedure is useful as a diagnostic tool and can be used to uncover time-dependencies or departures from the proportional hazards model. The programming involved in local partial likelihood estimation is relatively simple and can be adapted with little effort from existing programs for the proportional hazards model. The asymptotic properties of the resulting estimator are established and compared with those from local constant fitting. A consistent estimator of the asymptotic variance is also proposed. The approach is illustrated by a real data set from a study of gastric cancer patients, and a simulation study is also presented.
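The local linear fitting that the method extends can be sketched in a few lines; the paper replaces this weighted least-squares criterion with a kernel-weighted partial likelihood in a window around each time point. The kernel and function names below are our own illustrative choices.

```python
import numpy as np

def epanechnikov(u):
    """Epanechnikov kernel, a common choice for local fitting."""
    return np.where(np.abs(u) <= 1, 0.75 * (1 - u ** 2), 0.0)

def local_linear(x, y, x0, h):
    """Local linear estimate of E[y | x = x0]: weighted least-squares fit of
    an intercept and slope in a kernel window of bandwidth h around x0;
    the fitted intercept is the estimate at x0."""
    w = epanechnikov((x - x0) / h)
    X = np.column_stack([np.ones_like(x), x - x0])
    XtW = X.T * w
    a = np.linalg.solve(XtW @ X, XtW @ y)
    return a[0]
```

A useful property (and a handy sanity check) is that the local linear fit reproduces an exactly linear relationship without bias, which local constant fitting does not.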

15.
In this article, we develop a Bayesian variable selection method for the Poisson change-point regression model with both discrete and continuous candidate covariates. Ranging from a null model with no selected covariates to a full model including all covariates, the method searches the entire model space, estimates posterior inclusion probabilities of covariates, and obtains model-averaged estimates of the covariate coefficients, while simultaneously estimating a time-varying baseline rate due to change-points. For posterior computation, a Metropolis-Hastings within partially collapsed Gibbs sampler is developed to efficiently fit the Poisson change-point regression model with variable selection. We illustrate the proposed method using simulated and real datasets.
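A single building block of such a sampler can be sketched: a random-walk Metropolis update for one Poisson rate parameter. This is a minimal stand-in, not the paper's partially collapsed sampler; the Exponential(1) prior, step size, and burn-in choice are all illustrative assumptions.

```python
import numpy as np

def mh_poisson_rate(y, n_iter=4000, step=0.2, seed=0):
    """Random-walk Metropolis on log(rate) for Poisson counts y with an
    Exponential(1) prior on the rate."""
    rng = np.random.default_rng(seed)
    y = np.asarray(y)

    def log_post(log_lam):
        lam = np.exp(log_lam)
        # Poisson log-likelihood (dropping y! terms) + Exp(1) prior
        # + Jacobian of the log transform
        return y.sum() * log_lam - len(y) * lam - lam + log_lam

    cur = 0.0
    draws = np.empty(n_iter)
    for i in range(n_iter):
        prop = cur + step * rng.standard_normal()
        if np.log(rng.random()) < log_post(prop) - log_post(cur):
            cur = prop                      # accept the proposal
        draws[i] = cur
    return np.exp(draws[n_iter // 2:])      # discard first half as burn-in
```

With this conjugate setup the exact posterior is Gamma(Σy + 1, n + 1), so the sampler's posterior mean can be checked against (Σy + 1)/(n + 1).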

16.
The proportional hazards (Cox) model is generalized by assuming that at any moment the ratio of hazard rates depends not only on the values of covariates but also on the resources used up to that moment. Relations with generalized multiplicative, frailty, and linear transformation models are considered. A modified partial likelihood function is proposed, and the properties of the estimators are investigated.

17.
The traditional Cox proportional hazards regression model uses an exponential relative risk function. We argue that under various plausible scenarios the relative risk part of the model should be bounded, suggesting also that the traditional model often might overdramatize the hazard rate assessment for individuals with unusual covariates. This motivates our working with proportional hazards models where the relative risk function takes a logistic form. We provide frequentist methods, based on the partial likelihood, and then go on to semiparametric Bayesian constructions. These involve a Beta process for the cumulative baseline hazard function and any prior with a density, for example one dictated by a Jeffreys-type argument, for the regression coefficients. The posterior is derived using machinery for Lévy processes, and a simulation recipe is devised for sampling from the posterior distribution of any quantity. Our methods are illustrated on real data. A Bernshteĭn–von Mises theorem is reached for our class of semiparametric priors, guaranteeing asymptotic normality of the posterior processes.

18.
This paper considers covariate selection for the additive hazards model. This model is particularly simple to study theoretically, and its practical implementation has several major advantages over the similar methodology for the proportional hazards model. One complication compared with the proportional model, however, is that there is no simple likelihood to work with. We study a least squares criterion with desirable properties and show how this criterion can be interpreted as a prediction error. Given this criterion, we define ridge and Lasso estimators as well as an adaptive Lasso, and study their large sample properties for the situation where the number of covariates p is smaller than the number of observations. We also show that the adaptive Lasso has the oracle property. In many practical situations it is more relevant to tackle the case where p is large compared with the number of observations. We do this by studying the properties of the so-called Dantzig selector in the setting of the additive risk model. Specifically, we establish a bound on how close the solution is to a true sparse signal in the case where the number of covariates is large. In a simulation study, we also compare the Dantzig and adaptive Lasso for a moderate to small number of covariates. The methods are applied to a breast cancer data set with gene expression recordings and to the primary biliary cirrhosis clinical data.
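The adaptive Lasso's coefficient-specific penalty weights can be implemented with the standard column-rescaling trick: solve a plain lasso on rescaled columns, then scale the solution back. The sketch below uses a least-squares criterion for brevity, whereas the paper works with the additive-hazards prediction-error criterion; all names and tuning values are ours.

```python
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_ista(X, y, lam, iters=5000):
    """Plain lasso for least squares via proximal gradient (ISTA)."""
    n, p = X.shape
    lr = 1.0 / np.linalg.eigvalsh(X.T @ X / n).max()  # safe step size
    beta = np.zeros(p)
    for _ in range(iters):
        g = X.T @ (X @ beta - y) / n
        beta = soft_threshold(beta - lr * g, lr * lam)
    return beta

def adaptive_lasso(X, y, lam, gamma=1.0):
    """Adaptive lasso: penalize coefficient j by lam / |beta_init_j|^gamma,
    implemented by rescaling columns, solving a plain lasso, and scaling back."""
    beta_init = np.linalg.lstsq(X, y, rcond=None)[0]  # initial estimator
    w = np.abs(beta_init) ** gamma + 1e-8             # avoid divide-by-zero
    beta_scaled = lasso_ista(X * w, y, lam)
    return w * beta_scaled
```

Because small initial estimates get large penalty weights, noise coefficients are driven exactly to zero while strong signals are barely shrunk, which is the mechanism behind the oracle property mentioned above.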

19.
Although Cox proportional hazards regression is the default analysis for time-to-event data, there is typically uncertainty about whether the effects of a predictor are more appropriately characterized by a multiplicative or additive model. To accommodate this uncertainty, we place a model selection prior on the coefficients in an additive-multiplicative hazards model. This prior assigns positive probability not only to the model that has both additive and multiplicative effects for each predictor, but also to sub-models corresponding to no association, to only additive effects, and to only proportional effects. The additive component of the model is constrained to ensure non-negative hazards, a condition often violated by current methods. After augmenting the data with Poisson latent variables, the prior is conditionally conjugate, and posterior computation can proceed via an efficient Gibbs sampling algorithm. Simulation study results are presented, and the methodology is illustrated using data from the Framingham heart study.

20.
Variable selection is fundamental to high-dimensional statistical modeling in diverse fields of science. In our health study, different statistical methods are applied to analyze annual trauma data collected by 30 general hospitals in Greece. The dataset consists of 6334 observations and 111 factors that include demographic, transport, and clinical data. The statistical methods employed in this work are the nonconcave penalized likelihood methods Smoothly Clipped Absolute Deviation (SCAD), Least Absolute Shrinkage and Selection Operator (Lasso), and Hard thresholding, the maximum partial likelihood estimation method, and best subset variable selection, adapted to Cox's proportional hazards model and used to detect possible risk factors affecting the length of stay in a hospital. A variety of different statistical models are considered with respect to combinations of factors, while censored observations are present. A comparative survey reveals several differences between the results and execution times of each method. Finally, we provide useful biological justification of our results.
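Of the penalties compared above, SCAD has a closed form (Fan & Li, 2001) that is worth writing out: it is lasso-like near zero but levels off, so large coefficients are not over-shrunk. A direct transcription:

```python
import numpy as np

def scad_penalty(theta, lam, a=3.7):
    """Smoothly Clipped Absolute Deviation penalty.
    Piecewise: linear for |t| <= lam, quadratic blend for lam < |t| <= a*lam,
    constant lam^2 * (a + 1) / 2 beyond; a = 3.7 is the conventional default."""
    t = np.abs(np.asarray(theta, float))
    small = t <= lam
    mid = (t > lam) & (t <= a * lam)
    pen = np.empty_like(t)
    pen[small] = lam * t[small]
    pen[mid] = (2 * a * lam * t[mid] - t[mid] ** 2 - lam ** 2) / (2 * (a - 1))
    pen[~(small | mid)] = lam ** 2 * (a + 1) / 2
    return pen
```

The three pieces meet continuously at |θ| = λ and |θ| = aλ, which is easy to verify numerically (e.g., with λ = 1 and a = 3.7 the penalty plateaus at 2.35).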
