Similar Documents
20 similar documents retrieved (search time: 15 ms)
1.
Sensitivity analysis for unmeasured confounding should be reported more often, especially in observational studies. In the standard Cox proportional hazards model, this requires substantial assumptions and can be computationally difficult. The marginal structural Cox proportional hazards model (Cox proportional hazards MSM) with inverse probability weighting has several advantages over the standard Cox model, including in situations with only one assessment of exposure (point exposure) and time-independent confounders. We describe how simple computations provide a sensitivity analysis for unmeasured confounding in a Cox proportional hazards MSM with point exposure. This is achieved by translating the general framework for sensitivity analysis for MSMs by Robins and colleagues to survival time data. Instead of bias-corrected observations, we correct the hazard rate to adjust for a specified amount of unmeasured confounding. As an additional bonus, the Cox proportional hazards MSM is robust against bias from differential loss to follow-up. As an illustration, the Cox proportional hazards MSM was applied in a reanalysis of the association between smoking and depression in a population-based cohort of Norwegian adults. The association was moderately sensitive to unmeasured confounding.
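As a rough illustration of the point-exposure weighting step described above, the sketch below builds stabilized inverse-probability-of-treatment weights from a logistic propensity model and passes them to a weighted Cox fit with a robust (sandwich) variance. It is a minimal sketch, not the authors' implementation; the data frame `df` and the column names (`time`, `event`, `smoker`, `age`, `sex`) are hypothetical.

```python
import numpy as np
from lifelines import CoxPHFitter
from sklearn.linear_model import LogisticRegression

# df: one row per subject with hypothetical columns
#   time, event  -- follow-up time and event indicator (e.g. depression)
#   smoker       -- point exposure (0/1)
#   age, sex     -- measured baseline confounders

# Propensity score for the point exposure
ps = LogisticRegression(max_iter=1000).fit(
    df[["age", "sex"]], df["smoker"]
).predict_proba(df[["age", "sex"]])[:, 1]

# Stabilized inverse-probability-of-treatment weights
p_exposed = df["smoker"].mean()
df["sw"] = np.where(df["smoker"] == 1,
                    p_exposed / ps, (1 - p_exposed) / (1 - ps))

# Weighted Cox fit; robust=True requests a sandwich variance,
# which is needed because weighting invalidates the usual one.
cph = CoxPHFitter()
cph.fit(df[["time", "event", "smoker", "sw"]],
        duration_col="time", event_col="event",
        weights_col="sw", robust=True)
cph.print_summary()
```

A sensitivity analysis along the lines sketched in the abstract would then rescale the fitted hazard ratio by a specified amount of unmeasured confounding rather than refit the model.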

2.
In this paper, we deal with the analysis of case series. The self-controlled case series (SCCS) method was developed to analyse the temporal association between a time-varying exposure and an outcome event. We apply the SCCS method to the vaccination data of the German Examination Survey for Children and Adolescents (KiGGS). We illustrate that the standard SCCS method cannot be applied to terminal events such as death. In this situation, an extension of SCCS adjusted for terminal events gives unbiased point estimators. The key question of this paper is whether the general Cox regression model with time-dependent covariates may be an alternative to the adjusted SCCS method for terminal events. In contrast to the SCCS method, Cox regression is included in most software packages (SPSS, SAS, STATA, R, …) and is easy to use. We show that Cox regression can be used to test the null hypothesis. In our KiGGS example without censored data, Cox regression and the adjusted SCCS method yield point estimates almost identical to those of the standard SCCS method. We have conducted several simulation studies to complete the comparison of the two methods. Cox regression shows a tendency to underestimate the true effect with prolonged risk periods and strong effects (relative incidence > 2). If the risk of the event is strongly affected by age, the adjusted SCCS method slightly overestimates the predefined exposure effect. Cox regression has the same efficiency as the adjusted SCCS method in the simulations.
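To make the Cox alternative concrete, the sketch below arranges each child's follow-up in counting-process (start, stop] format, with an indicator that switches on during a post-vaccination risk window, and fits a time-varying Cox model. This is a generic sketch, not the KiGGS analysis; the input frame `subjects`, the 42-day risk window, and the column names are assumptions.

```python
import pandas as pd
from lifelines import CoxTimeVaryingFitter

RISK_WINDOW = 42  # hypothetical post-vaccination risk period, in days

def to_counting_process(subjects: pd.DataFrame) -> pd.DataFrame:
    """Split each subject's follow-up at vaccination and at the end of the
    risk window, so 'at_risk' becomes a time-varying 0/1 covariate."""
    rows = []
    for _, s in subjects.iterrows():
        cuts = sorted({0.0, s["vax_day"], s["vax_day"] + RISK_WINDOW, s["end_day"]})
        cuts = [c for c in cuts if 0.0 <= c <= s["end_day"]]
        for start, stop in zip(cuts[:-1], cuts[1:]):
            rows.append({
                "id": s["id"], "start": start, "stop": stop,
                "at_risk": int(s["vax_day"] <= start < s["vax_day"] + RISK_WINDOW),
                "event": int(s["event"] and stop == s["end_day"]),
            })
    return pd.DataFrame(rows)

episodes = to_counting_process(subjects)  # 'subjects' is a hypothetical input frame
ctv = CoxTimeVaryingFitter()
ctv.fit(episodes, id_col="id", event_col="event",
        start_col="start", stop_col="stop")
ctv.print_summary()
```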

3.
This paper proposes a power-transformed linear quantile regression model for the residual lifetime of competing risks data. The proposed model can describe the association between the covariates and any quantile of a time-to-event distribution among survivors beyond a specific time point. Under covariate-dependent censoring, we develop a two-step estimation procedure, consisting of an unbiased monotone estimating equation for the regression parameters and cumulative sum processes for the Box–Cox transformation parameter. The asymptotic properties of the estimators are also derived. We employ an efficient bootstrap method for the estimation of the variance–covariance matrix. The finite-sample performance of the proposed approaches is evaluated through simulation studies and a real example.

4.
The aim of this study is to determine the effect of informative priors for variables with missing values and to compare Bayesian Cox regression with standard Cox regression analysis. For this purpose, simulated data sets of different sample sizes and with different missing rates were first generated, and each data set was analysed by Cox regression and by Bayesian Cox regression with informative priors. Second, a real lung cancer data set was analysed. Consequently, using informative priors for variables with missing values solved the missing data problem.

5.
Summary. We propose a simple estimation procedure for a proportional hazards frailty regression model for clustered survival data in which the dependence is generated by a positive stable distribution. Inferences for the frailty parameter can be obtained by using output from Cox regression analyses. The computational burden is substantially less than that of the other approaches to estimation. The large sample behaviour of the estimator is studied and simulations show that the approximations are appropriate for use with realistic sample sizes. The methods are motivated by studies of familial associations in the natural history of diseases. Their practical utility is illustrated with sib pair data from Beaver Dam, Wisconsin.

6.
Abstract. We propose a spline-based semiparametric maximum likelihood approach to analysing the Cox model with interval-censored data. With this approach, the baseline cumulative hazard function is approximated by a monotone B-spline function. We extend the generalized Rosen algorithm to compute the maximum likelihood estimate. We show that the estimator of the regression parameter is asymptotically normal and semiparametrically efficient, although the estimator of the baseline cumulative hazard function converges at a rate slower than root-n. We also develop an easy-to-implement method for consistently estimating the standard error of the estimated regression parameter, which facilitates the proposed inference procedure for the Cox model with interval-censored data. The proposed method is evaluated by simulation studies regarding its finite sample performance and is illustrated using data from a breast cosmesis study.

7.
This article builds classical and Bayesian testing procedures for choosing between non-nested multivariate regression models. Although there are several classical tests for discriminating between univariate regressions, only the Cox test is able to consistently handle the multivariate case. We then derive the limiting distribution of the Cox statistic in this context, correcting an earlier derivation in the literature. Further, we show how to build alternative Bayes factors for testing non-nested multivariate linear regression models. In particular, we compute expressions for the posterior Bayes factor, the fractional Bayes factor, and the intrinsic Bayes factor.

8.
We analyze left-truncated and right-censored (LTRC) data using the additive-multiplicative Cox–Aalen model proposed by Scheike and Zhang (2002), which extends both the Cox regression model and the additive Aalen model. Based on the conditional likelihood function, we derive weighted least-squares (WLS) estimators for the regression parameters and cumulative intensity functions of the model. The estimators are shown to be consistent and asymptotically normal. A simulation study is conducted to investigate the performance of the proposed estimators.

9.
Abstract. We study a binary regression model using the complementary log–log link, where the response variable Δ is the indicator of an event of interest (for example, the incidence of cancer or the detection of a tumour) and the set of covariates can be partitioned as (X, Z), where Z (real valued) is the primary covariate and X (vector valued) denotes a set of control variables. The conditional probability of the event of interest is assumed to be monotonic in Z for every fixed X. A finite-dimensional (regression) parameter β describes the effect of X. We show that the baseline conditional probability function (corresponding to X = 0) can be estimated by isotonic regression procedures, and we develop an asymptotically pivotal likelihood-ratio-based method for constructing (asymptotic) confidence sets for the regression function. We also show how likelihood-ratio-based confidence intervals for the regression parameter can be constructed using the chi-square distribution. An interesting connection to the Cox proportional hazards model under current status censoring emerges. We present simulation results to illustrate the theory and apply our results to a data set involving lung tumour incidence in mice.
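For intuition about the isotonic estimation step, the sketch below estimates a monotone conditional probability P(Δ = 1 | Z = z) at the baseline covariate level using the pool-adjacent-violators algorithm; for a binary response, isotonic least squares coincides with the monotone maximum likelihood estimate. It is a toy illustration with simulated data, not the authors' likelihood-ratio machinery.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(0)

# Simulated baseline data: a dose-like covariate z and a binary outcome
# whose success probability increases monotonically in z (cloglog form).
n = 400
z = rng.uniform(0.0, 3.0, size=n)
p_true = 1.0 - np.exp(-np.exp(-2.0 + 1.2 * z))   # monotone in z
delta = rng.binomial(1, p_true)

# Isotonic (monotone non-decreasing) estimate of P(delta = 1 | z)
iso = IsotonicRegression(y_min=0.0, y_max=1.0, increasing=True,
                         out_of_bounds="clip")
p_hat = iso.fit_transform(z, delta)

# Evaluate the fitted monotone curve on a grid
grid = np.linspace(0.1, 2.9, 15)
print(np.round(iso.predict(grid), 3))
```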

10.
In survival analysis, time-dependent covariates are usually present as longitudinal data collected periodically and measured with error. The longitudinal data can be assumed to follow a linear mixed effects model, and Cox regression models may be used for modelling the survival events. The hazard rate of the survival times depends on the underlying time-dependent covariate, measured with error, which may be described by random effects. Most existing methods proposed for such models assume a parametric distribution for the random effects and specify a normally distributed error term for the linear mixed effects model. These assumptions may not always be valid in practice. In this article, we propose a new likelihood method for Cox regression models with error-contaminated time-dependent covariates. The proposed method does not require any parametric distribution assumption on the random effects and random errors. Asymptotic properties of the parameter estimators are provided. Simulation results show that under certain situations the proposed method is more efficient than existing methods.

11.
Pao-sheng Shen, Statistics, 2015, 49(3): 602–613
For the regression parameter β in the Cox model, there have been several estimators based on different types of approximated likelihood. For right-censored data, Ren and Zhou [Full likelihood inferences in the Cox model: an empirical approach. Ann Inst Statist Math. 2011;63:1005–1018] derive the full likelihood function for (β, F0), where F0 is the baseline distribution function in the Cox model. In this article, we extend their results to left-truncated and right-censored data with discrete covariates. Using the empirical likelihood parameterization, we obtain the full-profile likelihood function for β when the covariates are discrete. Simulation results indicate that the maximum likelihood estimator outperforms Cox's partial likelihood estimator in finite samples.

12.
With competing risks data, one often needs to assess the treatment and covariate effects on the cumulative incidence function. Fine and Gray proposed a proportional hazards regression model for the subdistribution of a competing risk under the assumption that the censoring distribution and the covariates are independent. Covariate-dependent censoring sometimes occurs in medical studies. In this paper, we study the proportional hazards regression model for the subdistribution of a competing risk with proper adjustments for covariate-dependent censoring. We consider a covariate-adjusted weight function obtained by fitting a Cox model for the censoring distribution and using the resulting predictive probability for each individual. Our simulation study shows that the covariate-adjusted weight estimator is essentially unbiased when the censoring time depends on the covariates, and that the covariate-adjusted weight approach also works well for the variance estimator. We illustrate our methods with bone marrow transplant data from the Center for International Blood and Marrow Transplant Research, where cancer relapse and death in complete remission are two competing risks.
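The weight construction described here can be sketched as follows: fit a Cox model to the censoring times (treating censoring as the "event"), and use each subject's predicted censoring-survival probability G(t | x) as the covariate-adjusted IPCW weight at time t. The sketch stops at the weight computation; plugging the weights into the Fine–Gray estimating equations is omitted. The data frame and column names (`df`, `time`, `status`, `x1`, `x2`) are hypothetical.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# df: hypothetical columns
#   time    -- observed time
#   status  -- 0 = censored, 1 = cause of interest, 2 = competing cause
#   x1, x2  -- covariates

# Step 1: model the censoring distribution with its own Cox fit,
# treating censoring (status == 0) as the event of interest.
cens = df.assign(cens_event=(df["status"] == 0).astype(int))
cox_cens = CoxPHFitter()
cox_cens.fit(cens[["time", "cens_event", "x1", "x2"]],
             duration_col="time", event_col="cens_event")

# Step 2: covariate-adjusted censoring survival G(t | x_i),
# evaluated on a common grid of event times.
event_times = np.sort(df.loc[df["status"] == 1, "time"].unique())
G_hat = cox_cens.predict_survival_function(df[["x1", "x2"]],
                                           times=event_times)

# G_hat has one row per time and one column per subject; its reciprocal
# gives the covariate-adjusted IPCW weights that enter the weighted
# subdistribution-hazard estimating equations.
ipcw = 1.0 / np.clip(G_hat.values, 1e-8, None)
```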

13.
Most methods for survival prediction from high-dimensional genomic data combine the Cox proportional hazards model with some technique of dimension reduction, such as partial least squares regression (PLS). Applying PLS to the Cox model is not entirely straightforward, and multiple approaches have been proposed. The method of Park et al. (Bioinformatics 18(Suppl. 1):S120–S127, 2002) uses a reformulation of the Cox likelihood as a Poisson-type likelihood, thereby enabling estimation by iteratively reweighted partial least squares for generalized linear models. We propose a modification of the method of Park et al. (2002) such that estimates of the baseline hazard and the gene effects are obtained in separate steps. The resulting method has several advantages over the method of Park et al. (2002) and other existing Cox-PLS approaches, as it allows for estimation of survival probabilities for new patients, enables a less memory-demanding estimation procedure, and allows for incorporation of lower-dimensional non-genomic variables like disease grade and tumor thickness. We also propose to combine our Cox-PLS method with an initial gene selection step in which genes are ordered by their Cox score and only the highest-ranking k% of the genes are retained, obtaining a so-called supervised partial least squares regression method. In simulations, both the unsupervised and the supervised versions outperform other Cox-PLS methods.
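The supervised screening step mentioned above (rank genes by a univariate Cox score and keep the top k%) can be sketched as below, before any PLS step is run. Ranking here uses the absolute Wald z-statistic from univariate Cox fits as a stand-in for the Cox score statistic; the expression matrix `X_genes`, the `time`/`event` arrays, and the 10% cut-off are all assumptions.

```python
import pandas as pd
from lifelines import CoxPHFitter

def supervised_gene_screen(X_genes: pd.DataFrame,
                           time: pd.Series,
                           event: pd.Series,
                           keep_fraction: float = 0.10) -> pd.DataFrame:
    """Rank genes by the absolute Wald z-statistic from univariate Cox fits
    and return the expression matrix restricted to the top-ranked genes."""
    scores = {}
    for gene in X_genes.columns:
        d = pd.DataFrame({"time": time, "event": event, "expr": X_genes[gene]})
        cph = CoxPHFitter(penalizer=0.01)   # small ridge penalty for stability
        cph.fit(d, duration_col="time", event_col="event")
        scores[gene] = abs(cph.summary.loc["expr", "z"])
    ranked = sorted(scores, key=scores.get, reverse=True)
    k = max(1, int(keep_fraction * len(ranked)))
    return X_genes[ranked[:k]]

# The retained genes would then feed the Cox-PLS dimension-reduction step:
# X_top = supervised_gene_screen(X_genes, time, event, keep_fraction=0.10)
```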

14.
Regression calibration is a simple method for estimating regression models when covariate data are missing for some study subjects. It consists of replacing an unobserved covariate by an estimate of its conditional expectation given the available covariates. Regression calibration has recently been investigated in various regression models, such as the linear, generalized linear, and proportional hazards models. The aim of this paper is to investigate the appropriateness of this method for estimating the stratified Cox regression model when values of the covariate defining the strata are missing. Despite its practical relevance, this problem has not yet been discussed in the literature. Asymptotic distribution theory is developed for the regression calibration estimator in this setting. A simulation study is also conducted to investigate the properties of this estimator.
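As a concrete and simplified picture of regression calibration with a missing stratum variable, the sketch below predicts the missing categorical stratum from the observed covariates using complete cases (taking the most probable category as the fill-in, a pragmatic stand-in for the conditional expectation the method prescribes) and then fits a stratified Cox model. The column names (`stage`, `age`, `biomarker`, `time`, `event`) and the multinomial-logistic calibration model are illustrative assumptions, not the paper's exact estimator.

```python
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

# df: hypothetical columns
#   time, event      -- survival outcome
#   age, biomarker   -- fully observed covariates
#   stage            -- categorical stratum variable, missing for some subjects

obs = df["stage"].notna()

# Calibration model fitted on complete cases: predict the stratum from
# the observed covariates.
calib = LogisticRegression(max_iter=1000)
calib.fit(df.loc[obs, ["age", "biomarker"]], df.loc[obs, "stage"])

df = df.copy()
df.loc[~obs, "stage"] = calib.predict(df.loc[~obs, ["age", "biomarker"]])

# Stratified Cox fit: a separate baseline hazard per (calibrated) stratum.
cph = CoxPHFitter()
cph.fit(df[["time", "event", "age", "biomarker", "stage"]],
        duration_col="time", event_col="event", strata=["stage"])
cph.print_summary()
```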

15.
In some clinical, environmental, or economic studies, researchers are interested in a semi-continuous outcome variable that takes the value zero with a discrete probability and has a continuous distribution for the non-zero values. Due to the measuring mechanism, it is not always possible to fully observe some outcomes, and only an upper bound is recorded. We call such data left-censored: we observe only the maximum of the outcome and an independent censoring variable, together with a censoring indicator. In this article, we introduce a mixture semi-parametric regression model. We use a parametric model to investigate the influence of covariates on the discrete probability of the value zero, and, for the non-zero part of the outcome, a semi-parametric Cox regression model to study the conditional hazard function. The different parameters in this mixture model are estimated by a likelihood method, in which the infinite-dimensional baseline hazard function is estimated by a step function. We establish the identifiability of the model and the consistency of the estimators of the different parameters. We study the finite-sample behaviour of the estimators through a simulation study and illustrate the model on a practical data example.

16.
Identification of influential genes and clinical covariates for the survival of patients is crucial because it can lead to a better understanding of the underlying mechanisms of disease and to better prediction models. Most variable selection methods in penalized Cox models cannot deal properly with categorical variables such as gender and family history. The group lasso penalty can combine clinical and genomic covariates effectively. In this article, we introduce an optimization algorithm for Cox regression with a group lasso penalty. We compare our method with other methods on simulated and real microarray data sets.
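The core computation for a group-lasso-penalized Cox fit can be sketched with proximal gradient descent: a gradient step on the negative log partial likelihood followed by block soft-thresholding of each covariate group. The sketch below uses the Breslow form without ties, a fixed step size, and a made-up group structure; it illustrates the idea rather than the optimization algorithm the article actually proposes.

```python
import numpy as np

def neg_logpl_grad(beta, X, time, event):
    """Gradient of the (Breslow, no-ties) negative log partial likelihood."""
    eta = X @ beta
    w = np.exp(eta - eta.max())          # rescaled for numerical stability
    grad = np.zeros_like(beta)
    for i in np.where(event == 1)[0]:
        risk = time >= time[i]           # risk set at the i-th event time
        denom = w[risk].sum()
        grad -= X[i] - (w[risk][:, None] * X[risk]).sum(axis=0) / denom
    return grad

def group_soft_threshold(beta, groups, thresh):
    """Block soft-thresholding: the proximal operator of the group lasso."""
    out = beta.copy()
    for g in np.unique(groups):
        idx = groups == g
        norm = np.linalg.norm(beta[idx])
        scale = max(0.0, 1.0 - thresh * np.sqrt(idx.sum()) / norm) if norm > 0 else 0.0
        out[idx] = scale * beta[idx]
    return out

def cox_group_lasso(X, time, event, groups, lam=0.1, step=1e-3, n_iter=2000):
    """Proximal gradient descent for group-lasso-penalized Cox regression."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = neg_logpl_grad(beta, X, time, event)
        beta = group_soft_threshold(beta - step * grad, groups, step * lam)
    return beta

# 'groups' assigns each column of X to a group, e.g. dummy columns of one
# categorical factor share a group: groups = np.array([0, 0, 1, 2, 2, 2])
```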

17.
Abstract. The traditional Cox proportional hazards regression model uses an exponential relative risk function. We argue that under various plausible scenarios, the relative risk part of the model should be bounded, suggesting also that the traditional model often might overdramatize the hazard rate assessment for individuals with unusual covariates. This motivates our working with proportional hazards models where the relative risk function takes a logistic form. We provide frequentist methods, based on the partial likelihood, and then go on to semiparametric Bayesian constructions. These involve a Beta process for the cumulative baseline hazard function and any prior with a density, for example that dictated by a Jeffreys-type argument, for the regression coefficients. The posterior is derived using machinery for Lévy processes, and a simulation recipe is devised for sampling from the posterior distribution of any quantity. Our methods are illustrated on real data. A Bernshteĭn–von Mises theorem is reached for our class of semiparametric priors, guaranteeing asymptotic normality of the posterior processes.
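To see how a bounded relative risk changes the fitting step, the sketch below maximizes a Cox-type partial likelihood in which the relative risk is taken to be the logistic function r(x'β) = 1/(1 + exp(-x'β)) instead of exp(x'β). Only the frequentist partial-likelihood part is sketched, with this specific logistic form chosen for illustration; the Beta-process Bayesian machinery of the paper is not attempted here, and the simulated data and effect sizes are invented.

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_partial_likelihood(beta, X, time, event):
    """Partial likelihood with a bounded, logistic relative risk function."""
    eta = np.clip(X @ beta, -30.0, 30.0)
    r = 1.0 / (1.0 + np.exp(-eta))                 # relative risk in (0, 1)
    ll = 0.0
    for i in np.where(event == 1)[0]:
        risk_set = time >= time[i]
        ll += np.log(r[i]) - np.log(r[risk_set].sum())
    return -ll

def fit_logistic_relative_risk(X, time, event):
    beta0 = np.zeros(X.shape[1])
    res = minimize(neg_log_partial_likelihood, beta0,
                   args=(X, time, event), method="BFGS")
    return res.x

# Example with data simulated under the same bounded-risk model
rng = np.random.default_rng(1)
n = 300
X = rng.normal(size=(n, 2))
r_true = 1.0 / (1.0 + np.exp(-(X @ np.array([1.0, -0.5]))))
T = rng.exponential(1.0 / r_true)                  # baseline hazard set to 1
C = rng.exponential(2.0, size=n)                   # independent censoring
time, event = np.minimum(T, C), (T <= C).astype(int)

print(fit_logistic_relative_risk(X, time, event))
```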

18.
We analyse longitudinal data on CD4 cell counts from patients who participated in clinical trials that compared two therapeutic treatments: zidovudine and didanosine. The investigators were interested in modelling the CD4 cell count as a function of treatment, age at baseline and disease stage at baseline. Serious concerns can be raised about the normality assumption for CD4 cell counts that is implicit in many methods, and therefore an analysis may have to start with a transformation. Instead of assuming that we know the transformation (e.g. logarithmic) that makes the outcome normal and linearly related to the covariates, we estimate the transformation, by maximum likelihood, within the Box–Cox family. There has been considerable work on the Box–Cox transformation for univariate regression models. Here, we discuss the Box–Cox transformation for longitudinal regression models when the outcome can be missing over time, and we also implement a maximization method for the likelihood, assuming that the missing data are missing at random.
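One way to picture the estimation of the transformation is to profile the log-likelihood over the Box–Cox parameter λ: for each candidate λ, transform the outcome, fit the longitudinal model by maximum likelihood, and add the Jacobian term (λ - 1) Σ log y so that likelihoods are comparable across λ. The sketch below does this with a simple random-intercept model; the formula, the column names and the grid are assumptions, and the missing-data handling of the paper is not reproduced.

```python
import numpy as np
import statsmodels.api as sm

def boxcox_transform(y, lam):
    """Box-Cox transform; requires strictly positive outcomes."""
    return np.log(y) if abs(lam) < 1e-8 else (y**lam - 1.0) / lam

def profile_boxcox(df, lambdas):
    """Profile the (ML) log-likelihood of a random-intercept model over lambda."""
    y = df["cd4"].to_numpy(dtype=float)
    log_jacobian = np.sum(np.log(y))
    best = None
    for lam in lambdas:
        d = df.assign(z=boxcox_transform(y, lam))
        model = sm.MixedLM.from_formula("z ~ treat + age + stage",
                                        groups="id", data=d)
        res = model.fit(reml=False)              # ML, so fits are comparable
        llf = res.llf + (lam - 1.0) * log_jacobian   # Jacobian of the transform
        if best is None or llf > best[1]:
            best = (lam, llf)
    return best

# lam_hat, _ = profile_boxcox(cd4_long, np.linspace(-0.5, 1.0, 16))
```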

19.
In this article, we propose a parametric model for the distribution of time to first event when events are overdispersed and can be properly fitted by a Negative Binomial distribution. This is a very common situation in medical statistics, where the occurrence of events is summarized as a count for each patient and the simple Poisson model is not adequate to account for overdispersion of the data. In this situation, studying the time of occurrence of the first event can be of interest. From the Negative Binomial distribution of counts, we derive a new parametric model for the time to first event and apply it to fit the distribution of time to first relapse in multiple sclerosis (MS). We develop a regression version of the model, with methods for estimating covariate effects. We show that, just as the Negative Binomial model properly fits relapse count data, this new model matches the distribution of time to first relapse very closely, as tested in two large datasets of MS patients. Finally, we compare its performance in fitting time to first relapse in MS with that of other models widely used in survival analysis (the semiparametric Cox model and the parametric exponential, Weibull, log-logistic and log-normal models).
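The construction above can be made concrete by viewing the Negative Binomial as a gamma-mixed Poisson: if a patient's relapses over (0, t] are Poisson with rate νλt and the frailty ν is Gamma-distributed with mean 1 and variance θ, then P(no event by t) = (1 + θλt)^(-1/θ), which is the survival function of the time to first event. The sketch below fits (λ, θ) by maximum likelihood from possibly right-censored first-event times; the parameter names, the simulated data and the censoring scheme are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

def neg_loglik(params, t, delta):
    """Negative log-likelihood of the NB-derived time-to-first-event model.

    S(t) = (1 + theta*lam*t)^(-1/theta),
    f(t) = lam * (1 + theta*lam*t)^(-1/theta - 1),
    t: observed times, delta: 1 = first event observed, 0 = right-censored.
    """
    log_lam, log_theta = params                  # unconstrained parameterization
    lam, theta = np.exp(log_lam), np.exp(log_theta)
    base = 1.0 + theta * lam * t
    log_S = (-1.0 / theta) * np.log(base)
    log_f = np.log(lam) + (-1.0 / theta - 1.0) * np.log(base)
    return -np.sum(delta * log_f + (1.0 - delta) * log_S)

def fit_time_to_first_event(t, delta):
    res = minimize(neg_loglik, x0=np.zeros(2), args=(t, delta),
                   method="Nelder-Mead")
    lam, theta = np.exp(res.x)
    return lam, theta

# Example with data simulated from the same gamma-mixed Poisson model
rng = np.random.default_rng(2)
n, lam_true, theta_true = 500, 0.8, 1.5
nu = rng.gamma(shape=1.0 / theta_true, scale=theta_true, size=n)  # frailty, mean 1
T = rng.exponential(1.0 / (nu * lam_true))       # first event of a Poisson process
C = rng.uniform(0.5, 4.0, size=n)                # administrative censoring
t_obs, delta = np.minimum(T, C), (T <= C).astype(float)
print(fit_time_to_first_event(t_obs, delta))
```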

20.
A common problem in longitudinal data analyses is that subjects' follow-up is irregular and often related to the past outcome or to other factors associated with the outcome measure that are not included in the regression model. Analyses unadjusted for outcome-dependent follow-up yield biased estimates. We propose a longitudinal data analysis that can provide consistent estimates in regression models that are subject to outcome-dependent follow-up. We focus on semiparametric marginal log-link regression with an arbitrary unspecified baseline function. Based on estimating equations, the proposed class of estimators is root-n consistent and asymptotically normal. We present simulation studies that assess the performance of the estimators in finite samples. We illustrate our approach using data from a health services research study.

