Similar documents
 20 similar documents found (search time: 31 ms)
1.
Yu  Tingting  Wu  Lang  Gilbert  Peter 《Lifetime data analysis》2019,25(2):229-258

In HIV vaccine studies, longitudinal immune response biomarker data are often left-censored because of lower limits of quantification of the immunological assays employed. The censoring information is important for predicting HIV infection, the failure event of interest. We propose two approaches to addressing left censoring in longitudinal data: one makes no distributional assumption for the censored data, treating left-censored values as a “point mass” subgroup; the other makes a distributional assumption for a subset of the censored data but not for the remaining subset. We develop these two approaches to handling censoring for joint modelling of longitudinal and survival data via a Cox proportional hazards model fit by h-likelihood. We evaluate the new methods via simulation and analyze an HIV vaccine trial data set, finding that longitudinal characteristics of the immune response biomarkers are highly associated with the risk of HIV infection.

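The two-part "point mass" idea in the abstract above can be sketched numerically. The snippet below is a hedged illustration, not the authors' joint model: readings below a hypothetical lower limit of quantification (LOQ) contribute only a point-mass probability, with no distributional assumption below the limit, while quantified readings contribute a truncated-normal density. The normal choice and all numeric values are assumptions for concreteness.

```python
import numpy as np
from scipy.stats import norm

# Illustrative sketch (not the authors' model): left-censored readings
# below the LOQ form a "point mass" subgroup with probability p_below,
# while quantified readings get a normal density truncated to [loq, inf).
# All parameter values are hypothetical.
rng = np.random.default_rng(0)
loq = 1.0
raw = rng.normal(1.5, 1.0, size=500)       # latent biomarker values
observed = raw[raw >= loq]                 # quantified readings
n_below = int(np.sum(raw < loq))           # only the count is recorded

def two_part_loglik(p_below, mu, sigma):
    """Point-mass term for censored readings plus a truncated-normal
    log-density for the quantified ones."""
    ll = n_below * np.log(p_below) + observed.size * np.log(1.0 - p_below)
    ll += np.sum(norm.logpdf(observed, mu, sigma) - norm.logsf(loq, mu, sigma))
    return ll

p_hat = n_below / raw.size                 # MLE of the point mass
```

Because the point-mass probability separates cleanly from the parametric part of the likelihood, its MLE is simply the empirical fraction of readings below the LOQ.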

2.
In this paper we introduce a new three-parameter exponential-type distribution. The new distribution is quite flexible and can be used effectively in modeling survival data and reliability problems. It can have constant, decreasing, increasing, upside-down bathtub and bathtub-shaped hazard rate functions, and it generalizes some well-known distributions. We discuss maximum likelihood estimation of the model parameters for complete and censored samples. Additionally, we formulate a new cure rate survival model by assuming that the number of competing causes of the event of interest has the Poisson distribution and that the time to this event follows the proposed distribution. Maximum likelihood estimation of the parameters of this cure rate model is likewise discussed for complete and censored samples. Two applications to real data illustrate the flexibility of the new model in practice.
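As a hedged sketch of censored-sample maximum likelihood (using the one-parameter exponential as a stand-in, since the paper's three-parameter family is not reproduced here): observed events contribute the density f(t) to the likelihood, while right-censored times contribute the survival function S(t), which for the exponential yields a closed-form MLE.

```python
import numpy as np

# Right-censored MLE sketch with an exponential stand-in distribution
# (an assumption; not the paper's three-parameter family).  Events
# contribute log f(t) = log(rate) - rate*t; censored times contribute
# log S(t) = -rate*t.  Maximizing gives the classic closed form:
# rate_hat = (#events) / (total follow-up time).
rng = np.random.default_rng(1)
t = rng.exponential(2.0, size=1000)        # true rate = 0.5
c = rng.exponential(4.0, size=1000)        # censoring times
time = np.minimum(t, c)                    # observed follow-up
event = (t <= c).astype(int)               # 1 = failure observed

rate_hat = event.sum() / time.sum()        # censored-sample MLE
```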

3.
In some applications, the failure time of interest is the time from an originating event to a failure event, while both event times are interval censored. We propose fitting Cox proportional hazards models to this type of data using a spline-based sieve maximum marginal likelihood, in which the time to the originating event is integrated out of the empirical likelihood function of the failure time of interest. This greatly reduces the complexity of the objective function compared with the fully semiparametric likelihood. The dependence of the time of interest on the time to the originating event is induced by including the latter as a covariate in the proportional hazards model for the failure time of interest. The use of splines yields a higher rate of convergence of the estimator of the baseline hazard function than the usual non-parametric estimator. Computation of the estimator is facilitated by a multiple imputation approach. Asymptotic theory is established, and a simulation study assesses the finite sample performance. The method is also applied to a real data set on AIDS incubation time.

4.
Censoring of a longitudinal outcome often occurs when data are collected in a biomedical study in which the interest is in the survival and/or longitudinal experiences of a study population. In the setting considered herein, we encountered upper and lower censored data as the result of restrictions imposed on measurements from a kinetic model producing “biologically implausible” kidney clearances. The goal of this paper is to outline the use of a joint model to determine the association between a censored longitudinal outcome and a time-to-event endpoint. This paper extends Guo and Carlin's [6] approach to accommodate censored longitudinal data, in a commercially available software platform, by linking a mixed effects Tobit model to a suitable parametric survival distribution. Our simulation results showed that the joint Tobit model outperforms a joint model that uses the more naïve “fill-in” method for the longitudinal component, in which the upper and/or lower limits of censoring are replaced by the limit of detection. We illustrate the use of this approach with data from the hemodialysis (HEMO) study [3], examining the association between doubly censored kidney clearance values and survival.
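The Tobit-versus-"fill-in" comparison can be illustrated in a much-reduced form. The sketch below is a plain (non-mixed, non-joint) Tobit mean estimate against the naive fill-in estimate that replaces lower-censored values by the detection limit; all settings are assumptions chosen so the fill-in bias is visible.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Hedged sketch: plain Tobit MLE for a lower-censored normal outcome
# versus the "fill-in" method (censored values replaced by the limit of
# detection).  Not the paper's mixed-effects joint model.
rng = np.random.default_rng(2)
lod = 0.0
y = rng.normal(1.0, 1.0, size=20000)       # true mean 1.0
censored = y < lod
y_obs = np.where(censored, lod, y)         # what the analyst records

def neg_tobit_loglik(theta):
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)
    ll = norm.logpdf(y_obs[~censored], mu, sigma).sum()
    ll += censored.sum() * norm.logcdf(lod, mu, sigma)   # censored mass
    return -ll

mu_tobit = minimize(neg_tobit_loglik, x0=[0.0, 0.0],
                    method="Nelder-Mead").x[0]
mu_fillin = y_obs.mean()                   # biased upward in this setting
```

The Tobit likelihood uses the normal c.d.f. for the censored mass, so its mean estimate stays close to the truth, while the fill-in mean inherits the truncation bias.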

5.
In biostatistical applications interest often focuses on the estimation of the distribution of time between two consecutive events. If the initial event time is observed and the subsequent event time is only known to be larger or smaller than an observed point in time, then the data are described by the well-understood singly censored current status model, also known as interval censored data, case I. Jewell et al. (1994) extended this current status model by allowing the initial time to be unobserved, with its distribution over an observed interval [A, B] known; such data are referred to as doubly censored current status data. This model has applications in AIDS partner studies. If the initial time is known to be uniformly distributed, the model reduces to a submodel of the current status model with the same asymptotic information bounds as in the current status model, but the distribution of interest is essentially the derivative of the distribution of interest in the current status model. As a consequence the non-parametric maximum likelihood estimator is inconsistent. Moreover, this submodel contains only smooth heavy-tailed distributions for which no moments exist. In this paper, we discuss the connection between the singly censored and doubly censored current status models (for a uniform initial time) in detail and explain the difficulties in estimation that arise in the doubly censored case. We propose a regularized MLE corresponding to the current status model. We prove rate results and efficiency of smooth functionals of the regularized MLE, and present a generally applicable efficient method for estimation of regression parameters which does not rely on the existence of moments. We also discuss extending these ideas to a non-uniform distribution for the initial time.

6.
In this article, we investigate quantile regression analysis for semi-competing risks data, in which a non-terminal event may be dependently censored by a terminal event. Because of the dependent censoring, estimating the quantile regression coefficients for the non-terminal event is difficult. To handle this problem, we assume an Archimedean copula to specify the dependence between the non-terminal and the terminal event. Portnoy [Censored regression quantiles. J Amer Statist Assoc. 2003;98:1001–1012] considered the quantile regression model for right-censored data. We extend his approach to construct a weight function, which we then use to estimate the quantile regression parameter for the non-terminal event under semi-competing risks data. We also prove consistency and asymptotic properties of the proposed estimator. Simulation studies show that the proposed method performs well, and we apply it to analyse a real data set.
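The Archimedean-copula dependence assumption can be made concrete with one member of the family. The sketch below samples dependent uniforms from a Clayton copula via the standard conditional-inverse method; the Clayton choice and the parameter value are assumptions (the abstract does not fix a family).

```python
import numpy as np
from scipy.stats import kendalltau

# One concrete Archimedean choice (an assumption): sampling (U, V)
# from a Clayton copula with parameter theta via the conditional-
# inverse method.  For Clayton, Kendall's tau = theta / (theta + 2).
rng = np.random.default_rng(3)
theta = 2.0                                # implies tau = 0.5
u = rng.uniform(size=5000)
w = rng.uniform(size=5000)
v = ((w ** (-theta / (1.0 + theta)) - 1.0) * u ** (-theta) + 1.0) ** (-1.0 / theta)

# Map to dependent non-terminal / terminal event times through any
# marginal quantile function, e.g. unit exponential margins:
t_nonterminal = -np.log1p(-u)
t_terminal = -np.log1p(-v)
```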

7.
We formulate a new cure rate survival model by assuming that the number of competing causes of the event of interest has the Poisson distribution, and the time to this event has the generalized linear failure rate distribution. A new distribution to analyze lifetime data is defined from the proposed cure rate model, and its quantile function as well as a general expansion for the moments is derived. We estimate the parameters of the model with cure rate in the presence of covariates for censored observations using maximum likelihood and derive the observed information matrix. We obtain the appropriate matrices for assessing local influence on the parameter estimates under different perturbation schemes and present some ways to perform global influence analysis. The usefulness of the proposed cure rate survival model is illustrated in an application to real data.
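The Poisson cure rate construction described above has a simple closed form that is easy to verify numerically: with N ~ Poisson(theta) competing causes and cause-specific lifetime c.d.f. F, the population survival is S_pop(t) = exp(-theta * F(t)), which plateaus at the cure fraction exp(-theta). In the sketch below, F is taken exponential purely for illustration (the paper uses the generalized linear failure rate distribution).

```python
import numpy as np

# Promotion-time (Poisson) cure rate sketch: S_pop(t) = exp(-theta*F(t))
# is an improper survival function levelling off at exp(-theta), the
# long-run proportion of cured subjects.  The exponential F and the
# parameter values are assumptions for illustration only.
theta, rate = 1.5, 0.8
t = np.linspace(0.0, 20.0, 200)
F = 1.0 - np.exp(-rate * t)                # cause-specific lifetime c.d.f.
S_pop = np.exp(-theta * F)                 # improper population survival
cure_fraction = np.exp(-theta)
```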

8.
Li  Shuwei  Sun  Jianguo  Tian  Tian  Cui  Xia 《Lifetime data analysis》2020,26(2):315-338
Doubly censored failure time data occur when the failure time of interest represents the elapsed time between two events, an initial event and a subsequent event, and the...

9.
In biostatistical applications interest often focuses on the estimation of the distribution of the time T between two consecutive events. If the initial event time is observed and the subsequent event time is only known to be larger or smaller than an observed point in time, then the data are described by the well-understood singly censored current status model, also known as interval censored data, case I. Jewell et al. (1994) extended this current status model by allowing the initial time to be unobserved, with its distribution over an observed interval [A, B] known to be uniform; the data are referred to as doubly censored current status data. These authors used this model to handle applications in AIDS partner studies, focusing on the NPMLE of the distribution G of T. The model is a submodel of the current status model, but the distribution G is essentially the derivative of the distribution of interest F in the current status model. In this paper we establish that the NPMLE of G is uniformly consistent and that the resulting estimators for the n^{1/2}-estimable parameters are efficient. We propose an iterative weighted pool-adjacent-violators algorithm to compute the estimator. It is also shown that, without smoothness assumptions, the NPMLE of F converges at rate n^{-2/5} in L_2-norm, while the NPMLE of F in the non-parametric current status data model converges at rate n^{-1/3} in L_2-norm, which shows that there is a substantial gain in using the submodel information.
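The monotonizing step inside the iterative weighted pool-adjacent-violators algorithm mentioned above is the generic weighted PAVA, sketched below (the full NPMLE iteration is not reproduced). It projects a sequence onto the set of non-decreasing sequences under weighted least squares, which is how monotonicity of distribution-function estimates is enforced.

```python
import numpy as np

# Generic weighted pool-adjacent-violators sketch: pool adjacent blocks
# whose weighted means violate monotonicity into their weighted mean,
# until the block means are non-decreasing.
def pava(y, w=None):
    y = np.asarray(y, dtype=float)
    w = np.ones_like(y) if w is None else np.asarray(w, dtype=float)
    merged = []                            # blocks: [weighted sum, weight, count]
    for yi, wi in zip(y, w):
        merged.append([yi * wi, wi, 1])
        # pool while the last two block means violate monotonicity
        while (len(merged) > 1
               and merged[-2][0] / merged[-2][1] > merged[-1][0] / merged[-1][1]):
            s, ww, c = merged.pop()
            merged[-1][0] += s
            merged[-1][1] += ww
            merged[-1][2] += c
    out = []
    for s, ww, c in merged:
        out.extend([s / ww] * c)           # expand each block back out
    return np.array(out)
```

For example, `pava([1, 3, 2, 4])` pools the violating pair (3, 2) into their mean 2.5 and leaves the rest untouched.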

10.
Survival studies usually collect, on each participant, both the duration until some terminal event and repeated measures of a time-dependent covariate. Such a covariate is referred to as an internal time-dependent covariate. Usually, some subjects drop out of the study before occurrence of the terminal event of interest. One may then wish to evaluate the relationship between time to dropout and the internal covariate. The Cox model is a standard framework for that purpose. Here, we address this problem in situations where the value of the covariate at dropout is unobserved. We suggest a joint model which combines a first-order Markov model for the longitudinally measured covariate with a time-dependent Cox model for the dropout process. We consider maximum likelihood estimation in this model and show how estimation can be carried out via the EM algorithm. The suggested joint model may also have applications in the context of longitudinal data with nonignorable dropout; indeed, it can be viewed as generalizing Diggle and Kenward's model (1994) to situations where dropout may occur at any point in time and may be censored. Hence we apply both models and compare their results on a data set of longitudinal measurements among patients in a cancer clinical trial.

11.
In many longitudinal studies, a subject's response profile is closely associated with his or her risk of experiencing a related event. Examples of such event risks include recurrence of disease, relapse, drop-out and non-compliance. When evaluating the effect of a treatment, it is sometimes of interest to consider the joint process consisting of both the response and the risk of an associated event. Motivated by a prevention of depression study among patients with malignant melanoma, we examine a joint model that incorporates the risk of discontinuation into the analysis of serial depression measures. We present a maximum likelihood estimator for the mean response and event risk vectors. We test hypotheses about functions of mean depression and withdrawal risk profiles from our joint model, predict depression from updated patient histories, characterize associations between components of the joint process and estimate the probability that a patient's depression and risk of withdrawal exceed specified levels. We illustrate the application of our joint model by using the depression data.

12.
In medical research, it is common to have doubly censored survival data: the origin time and the event time are both subject to censoring. In this paper, we review simple and probability-based methods for imputing interval-censored origin times and compare the performance of these methods through extensive simulations in the one-sample problem, the two-sample problem and the Cox regression model problem. The use of a bootstrap procedure for inference is demonstrated.
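Two of the simple imputation schemes such a review compares can be sketched on synthetic data (all generating values below are assumptions): midpoint imputation and uniform random imputation of an interval-censored origin time over its observed interval [L, R], followed by a bootstrap for the mean imputed gap time.

```python
import numpy as np

# Hedged sketch of origin-time imputation for doubly censored data:
# a "simple" midpoint rule and a "probability-based" uniform draw over
# the origin interval, then a non-parametric bootstrap on the imputed
# gap times.  Synthetic data; not the review's actual simulations.
rng = np.random.default_rng(4)
n = 300
L = rng.uniform(0.0, 5.0, size=n)          # left ends of origin intervals
R = L + rng.uniform(0.5, 2.0, size=n)      # right ends
event_time = R + rng.exponential(3.0, size=n)

origin_mid = (L + R) / 2.0                 # midpoint imputation
origin_unif = rng.uniform(L, R)            # probability-based imputation
gap_mid = event_time - origin_mid          # imputed gap times

# bootstrap percentile interval for the mean gap time
boot = np.array([rng.choice(gap_mid, size=n, replace=True).mean()
                 for _ in range(500)])
ci = np.percentile(boot, [2.5, 97.5])
```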

13.
In biomedical studies where the event of interest is recurrent (e.g., hospitalization), it is often the case that the recurrent event sequence is subject to being stopped by a terminating event (e.g., death). In comparing treatment options, the marginal recurrent event mean is frequently of interest. One major complication in the recurrent/terminal event setting is that censoring times are not known for subjects observed to die, which renders standard risk-set-based methods of estimation inapplicable. We propose two semiparametric methods for estimating the difference or ratio of treatment-specific marginal mean numbers of events. The first method involves imputing unobserved censoring times, while the second uses inverse probability of censoring weighting. In each case, imbalances in the treatment-specific covariate distributions are adjusted out through inverse probability of treatment weighting. After the imputation and/or weighting, the treatment-specific means (and then their difference or ratio) are estimated nonparametrically. Large-sample properties are derived for each of the proposed estimators, with finite sample properties assessed through simulation. The proposed methods are applied to kidney transplant data.
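The core inverse probability of censoring weighting (IPCW) device can be sketched in isolation (this is the generic building block, not the authors' full recurrent/terminal-event estimator): estimate the censoring survival curve G by Kaplan-Meier with censoring treated as the "event", then weight each observed failure by 1/G at its failure time.

```python
import numpy as np

# Hedged IPCW sketch on synthetic single-event data.
def km_survival(times, events, t_eval):
    """Kaplan-Meier estimate of P(T > t) at each point of t_eval."""
    surv = np.ones_like(t_eval, dtype=float)
    s = 1.0
    steps = []
    for u in np.unique(times[events == 1]):
        n_risk = np.sum(times >= u)
        d = np.sum((times == u) & (events == 1))
        s *= 1.0 - d / n_risk
        steps.append((u, s))
    for i, t in enumerate(t_eval):
        for u, s_u in steps:               # steps are sorted ascending
            if u <= t:
                surv[i] = s_u
    return surv

rng = np.random.default_rng(5)
fail = rng.exponential(2.0, size=400)
cens = rng.exponential(4.0, size=400)
time = np.minimum(fail, cens)
delta = (fail <= cens).astype(int)         # 1 = failure observed

# censoring survival G evaluated at each subject's own time,
# then weight observed failures by 1/G
G = km_survival(time, 1 - delta, time)
weights = np.zeros_like(time)
weights[delta == 1] = 1.0 / G[delta == 1]
```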

14.
Adjusted variable plots are useful in linear regression for outlier detection and for qualitative evaluation of the fit of a model. In this paper, we extend adjusted variable plots to Cox's proportional hazards model for possibly censored survival data. We propose three different plots: a risk level adjusted variable (RLAV) plot in which each observation in each risk set appears, a subject level adjusted variable (SLAV) plot in which each subject is represented by one point, and an event level adjusted variable (ELAV) plot in which the entire risk set at each failure event is represented by a single point. The latter two plots are derived from the RLAV by combining multiple points. In each plot, the regression coefficient and standard error from a Cox proportional hazards regression are obtained by a simple linear regression through the origin fitted to the coordinates of the plotted points. The plots are illustrated with a reanalysis of a data set of 65 patients with multiple myeloma.

15.
The linear regression model for right censored data, also known as the accelerated failure time model using the logarithm of survival time as the response variable, is a useful alternative to the Cox proportional hazards model. Empirical likelihood, as a non-parametric approach, has been demonstrated to have many desirable merits thanks to its robustness against model misspecification. However, the linear regression model with right censored data cannot directly benefit from empirical likelihood inference, mainly because of dependent elements in the estimating equations of the conventional approach. In this paper, we propose an empirical likelihood approach with a new estimating equation for linear regression with right censored data. A nested coordinate algorithm with majorization is used for solving the optimization problems with a non-differentiable objective function. We show that Wilks' theorem holds for the new empirical likelihood. We also consider the variable selection problem with empirical likelihood when the number of predictors can be large. Because the new estimating equation is non-differentiable, a quadratic approximation is applied to study the asymptotic properties of the penalized empirical likelihood. We prove the oracle properties and evaluate them with simulated data. We apply our method to a Surveillance, Epidemiology, and End Results small intestine cancer data set.

16.
Bivariate recurrent event data are observed when subjects are at risk of experiencing two different types of recurrent events. In this paper, our interest is in suggesting a statistical model for the case where a substantial portion of subjects experience no recurrent events but do have a terminal event. In the context of recurrent event data, zero events can be related either to a risk-free group or to a terminal event. To reflect both zero inflation and a terminal event simultaneously in bivariate recurrent event data, a joint model is implemented with bivariate frailty effects. Simulation studies are performed to evaluate the suggested models. Infection data from AML (acute myeloid leukemia) patients are analyzed as an application.

17.
Parametric models for interval censored data can now easily be fitted with minimal programming in certain standard statistical software packages. Regression equations can be introduced, both for the location and for the dispersion parameters. Finite mixture models can also be fitted, with a point mass on right (or left) censored observations, to allow for individuals who cannot have the event (or already have it). This mixing probability can also be allowed to follow a regression equation. Here, models based on nine different distributions are compared for three examples of heavily censored data as well as a set of simulated data. We find that, for parametric models, interval censoring can often be ignored and that the density at the centres of intervals can be used instead in the likelihood function, although the approximation is not always reliable. In the context of heavily interval censored data, the conclusions from parametric models are remarkably robust to changing distributional assumptions and generally more informative than those from the corresponding non-parametric models.
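The claim that interval censoring can often be ignored in parametric likelihoods, but not always, is easy to check numerically. The sketch below compares the exact interval-censoring likelihood term F(R) - F(L) with the density approximation f((L+R)/2) * (R - L) at the interval's centre, using a standard normal as an arbitrary stand-in for the nine fitted distributions.

```python
from scipy.stats import norm

# Exact interval-censored likelihood term versus the centre-density
# shortcut, for a standard normal stand-in distribution.
def exact_term(L, R):
    return norm.cdf(R) - norm.cdf(L)

def midpoint_term(L, R):
    return norm.pdf((L + R) / 2.0) * (R - L)

narrow = abs(exact_term(0.0, 0.2) - midpoint_term(0.0, 0.2))   # tight interval
wide = abs(exact_term(0.0, 3.0) - midpoint_term(0.0, 3.0))     # wide interval
```

For the tight interval the two terms agree to roughly 1e-4, while for the wide interval the shortcut is off by about 0.1, matching the caveat that the approximation is not always reliable.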

18.
This paper presents methods for checking the goodness-of-fit of the additive risk model with p (>2)-dimensional time-invariant covariates. The procedures extend Kim and Lee (1996), who developed a test to assess the additive risk assumption for two-sample censored data. We apply the proposed tests to survival data from South Wales nickel refinery workers. Simulation studies are carried out to investigate the performance of the proposed tests for practical sample sizes.

19.
An outcome-dependent sampling (ODS) design is a retrospective sampling scheme in which one observes the primary exposure variables with a probability that depends on the observed value of the outcome variable. When the outcome of interest is failure time, the observed data are often censored. By allowing the selection of the supplemental samples to depend on whether the event of interest occurs, and by oversampling subjects from the most informative regions, an ODS design for time-to-event data can reduce the cost of a study and improve efficiency. We review recent progress and advances in research on ODS designs with failure time data, including work on related designs such as the case-cohort design, the generalized case-cohort design, the stratified case-cohort design, the general failure-time ODS design, length-biased sampling designs and interval sampling designs.

20.
We propose a new cure rate survival model by assuming that the initial number of competing causes of the event of interest follows a Poisson distribution and the time to event has the odd log-logistic generalized half-normal distribution. This survival model admits a realistic interpretation of the biological mechanism of the event of interest. We estimate the model parameters using maximum likelihood and perform various simulation scenarios for different sample sizes. We propose diagnostics and residual analysis to verify the model assumptions. The potential of the new cure rate model is illustrated by means of a real data set.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号