Similar Articles
20 similar articles found
1.
In this paper, we consider the analysis of recurrent event data that examines the differences between two treatments. The outcomes that are considered in the analysis are the pre-randomisation event count and post-randomisation times to first and second events with associated cure fractions. We develop methods that allow pre-randomisation counts and two post-randomisation survival times to be jointly modelled under a Poisson process framework, assuming that outcomes are predicted by (unobserved) event rates. We apply these methods to data that examine the difference between immediate and deferred treatment policies in patients presenting with single seizures or early epilepsy. We find evidence to suggest that post-randomisation seizure rates change at randomisation and following a first seizure after randomisation. We also find that there are cure rates associated with the post-randomisation times to first and second seizures. The increase in power over standard survival techniques, offered by the joint models that we propose, resulted in more precise estimates of the treatment effect and the ability to detect interactions with covariate effects.

2.
In clinical trials with interim analyses planned at pre-specified event counts, one may wish to predict the times of these landmark events as a tool for logistical planning. Currently available methods use either a parametric approach based on an exponential model for survival (Bagiella and Heitjan, Statistics in Medicine 2001; 20:2055) or a non-parametric approach based on the Kaplan-Meier estimate (Ying et al., Clinical Trials 2004; 1:352). Ying et al. (2004) demonstrated the trade-off between bias and variance in these models; the exponential method is highly efficient when its assumptions hold but potentially biased when they do not, whereas the non-parametric method has minimal bias and is well calibrated under a range of survival models but typically gives wider prediction intervals and may fail to produce useful predictions early in the trial. As a potential compromise, we propose here to make predictions under a Weibull survival model. Computations are somewhat more difficult than with the simpler exponential model, but Monte Carlo studies show that predictions are robust under a broader range of assumptions. We demonstrate the method using data from a trial of immunotherapy for chronic granulomatous disease.
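The Weibull approach described above rests on fitting a Weibull model to censored interim data before any prediction can be made. A minimal sketch of that fitting step, assuming a generic right-censored Weibull maximum-likelihood fit (the function name `weibull_mle` and the choice of Nelder-Mead optimizer are illustrative, not the authors' exact procedure):

```python
import numpy as np
from scipy.optimize import minimize

def weibull_mle(times, events):
    """Fit a Weibull (shape k, scale lam) to right-censored data by maximum
    likelihood. `times`: observed times; `events`: 1 = event, 0 = censored."""
    times = np.asarray(times, float)
    events = np.asarray(events, int)

    def negloglik(params):
        log_k, log_lam = params                 # log scale keeps k, lam positive
        k, lam = np.exp(log_k), np.exp(log_lam)
        z = times / lam
        # log density for events, log survival for censored observations
        ll = events * (np.log(k / lam) + (k - 1) * np.log(z) - z**k) \
             - (1 - events) * z**k
        return -ll.sum()

    res = minimize(negloglik, x0=[0.0, np.log(times.mean())], method="Nelder-Mead")
    k_hat, lam_hat = np.exp(res.x)
    return k_hat, lam_hat
```

Once shape and scale are estimated at an interim look, the landmark time could then be predicted by simulating the unobserved remaining event times from the fitted distribution.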

3.
Recent work has shown that the presence of ties between an outcome event and the time that a binary covariate changes or jumps can lead to biased estimates of regression coefficients in the Cox proportional hazards model. One proposed solution is the Equally Weighted method. The coefficient estimate of the Equally Weighted method is defined to be the average of the coefficient estimates of the Jump Before Event method and the Jump After Event method, where these two methods assume that the jump always occurs before or after the event time, respectively. In previous work, the bootstrap method was used to estimate the standard error of the Equally Weighted coefficient estimate. However, the bootstrap approach was computationally intensive and resulted in overestimation. In this article, two new methods for the estimation of the Equally Weighted standard error are proposed. Three alternative methods for estimating both the regression coefficient and the corresponding standard error are also proposed. All the proposed methods are easy to implement. The five methods are investigated using a simulation study and are illustrated using two real datasets.

4.
In biomedical and public health research, both repeated measures of biomarkers Y as well as times T to key clinical events are often collected for a subject. The scientific question is how the distribution of the responses [T, Y | X] changes with covariates X. [T | X] may be the focus of the estimation, where Y can be used as a surrogate for T. Alternatively, T may be the time to drop-out in a study in which [Y | X] is the target for estimation. Also, the focus of a study might be on the effects of covariates X on both T and Y, or on some underlying latent variable which is thought to be manifested in the observable outcomes. In this paper, we present a general model for the joint analysis of [T, Y | X] and apply the model to estimate [T | X] and other related functionals by using the relevant information in both T and Y. We adopt a latent variable formulation like that of Fawcett and Thomas and use it to estimate several quantities of clinical relevance to determine the efficacy of a treatment in a clinical trial setting. We use a Markov chain Monte Carlo algorithm to estimate the model's parameters. We illustrate the methodology with an analysis of data from a clinical trial comparing risperidone with a placebo for the treatment of schizophrenia.

5.
In this article, an additive rate model is proposed for clustered recurrent event data with a terminal event. Subjects are clustered by some shared characteristic, and for each subject the recurrent event process is terminated by death. An estimating equation is developed for the model parameter and the baseline rate function, and the asymptotic properties of the resulting estimators are established. In addition, a goodness-of-fit test is presented to assess the adequacy of the model. The finite-sample behavior of the proposed estimators is evaluated through simulation studies, and the methods are illustrated with an application to bladder cancer data.

6.
Cost assessment is an essential part of the economic evaluation of medical interventions. In many studies, costs as well as survival times are censored. Standard survival analysis techniques are often invalid for censored costs because of the induced dependent censoring problem. Given the high skewness of many cost distributions, it is desirable to estimate median costs, which can be obtained from an estimated survival function of costs. We propose a kernel-based survival estimator for costs that is monotone, consistent, and more efficient than several existing estimators. We conduct numerical studies to examine the finite-sample performance of the proposed estimator.

7.
In many longitudinal studies, there may exist informative observation times and a dependent terminal event that stops the follow-up. In this paper, we propose a joint model for analysis of longitudinal data with informative observation times and a dependent terminal event via two latent variables. Estimation procedures are developed for parameter estimation, and asymptotic properties of the proposed estimators are derived. Simulation studies demonstrate that the proposed method performs well for practical settings. An application to a bladder cancer study is illustrated.

8.
Bivariate recurrent event data are observed when subjects are at risk of experiencing two different types of recurrent events. In this paper, our interest is in modelling data in which a substantial portion of subjects experience no recurrent events but do experience a terminal event. In recurrent event data, zero events can be attributable either to a risk-free group or to a terminal event. To reflect both zero inflation and a terminal event simultaneously in bivariate recurrent event data, a joint model is implemented with bivariate frailty effects. Simulation studies are performed to evaluate the suggested models. Infection data from AML (acute myeloid leukemia) patients are analyzed as an application.

9.
Summary.  A common objective in longitudinal studies is the joint modelling of a longitudinal response with a time-to-event outcome. Random effects are typically used in the joint modelling framework to explain the interrelationships between these two processes. However, estimation in the presence of random effects involves intractable integrals requiring numerical integration. We propose a new computational approach for fitting such models, based on the Laplace method for integrals, that makes the consideration of high-dimensional random-effects structures feasible. Contrary to the standard Laplace approximation, our method requires far fewer repeated measurements per individual to produce reliable results.

10.
11.
12.
Recurrent event data arise in many biomedical and engineering studies in which failure events can occur repeatedly over time for each study subject. In this article, we are interested in nonparametric estimation of the hazard function for the gap time. A penalized likelihood model is proposed to estimate the hazard as a function of both gap time and covariate. A method for smoothing parameter selection is developed from subject-wise cross-validation. Confidence intervals for the hazard function are derived using the Bayes model of the penalized likelihood. An eigenvalue analysis establishes the asymptotic convergence rates of the relevant estimates. Empirical studies are performed to evaluate various aspects of the method. The proposed technique is demonstrated through an application to the well-known bladder tumor data.

13.
Simultaneous confidence bands provide a useful adjunct to the popular Kaplan–Meier product limit estimator for a survival function, particularly when results are displayed graphically. They allow an assessment of the magnitude of sampling errors and provide a graphical view of a formal goodness-of-fit test. In this paper we evaluate a modified version of Nair's (1981) simultaneous confidence bands. The modification is based on a logistic transformation of the Kaplan–Meier estimator. We show that the modified bands have some important practical advantages.
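The logistic transformation in question can be illustrated with pointwise intervals: compute Greenwood's variance, move to the logit scale, and back-transform so the limits stay inside (0, 1). A sketch under those standard assumptions (for simultaneous bands in the spirit of Nair (1981), the constant z would be replaced by a band coefficient from tables; `km_logit_ci` is an illustrative name, not from the paper):

```python
import numpy as np

def km_logit_ci(times, events, z=1.96):
    """Kaplan-Meier estimate with pointwise confidence intervals computed on
    the logit scale and back-transformed, keeping the limits inside (0, 1)."""
    t = np.asarray(times, float)
    d = np.asarray(events, int)
    surv, var_sum, out = 1.0, 0.0, []
    for u in np.unique(t[d == 1]):              # distinct event times
        at_risk = np.sum(t >= u)
        deaths = np.sum((t == u) & (d == 1))
        surv *= 1 - deaths / at_risk
        if surv == 0:                           # logit undefined once S-hat hits 0
            out.append((u, 0.0, 0.0, 0.0))
            break
        var_sum += deaths / (at_risk * (at_risk - deaths))   # Greenwood sum
        # delta method: se of logit(S) = sqrt(S^2 * var_sum) / (S * (1 - S))
        half = z * np.sqrt(var_sum) / (1 - surv)
        logit = np.log(surv / (1 - surv))
        lo = 1 / (1 + np.exp(-(logit - half)))
        hi = 1 / (1 + np.exp(-(logit + half)))
        out.append((u, surv, lo, hi))
    return out
```

The practical advantage the abstract alludes to is visible here: the back-transformed limits can never escape [0, 1], unlike bands built on the untransformed estimator.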

14.
Medical costs in an ageing society increase substantially when the incidences of chronic disease, disability and inability to live independently are high. Healthy lifestyles benefit not only elderly individuals but also the entire community. When assessing treatment efficacy, survival and quality of life should be considered simultaneously. This paper proposes a joint likelihood approach for modelling survival and longitudinal binary covariates simultaneously. Because the model involves unobservable information, a Monte Carlo EM algorithm and the Metropolis-Hastings algorithm are used to obtain the estimators. Monte Carlo simulations are performed to evaluate the performance of the proposed model in terms of the accuracy and precision of the estimates. Real data are used to demonstrate the feasibility of the proposed model.

15.
In clinical trials, it may be of interest to take into account physical and emotional well-being in addition to survival when comparing treatments. Quality-adjusted survival time has the advantage of incorporating information about both survival time and quality of life. In this paper, we discuss estimation of the expected value of quality-adjusted survival, based on multistate models for the sojourn times in health states. Semiparametric and parametric (exponential) approaches are considered. A simulation study is presented to evaluate the performance of the proposed estimator, and the jackknife resampling method is used to compute the bias and variance of the estimator.
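At its core, quality-adjusted survival weights the sojourn time in each health state by that state's utility and sums. A toy sketch of that arithmetic (the state names and utility values below are invented for illustration, not taken from the paper):

```python
# Quality-adjusted survival for one subject: weight the sojourn time spent in
# each health state by that state's utility coefficient, then sum.
def quality_adjusted_survival(sojourns, utilities):
    """sojourns: {state: time spent}; utilities: {state: quality weight in [0, 1]}."""
    return sum(utilities[state] * time for state, time in sojourns.items())

# Hypothetical subject: 2 years in good health (utility 1.0), 1 year with
# treatment toxicity (0.5), half a year in relapse (0.3).
qas = quality_adjusted_survival({"TWiST": 2.0, "TOX": 1.0, "REL": 0.5},
                                {"TWiST": 1.0, "TOX": 0.5, "REL": 0.3})
```

The estimation problem the abstract addresses is then the expected value of this quantity under censoring, via multistate models for the sojourn times.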

16.
Lee et al. (2016) proposed a nonparametric estimator of the joint distribution of the gap time between transplant and the first infection and the subsequent gap times between consecutive infections. In this article, we propose an alternative estimator based on the inverse-probability-weighted (IPW) approach. Asymptotic properties of the proposed estimator are established. Simulation results indicate that the IPW estimator performs as well as the estimator proposed by Lee et al. We also propose an IPW estimator of the joint distribution function of the gap times between consecutive recurrent events beyond the first episode.
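The IPW device can be sketched for the simplest case, the marginal distribution of a single censored gap time: each observed event is reweighted by the inverse probability of remaining uncensored up to its event time, estimated by a Kaplan-Meier fit to the censoring distribution. This is the generic inverse-probability-of-censoring weighting idea, not the exact joint-distribution estimator of the article, and the helper names are illustrative:

```python
import numpy as np

def G_minus(u, t, d):
    """Kaplan-Meier estimate of the censoring survival function just before u
    (the roles of events and censorings are swapped)."""
    surv = 1.0
    for s in np.unique(t[(d == 0) & (t < u)]):
        at_risk = np.sum(t >= s)
        surv *= 1 - np.sum((t == s) & (d == 0)) / at_risk
    return surv

def ipw_cdf(times, events, grid):
    """IPCW estimate of F(x) = P(T <= x): observed events get weight
    1 / G_hat(t-); censored observations get weight zero."""
    t = np.asarray(times, float)
    d = np.asarray(events, int)
    w = np.array([d[i] / G_minus(t[i], t, d) for i in range(len(t))])
    return np.array([np.sum(w[t <= x]) / len(t) for x in grid])
```

With no censoring the weights are all one and the estimator reduces to the empirical CDF, which is a convenient sanity check.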

17.
The development of models and methods for cure rate estimation has recently burgeoned into an important subfield of survival analysis. Much of the literature focuses on the standard mixture model. Recently, process-based models have been suggested. We focus on several models based on first passage times for Wiener processes. Whitmore and others have studied these models in a variety of contexts. Lee and Whitmore (Stat Sci 21(4):501–513, 2006) give a comprehensive review of a variety of first hitting time models and briefly discuss their potential as cure rate models. In this paper, we study the Wiener process with negative drift as a possible cure rate model, but the resulting defective inverse Gaussian model is found to provide a poor fit in some cases. Several possible modifications are then suggested, which improve on the defective inverse Gaussian model. These modifications include: the inverse Gaussian cure rate mixture model; a mixture of two inverse Gaussian models; incorporation of heterogeneity in the drift parameter; and the addition of a second absorbing barrier to the Wiener process, representing an immunity threshold. This class of process-based models is a useful alternative to the standard model and provides an improved fit for many of the datasets that we have studied. Implementation of this class of models is facilitated using expectation-maximization (EM) algorithms and variants thereof, including the gradient EM algorithm. Parameter estimates for each of these EM algorithms are given, and the proposed models are applied to both real and simulated data, where they perform well.

18.
Recurrent event data are often encountered in longitudinal follow-up studies in many important areas, such as biomedical science, econometrics, reliability, criminology and demography. Multiplicative marginal rates models have been used extensively to analyze recurrent event data, but often fail to fit the data adequately. In addition, the analysis is complicated by excess zeros in the data as well as by the presence of a terminal event that precludes further recurrence. To address these problems, we propose a semiparametric model with an additive rate function and an unspecified baseline to analyze recurrent event data, which includes a parameter to accommodate excess zeros and a frailty term to account for a terminal event. A local likelihood procedure is applied to estimate the parameters, and the asymptotic properties of the estimators are established. A simulation study is conducted to evaluate the performance of the proposed methods, which are illustrated with an application to a set of bladder tumor recurrence data.

19.
We develop a sample size methodology that achieves specified Type-1 and Type-2 error rates when comparing the survivor functions of multiple treatment groups with a control group. The designs control the family-wise Type-1 error rate. We assume that the family of Weibull distributions adequately describes the underlying survivor functions, and we separately consider three of the most common study scenarios: (a) complete samples; (b) Type-1 censoring with a common censoring time; and (c) Type-1 censoring with an accrual period. A mouse longevity study comparing the effect on survival of multiple low-calorie diets motivates our work on this problem.

20.
Panel count data often occur in a long-term study where the primary end point is the time to a specific event and each subject may experience multiple recurrences of this event. Furthermore, suppose that it is not feasible to keep subjects under observation continuously and the numbers of recurrences for each subject are only recorded at several distinct time points over the study period. Moreover, the set of observation times may vary from subject to subject. In this paper, regression methods, which are derived under simple semiparametric models, are proposed for the analysis of such longitudinal count data. In particular, we consider the situation when both observation and censoring times may depend on covariates. The new procedures are illustrated with data from a well-known cancer study.
