Similar Literature
1.
Finite mixture models are commonly used to analyze heterogeneous longitudinal data. By relaxing the homogeneity restriction of nonlinear mixed-effects (NLME) models, finite mixture models can not only estimate model parameters but also cluster individuals into one of several pre-specified classes with class membership probabilities. This clustering may be clinically meaningful and may be associated with a clinically important binary outcome. This article develops, under a Bayesian framework, a joint model that combines a finite mixture of NLME models for longitudinal data with covariate measurement errors and a logistic regression for a binary outcome, the two components being linked by individual latent class indicators. Simulation studies assess the performance of the proposed joint model against a naive two-step model in which the finite mixture model and the logistic regression are fitted separately. The methods are then applied to a real data set from an AIDS clinical trial, in which viral dynamics and the dichotomized time to the first decline of the CD4/CD8 ratio are analyzed jointly.
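To fix ideas, the linked structure described above can be sketched as follows (the notation and the specific forms are illustrative, not the authors' exact specification):

```latex
\begin{aligned}
\Pr(c_i = k) &= \pi_k, \qquad k = 1,\dots,K \quad \text{(latent class indicator)},\\
y_{ij} \mid c_i = k &= f\!\left(t_{ij};\, \beta_k, b_i\right) + e_{ij}, \qquad b_i \sim N(0, D), \quad e_{ij} \sim N(0, \sigma^2),\\
\operatorname{logit} \Pr(z_i = 1 \mid c_i = k) &= \alpha_k .
\end{aligned}
```

The same latent class indicator c_i drives both the class-specific NLME trajectory f and the probability of the binary outcome z_i, which is what distinguishes the joint model from the naive two-step fit.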

2.
We compare the commonly used two-step methods and the joint likelihood method for joint models of longitudinal and survival data via extensive simulations. The longitudinal models include LME, GLMM, and NLME models, and the survival models include Cox models and AFT models. We find that the full likelihood method outperforms the two-step methods for various joint models, but it can be computationally challenging when the dimension of the random effects in the longitudinal model is not small. We therefore propose an approximate joint likelihood method that is computationally efficient. We find that the proposed approximation method performs well in the joint model context, and that it performs better for more “continuous” longitudinal data. Finally, a real AIDS data example shows that patients with a higher initial viral load or a lower initial CD4 count are more likely to drop out earlier during anti-HIV treatment.
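A minimal sketch of the naive two-step strategy being compared is given below (column names such as id, time, marker, surv_time, and event are hypothetical; statsmodels and lifelines are assumed to be available):

```python
# Two-step sketch: (1) fit a linear mixed-effects model to the longitudinal
# marker, (2) carry each subject's estimated random effects into a Cox model.
# Column names (id, time, marker, surv_time, event) are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf
from lifelines import CoxPHFitter

def naive_two_step(long_df: pd.DataFrame, surv_df: pd.DataFrame) -> CoxPHFitter:
    # Step 1: LME with a random intercept and slope per subject.
    lme = smf.mixedlm("marker ~ time", long_df,
                      groups=long_df["id"], re_formula="~time").fit()
    # Subject-specific random effects (intercept and slope), one row per id.
    re = pd.DataFrame(lme.random_effects).T
    re.columns = ["re_intercept", "re_slope"]
    # Step 2: plug the estimated random effects into a Cox model.
    df = surv_df.set_index("id").join(re).reset_index()
    cph = CoxPHFitter()
    cph.fit(df[["surv_time", "event", "re_intercept", "re_slope"]],
            duration_col="surv_time", event_col="event")
    return cph
```

Plugging estimated random effects into the survival model ignores their estimation uncertainty, which is one reason the full (or approximate) joint likelihood, maximizing both sub-models together, can outperform two-step methods.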

3.
Over the last two decades or more, a large number of clinical applications and methodological developments have appeared in the area of joint models for longitudinal and time-to-event outcomes. In these studies, patients are followed until an event, such as death, occurs. In most of this work, the dependence between the two processes is established through subject-specific random effects used as frailties. In this article, we propose a new joint model that consists of a linear mixed-effects model for the longitudinal data and an accelerated failure time model for the time-to-event data. These two sub-models are linked via a latent random process, which captures the dependence of the time-to-event outcome on the longitudinal measurements more directly. A Bayesian estimation method with standard priors is developed, and all computations are implemented in OpenBUGS. The proposed method is evaluated by a simulation study, which compares the conditional model with a joint model with local independence by way of calibration. Data on Duchenne muscular dystrophy (DMD) and a data set on AIDS patients are analysed.
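One way to write down such a linkage between a linear mixed-effects sub-model and an AFT sub-model is sketched below (illustrative notation; the authors' exact parameterization may differ):

```latex
\begin{aligned}
y_{ij} &= x_{ij}^{\top}\beta + W_i(t_{ij}) + e_{ij}, & e_{ij} &\sim N(0,\sigma_e^{2}),\\
\log T_i &= z_i^{\top}\gamma + \alpha\, W_i + \varepsilon_i, & \varepsilon_i &\sim N(0,\sigma_\varepsilon^{2}),
\end{aligned}
```

where W_i is a subject-level latent random process (for example, a random intercept and slope) shared by both sub-models and the coefficient alpha quantifies the strength of the association; setting alpha to zero would give a locally independent model of the kind used for comparison.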

4.
We propose a nonlinear mixed-effects framework to jointly model longitudinal and repeated time-to-event data. A parametric nonlinear mixed-effects model is used for the longitudinal observations and a parametric mixed-effects hazard model for the repeated event times. We show that, for parameter estimation, it is important to properly calculate the conditional density of the observations (given the individual parameters) in the presence of interval and/or right censoring. Parameters are estimated by maximizing the exact joint likelihood with the stochastic approximation expectation–maximization algorithm. This workflow for joint models is now implemented in the Monolix software and is illustrated here on five simulated and two real datasets.

5.
In many medical studies, patients are followed longitudinally and interest lies in assessing the relationship between longitudinal measurements and the time to an event. Recently, various authors have proposed joint modeling approaches for longitudinal and time-to-event data with a single longitudinal variable. These joint modeling approaches become intractable with even a few longitudinal variables. In this paper, we propose a regression calibration approach for jointly modeling multiple longitudinal measurements and discrete time-to-event data. Ideally, a two-stage modeling approach could be applied, in which the multiple longitudinal measurements are modeled in the first stage and the longitudinal model is related to the time-to-event data in the second stage. Biased parameter estimation due to informative dropout makes this direct two-stage approach problematic. We propose a regression calibration approach that appropriately accounts for informative dropout. We approximate the conditional distribution of the multiple longitudinal measurements given the event time by modeling all pairwise combinations of the longitudinal measurements with a bivariate linear mixed model that conditions on the event time. Complete data are then simulated based on estimates from these pairwise conditional models, and regression calibration is used to estimate the relationship between the longitudinal data and the time-to-event data using the completed data. We show that this approach performs well in estimating the relationship between multivariate longitudinal measurements and the time-to-event data and in estimating the parameters of the multivariate longitudinal process subject to informative dropout. We illustrate this methodology with simulations and with an analysis of primary biliary cirrhosis (PBC) data.

6.
In this paper, we investigate Bayesian generalized nonlinear mixed-effects (NLME) regression models for zero-inflated longitudinal count data. The methodology is motivated by and applied to colony-forming unit (CFU) counts in extended bactericidal activity tuberculosis (TB) trials. Furthermore, for model comparison, we present a generalized method for calculating the marginal likelihoods required to determine Bayes factors. A simulation study shows that the proposed zero-inflated negative binomial regression model has good accuracy, precision, and credibility interval coverage. In contrast, conventional normal NLME regression models applied to log-transformed count data, which handle zero counts as left-censored values, may yield credibility intervals that undercover the true bactericidal activity of anti-TB drugs. We therefore recommend that zero-inflated NLME regression models be fitted to CFU counts on the original scale, as an alternative to conventional normal NLME regression models on the logarithmic scale.
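For reference, a zero-inflated negative binomial log-likelihood of the kind underlying such a model can be evaluated as below (the mean/dispersion parameterization and the constant mean mu are illustrative; in the NLME setting mu would depend on time and subject-specific random effects):

```python
# Zero-inflated negative binomial (ZINB) log-likelihood for one vector of counts.
# mu: NB mean, k: NB dispersion (size), pi: zero-inflation probability.
import numpy as np
from scipy.stats import nbinom

def zinb_loglik(y, mu, k, pi):
    y = np.asarray(y)
    p = k / (k + mu)                      # scipy's NB success probability
    nb_logpmf = nbinom.logpmf(y, k, p)
    ll = np.where(
        y == 0,
        # zeros can come from the inflation component or from the NB itself
        np.log(pi + (1.0 - pi) * np.exp(nbinom.logpmf(0, k, p))),
        np.log(1.0 - pi) + nb_logpmf,
    )
    return ll.sum()

# Example: counts with many structural zeros.
print(zinb_loglik([0, 0, 3, 12, 0, 7], mu=5.0, k=2.0, pi=0.3))
```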

7.
Joint models for longitudinal and time-to-event data have recently received considerable attention in clinical and epidemiologic studies. Our interest is in modeling the relationship between event time outcomes and internal time-dependent covariates. In practice, the longitudinal responses often show nonlinear and fluctuating curves. The main aim of this paper is therefore to use penalized splines with a truncated polynomial basis to parameterize the nonlinear longitudinal process. A linear mixed-effects model is then applied to the subject-specific curves and used to control the smoothing. The association between the dropout process and the longitudinal outcomes is modeled through a proportional hazards model. Two types of baseline risk functions are considered, namely a Gompertz distribution and a piecewise constant model. The resulting models are referred to as penalized spline joint models, an extension of the standard joint models. The expectation conditional maximization (ECM) algorithm is applied to estimate the parameters of the proposed models. To validate the proposed algorithm, extensive simulation studies were conducted, followed by a case study. In summary, the penalized spline joint models provide a new approach that improves on the existing standard joint models.
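The truncated polynomial basis mentioned above can be constructed as follows (the degree and knot placement are illustrative choices; in the mixed-model representation of penalized splines, the truncated-power coefficients are treated as random effects, which is what controls the smoothing):

```python
# Truncated polynomial basis for penalized splines:
# [1, t, ..., t^p, (t - k_1)_+^p, ..., (t - k_K)_+^p].
import numpy as np

def truncated_poly_basis(t, knots, degree=2):
    t = np.asarray(t, dtype=float)
    poly = np.column_stack([t**d for d in range(degree + 1)])
    trunc = np.column_stack([np.clip(t - k, 0.0, None)**degree for k in knots])
    return np.hstack([poly, trunc])

t = np.linspace(0.0, 10.0, 50)
knots = np.quantile(t, [0.25, 0.5, 0.75])   # illustrative interior knots
X = truncated_poly_basis(t, knots)
print(X.shape)   # (50, degree + 1 + number of knots)
```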

8.
In many clinical studies, longitudinal biomarkers are used to monitor the progression of a disease. For example, in a kidney transplant study, the glomerular filtration rate (GFR) is used as a longitudinal biomarker to monitor the progression of kidney function, while the patient's survival state is characterized by multiple time-to-event outcomes, such as kidney transplant failure and death. It is known that the joint modelling of longitudinal and survival data leads to a more accurate and comprehensive estimation of the covariates' effects. While most joint models use the longitudinal outcome as a covariate for predicting survival, very few consider a further decomposition of the variation within the longitudinal trajectories and its effect on survival. We develop a joint model that uses functional principal component analysis (FPCA) to extract useful features from the longitudinal trajectories and adopts a competing risks model to handle the multiple time-to-event outcomes. The longitudinal trajectories and the multiple time-to-event outcomes are linked via the shared functional features. The application of our model to a real kidney transplant data set reveals the significance of these functional features, and a simulation study is carried out to validate the accuracy of the estimation method.
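A bare-bones sketch of FPCA for trajectories observed on a common, dense time grid is given below (real GFR trajectories are typically sparse and irregular and would need a smoothing-based FPCA; this only illustrates how subject-level functional features are extracted):

```python
# FPCA via eigendecomposition of the sample covariance of trajectories on a
# shared grid. The FPC scores can then enter a survival (e.g. competing risks)
# model as subject-level features.
import numpy as np

def fpca(Y, n_components=2):
    """Y: (n_subjects, n_timepoints) matrix of trajectories on a shared grid."""
    mean_fn = Y.mean(axis=0)
    Yc = Y - mean_fn
    cov = Yc.T @ Yc / (Y.shape[0] - 1)
    eigval, eigvec = np.linalg.eigh(cov)
    order = np.argsort(eigval)[::-1][:n_components]
    phi = eigvec[:, order]             # eigenfunctions (columns)
    scores = Yc @ phi                  # FPC scores, one row per subject
    return mean_fn, phi, eigval[order], scores

rng = np.random.default_rng(0)
grid = np.linspace(0, 1, 20)
Y = 1.0 + rng.normal(size=(100, 1)) * np.sin(np.pi * grid) \
        + rng.normal(scale=0.1, size=(100, 20))
mean_fn, phi, lam, scores = fpca(Y)
print(scores.shape)   # (100, 2)
```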

9.
Joint models are statistical tools for estimating the association between time-to-event and longitudinal outcomes. One challenge in the application of joint models is their computational complexity. Common estimation methods for joint models include two-stage, Bayesian, and maximum-likelihood methods. In this work, we consider joint models of a time-to-event outcome and multiple longitudinal processes and develop a maximum-likelihood estimation method based on the expectation–maximization algorithm. We assess the performance of the proposed method via simulations and apply the methodology to a data set to determine the association between longitudinal systolic and diastolic blood pressure measurements and time to coronary artery disease.

10.
We implement a joint model for mixed multivariate longitudinal measurements, applied to the prediction of time until lung transplant or death in idiopathic pulmonary fibrosis. Specifically, we formulate a unified Bayesian joint model for the mixed longitudinal responses and time-to-event outcomes. For the longitudinal model of continuous and binary responses, we investigate multivariate generalized linear mixed models with shared random effects. Longitudinal and time-to-event data are assumed to be independent conditional on the available covariates and the shared parameters. A Markov chain Monte Carlo algorithm, implemented in OpenBUGS, is used for parameter estimation. To illustrate practical considerations in choosing a final model, we fit 37 candidate models using all possible combinations of random effects and employ the deviance information criterion to select the best-fitting model. We demonstrate the prediction of future event probabilities within a fixed time interval for individual patients, utilizing baseline data, post-baseline longitudinal responses, and the time-to-event outcome. The performance of our joint model is also evaluated in simulation studies.

11.
In clinical practice, the profile of each subject's CD4 response in a longitudinal study may follow a ‘broken-stick’-like trajectory, indicating multiple phases of increase and/or decline in response. Such phases (changepoints) may be important indicators that help quantify treatment effects and improve the management of patient care. Although it is common practice in the literature to analyze complex AIDS longitudinal data using nonlinear mixed-effects (NLME) or nonparametric mixed-effects (NPME) models, estimating a changepoint within NLME or NPME models is challenging because of their complicated formulations. In this paper, we propose a changepoint mixed-effects model with random subject-specific parameters, including the changepoint, for the analysis of longitudinal CD4 cell counts for HIV-infected subjects following highly active antiretroviral treatment. The longitudinal CD4 data in this study may exhibit departures from symmetry, may contain missing observations for various reasons (likely to be non-ignorable in the sense that missingness may be related to the missing values), and may be censored when a subject goes off study treatment, a potentially informative dropout mechanism. Inference becomes dramatically more complicated when longitudinal CD4 data with asymmetry (skewness), incompleteness, and informative dropout are observed in conjunction with an unknown changepoint. Our objective is to address the simultaneous impact of skewness, missingness, and informative censoring by jointly modeling the CD4 response and dropout time processes under a Bayesian framework. The method is illustrated using a real AIDS data set to compare potential models under various scenarios, and some interesting results are presented.
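A minimal illustration of the broken-stick (changepoint) mean structure that such a model builds on is shown below, with a single changepoint and a two-phase linear form (the function and the parameter values are illustrative only; in the proposed model all parameters, including the changepoint, carry subject-specific random effects):

```python
# Broken-stick mean function: slope beta1 before the changepoint tau,
# slope beta1 + beta2 afterwards.
import numpy as np

def broken_stick(t, beta0, beta1, beta2, tau):
    t = np.asarray(t, dtype=float)
    return beta0 + beta1 * t + beta2 * np.clip(t - tau, 0.0, None)

t = np.linspace(0, 48, 25)                      # weeks on treatment (illustrative)
cd4_mean = broken_stick(t, beta0=250, beta1=8.0, beta2=-6.5, tau=24.0)
print(cd4_mean[:5])
```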

12.
Joint models for longitudinal and time-to-event data have been applied in many different fields of statistics and clinical studies. However, the main difficulty these models face is computational: the burden of numerical integration becomes severe as the dimension of the random effects increases. In this paper, a modified two-stage approach is proposed to estimate the parameters in joint models. In the first stage, linear mixed-effects models and best linear unbiased predictors are used to estimate the parameters of the longitudinal submodel. In the second stage, an approximation of the full joint log-likelihood is constructed using the estimated values of these parameters from the longitudinal submodel, and the survival parameters are estimated by maximizing this approximation. Simulation studies show that the approach performs well, especially when the dimension of the random effects increases. Finally, we apply this approach to AIDS data.

13.
In a joint analysis of longitudinal quality of life (QoL) scores and relapse-free survival (RFS) times from a clinical trial on early breast cancer conducted by the Canadian Cancer Trials Group, we observed a complicated trajectory of QoL scores and the existence of long-term survivors. Motivated by this observation, we propose in this paper a flexible joint model for the longitudinal measurements and survival times. A partly linear mixed-effects model, approximated by B-splines, is used to capture the complicated but smooth trajectory of the longitudinal measurements, and a semiparametric mixture cure model with a B-spline baseline hazard is used to model the survival times with a cure fraction. The two models are linked by shared random effects to capture the dependence between the longitudinal measurements and the survival times. A semiparametric inference procedure with an EM algorithm is proposed to estimate the parameters of the joint model. The performance of the proposed procedures is evaluated by simulation studies and through an application to the data from the clinical trial that motivated this research.
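The mixture cure component referred to above has the following familiar form (illustrative notation; in the joint model the linear predictors would also involve the shared random effects):

```latex
S(t \mid x, z) = \pi(z) + \bigl(1 - \pi(z)\bigr)\, S_u(t \mid x),
\qquad
\operatorname{logit}\,\pi(z) = z^{\top}\gamma,
\qquad
S_u(t \mid x) = \exp\!\Bigl(-\int_0^{t} h_0(s)\, e^{x^{\top}\beta}\, \mathrm{d}s\Bigr),
```

where pi(z) is the probability of being cured (a long-term survivor) and h_0 is the baseline hazard for the uncured, modeled here with B-splines.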

14.
Motivated by the joint analysis of longitudinal quality of life data and recurrence-free survival times from a cancer clinical trial, we present in this paper two approaches for jointly modeling longitudinal proportional measurements, which are confined to a finite interval, and survival data. Both approaches assume a proportional hazards model for the survival times. For the longitudinal component, the first approach applies the classical linear mixed model to logit-transformed responses, while the second approach models the responses directly using a simplex distribution. A semiparametric method based on a penalized joint likelihood generated by the Laplace approximation is derived to fit the joint model defined by the second approach. The proposed procedures are evaluated in a simulation study and applied to the analysis of the breast cancer data that motivated this research.

15.
We propose a flexible functional approach for modelling generalized longitudinal data and survival time using principal components. In the proposed model, the longitudinal observations can be continuous or categorical, such as Gaussian, binomial, or Poisson outcomes. We generalize traditional joint models, which treat discrete outcomes such as CD4 counts as continuous data by applying transformations. The proposed model is data-adaptive: it does not require pre-specified functional forms for the longitudinal trajectories and automatically detects characteristic patterns. The longitudinal trajectories, observed with measurement or random error, are represented by flexible basis functions through a possibly nonlinear link function, combined with the dimension reduction resulting from functional principal component (FPC) analysis. The relationship between the longitudinal process and the event history is assessed using a Cox regression model. Although the proposed model inherits the flexibility of nonparametric methods, the estimation procedure, based on the EM algorithm, is still parametric in computation and thus simple and easy to implement. The computation is further simplified by the dimension reduction for the random coefficients, or FPC scores. An iterative selection procedure based on the Akaike information criterion (AIC) is proposed to choose the tuning parameters, such as the knots of the spline basis and the number of FPCs, so that an appropriate degree of smoothness and fluctuation can be achieved. The effectiveness of the proposed approach is illustrated through a simulation study, followed by an application to longitudinal CD4 counts and survival data collected in a recent clinical trial comparing the efficacy and safety of two antiretroviral drugs.

16.
We propose a semiparametric approach based on proportional hazards and copula methods to jointly model longitudinal outcomes and the time to an event. The dependence of the longitudinal outcomes on the covariates is modeled by a copula-based time series, which allows non-Gaussian random effects and overcomes the limitations of the parametric assumptions in existing linear and nonlinear random-effects models. A modified partial likelihood method using estimated covariates at failure times is employed to draw statistical inference. The proposed model and method are applied to analyze a set of progression-to-AIDS data from a study of the association between human immunodeficiency virus viral dynamics and the time trend in the CD4/CD8 ratio in the presence of measurement errors. Simulations are also reported to evaluate the proposed model and method.
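As a toy illustration of the copula-based time-series idea, the sketch below generates serial dependence through a Gaussian copula with AR(1) correlation while leaving the marginal distribution arbitrary (the copula family, marginal, and parameter values are assumptions for illustration, not the paper's specification):

```python
# Serial dependence from a Gaussian copula with AR(1) correlation; the marginal
# (here gamma, i.e. non-Gaussian) is free. Only the dependence is "copula".
import numpy as np
from scipy.stats import norm, gamma

def gaussian_copula_ar1(n_times, rho, marginal, rng):
    z = np.empty(n_times)
    z[0] = rng.standard_normal()
    for t in range(1, n_times):
        z[t] = rho * z[t - 1] + np.sqrt(1 - rho**2) * rng.standard_normal()
    u = norm.cdf(z)                  # uniform margins with AR(1) copula dependence
    return marginal.ppf(u)           # transform to the desired marginal

rng = np.random.default_rng(1)
series = gaussian_copula_ar1(10, rho=0.7, marginal=gamma(a=2.0, scale=1.5), rng=rng)
print(series.round(2))
```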

17.
We present a new class of models for fitting longitudinal data, obtained through a suitable modification of the classical linear mixed-effects model. For each sample unit, the joint distribution of the random effects and the random errors is a finite mixture of scale mixtures of multivariate skew-normal distributions. This extension allows us to model the data more flexibly, taking into account skewness, multimodality, and discrepant observations at the same time. The scale mixtures of skew-normal distributions form an attractive class of asymmetric heavy-tailed distributions that includes the skew-normal, skew-Student-t, skew-slash, and skew-contaminated normal distributions as special cases, providing a flexible alternative to the corresponding symmetric distributions in this type of model. A simple and efficient Gibbs-type MCMC algorithm for Bayesian posterior inference is employed. To illustrate the usefulness of the proposed methodology, two artificial and two real data sets are analyzed.
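To give a concrete sense of the simplest building block, the density of a two-component mixture of univariate skew-normal distributions can be evaluated as follows (univariate only, with arbitrary parameter values; the paper's models are multivariate and additionally mix over a scale factor to obtain heavy tails):

```python
# Density of a two-component mixture of univariate skew-normal distributions.
import numpy as np
from scipy.stats import skewnorm

def skewnormal_mixture_pdf(y, weights, shapes, locs, scales):
    y = np.asarray(y, dtype=float)
    dens = np.zeros_like(y)
    for w, a, loc, scale in zip(weights, shapes, locs, scales):
        dens += w * skewnorm.pdf(y, a, loc=loc, scale=scale)
    return dens

y = np.linspace(-5, 10, 7)
pdf = skewnormal_mixture_pdf(y, weights=[0.6, 0.4], shapes=[4.0, -2.0],
                             locs=[0.0, 5.0], scales=[1.0, 1.5])
print(pdf.round(4))
```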

18.
Mixtures of linear regression models provide a popular treatment for modeling nonlinear regression relationships. The traditional estimation of mixtures of regression models is based on a Gaussian error assumption, which is well known to be sensitive to outliers and extreme values. To overcome this issue, a new class of finite mixtures of quantile regressions (FMQR) is proposed in this article. Compared with existing Gaussian mixture regression models, the proposed FMQR model provides a complete specification of the conditional distribution of the response variable within each component. From the likelihood point of view, the FMQR model is equivalent to a finite mixture of regression models with errors following the asymmetric Laplace distribution (ALD), which can be regarded as an extension of the traditional mixture of regression models with normal error terms. An EM algorithm is proposed to obtain the parameter estimates of the FMQR model by exploiting a hierarchical representation of the ALD. Finally, the iteratively weighted least squares estimate for each mixture component of the FMQR model is derived. Simulation studies are conducted to illustrate the finite-sample performance of the estimation procedure, and an analysis of an aphid data set is used to illustrate our methodology.
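For reference, the asymmetric Laplace density underlying this likelihood-based view of quantile regression can be written and evaluated as follows (standard three-parameter form; any link to the paper's exact parameterization is an assumption):

```python
# Asymmetric Laplace density used as a working likelihood for quantile
# regression at level tau:
#   f(y) = tau*(1-tau)/sigma * exp(-rho_tau((y - mu)/sigma)),
# where rho_tau(u) = u*(tau - 1{u<0}) is the usual check loss.
import numpy as np

def check_loss(u, tau):
    u = np.asarray(u, dtype=float)
    return u * (tau - (u < 0))

def ald_pdf(y, mu, sigma, tau):
    u = (np.asarray(y, dtype=float) - mu) / sigma
    return tau * (1.0 - tau) / sigma * np.exp(-check_loss(u, tau))

print(ald_pdf([-1.0, 0.0, 2.0], mu=0.0, sigma=1.0, tau=0.5))
```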

19.
Gastric emptying studies are frequently used in medical research, both human and animal, to evaluate the effectiveness and determine the unintended side effects of new and existing medications, diets, and procedures or interventions. It is essential that gastric emptying data be appropriately summarized before comparisons between the study groups of interest are made. Since gastric emptying data follow a nonlinear emptying curve and are longitudinal, nonlinear mixed-effects (NLME) models can accommodate both the variation among measurements within individuals and the individual-to-individual variation. However, the NLME model requires strong assumptions that are often not satisfied in real applications involving a relatively small number of subjects, heterogeneous measurement errors, or large variation among subjects. We therefore propose three semiparametric Bayesian NLME models constructed with Dirichlet process priors, which automatically cluster sub-populations and estimate heterogeneous measurement errors. To compare the three semiparametric models with the parametric model, we propose a penalized posterior Bayes factor. We compare the performance of our semiparametric hierarchical Bayesian approaches with that of the parametric hierarchical Bayesian approach. Simulation results suggest that our semiparametric approaches are more robust and flexible. Gastric emptying studies from equine medicine are used to demonstrate the advantages of our approaches.
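As a brief illustration of the Dirichlet process prior mentioned above, the truncated stick-breaking construction of its weights can be simulated as follows (the truncation level and concentration parameter are arbitrary choices for illustration):

```python
# Truncated stick-breaking representation of a Dirichlet process:
#   v_k ~ Beta(1, alpha),  w_k = v_k * prod_{j<k} (1 - v_j).
# In a DP mixture, subjects sharing the same atom form a cluster, which is how
# the prior induces automatic clustering of sub-populations.
import numpy as np

def stick_breaking_weights(alpha, truncation, rng):
    v = rng.beta(1.0, alpha, size=truncation)
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - v)[:-1]])
    return v * remaining

rng = np.random.default_rng(2)
w = stick_breaking_weights(alpha=1.0, truncation=20, rng=rng)
print(w.round(3), w.sum().round(3))   # weights nearly sum to 1 at this truncation
```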

20.
The Cox proportional hazards model is widely used in time-to-event analysis. Two time scales are used in practice: time-on-study and chronological age. The former is the most frequently used time scale in clinical studies and longitudinal observational studies; however, there is no general consensus about which time scale is best. It has been asserted that if the cumulative baseline hazard is exponential, or if the age at entry is independent of the covariate, then the two models are equivalent. We show that neither of these conditions leads to equivalence: variability in the age at entry of individuals in the study causes the models to differ significantly. This is shown both analytically and through a simulation study. Additionally, we show that the time-on-study model is more robust to changes in age at entry than the chronological-age model.
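The two formulations being contrasted can be written as follows, with a_i denoting the age at entry of subject i (the notation is illustrative):

```latex
\begin{aligned}
\text{time-on-study scale:} \quad & \lambda_i(t) = \lambda_0(t)\, e^{x_i^{\top}\beta}, && t \text{ measured from study entry},\\
\text{chronological-age scale:} \quad & \lambda_i(a) = \tilde\lambda_0(a)\, e^{x_i^{\top}\tilde\beta}, && a \ge a_i \ \text{(delayed entry / left truncation at } a_i\text{)}.
\end{aligned}
```

The equivalence question is whether the same data yield the same covariate effect under both parameterizations; variation in a_i across subjects is what drives the two fits apart.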
