Similar Literature
20 similar records retrieved.
1.
The family of power series cure rate models provides a flexible modeling framework for survival data of populations with a cure fraction. In this work, we present a simplified estimation procedure for the maximum likelihood (ML) approach. ML estimates are obtained via the expectation-maximization (EM) algorithm, where the expectation step involves computation of the expected number of concurrent causes for each individual. A major advantage is that the maximization step can be decomposed into separate maximizations of two lower-dimensional functions of the regression and survival distribution parameters, respectively. Two simulation studies are performed: the first to investigate the accuracy of the estimation procedure for different numbers of covariates, and the second to compare our proposal with the direct maximization of the observed log-likelihood function. Finally, we illustrate the technique for parameter estimation on a dataset of survival times for patients with malignant melanoma.
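As an illustration of the decomposed M-step described above, here is a minimal sketch of one EM cycle for the Poisson special case of the power series family, assuming a Weibull latency distribution and a log-linear regression theta_i = exp(x_i'beta). The toy data, function names and parameterization are illustrative assumptions, not taken from the paper.

```python
# Sketch of one EM cycle for a Poisson (promotion-time) cure rate model.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])   # toy design matrix
t = rng.weibull(1.5, size=n) * 2.0                       # toy observed times
delta = rng.integers(0, 2, size=n)                       # 1 = event, 0 = censored

def weib_S(t, a, b):                 # Weibull survival function
    return np.exp(-(t / b) ** a)

def weib_logf(t, a, b):              # Weibull log-density
    return np.log(a / b) + (a - 1) * np.log(t / b) - (t / b) ** a

def em_step(beta, a, b):
    theta = np.exp(X @ beta)
    # E-step: expected number of latent concurrent causes for each individual
    N_tilde = delta + theta * weib_S(t, a, b)

    # M-step, part 1: regression parameters only (separate low-dimensional problem)
    def q_beta(be):
        return -np.sum(N_tilde * (X @ be) - np.exp(X @ be))
    beta_new = minimize(q_beta, beta, method="BFGS").x

    # M-step, part 2: latency distribution parameters only
    def q_surv(par):
        a_, b_ = np.exp(par)         # log-parameterization keeps a, b positive
        return -np.sum(delta * weib_logf(t, a_, b_)
                       - (N_tilde - delta) * (t / b_) ** a_)
    a_new, b_new = np.exp(minimize(q_surv, np.log([a, b]), method="BFGS").x)
    return beta_new, a_new, b_new

beta, a, b = np.zeros(2), 1.0, 1.0
for _ in range(30):                  # iterate EM cycles
    beta, a, b = em_step(beta, a, b)
print("beta:", beta, "shape:", a, "scale:", b)
```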

2.
In this paper we develop a regression model for survival data in the presence of long-term survivors, based on a defective version of the generalized Gompertz distribution introduced by El-Gohary et al. [The generalized Gompertz distribution. Appl Math Model. 2013;37:13–24]. This model includes as a special case the Gompertz cure rate model proposed by Gieser et al. [Modelling cure rates using the Gompertz model with covariate information. Stat Med. 1998;17:831–839]. An expectation-maximization algorithm is then developed for determining the maximum likelihood estimates (MLEs) of the model parameters. In addition, we discuss the construction of confidence intervals for the parameters using the asymptotic distributions of the MLEs and the parametric bootstrap method, and assess their performance through a Monte Carlo simulation study. Finally, the proposed methodology is applied to a database on uterine cervical cancer.
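A brief, hedged sketch of the defective construction mentioned above: with a negative shape parameter the generalized Gompertz CDF plateaus below 1 as t grows, and the remaining mass plays the role of the cure fraction. The parameterization below is one common form, assumed for illustration rather than the paper's exact one.

```python
import numpy as np

def gen_gompertz_cdf(t, a, b, c):
    # generalized Gompertz CDF: F(t) = [1 - exp(-(a/b)(exp(b t) - 1))]^c
    return (1.0 - np.exp(-(a / b) * (np.exp(b * t) - 1.0))) ** c

a, b, c = 0.5, -0.3, 1.2          # b < 0 makes the distribution defective
t = np.linspace(0, 50, 6)
print(gen_gompertz_cdf(t, a, b, c))            # plateaus below 1
print(1.0 - (1.0 - np.exp(a / b)) ** c)        # implied cure fraction
```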

3.
4.
The skew-generalized-normal distribution [Arellano-Valle, RB, Gómez, HW, Quintana, FA. A new class of skew-normal distributions. Comm Statist Theory Methods 2004;33(7):1465–1480] is a class of asymmetric normal distributions that contains the normal and skew-normal distributions as special cases. The main virtues of this distribution are that it is easy to simulate from and that it admits a genuine expectation–maximization (EM) algorithm for maximum likelihood estimation. In this paper, we extend the EM algorithm to linear regression models with skew-generalized-normal random errors and develop diagnostic analyses via local influence and generalized leverage, following Zhu and Lee's approach, since Cook's well-known approach would be more cumbersome for obtaining local influence measures. Finally, results obtained for a real data set are reported, illustrating the usefulness of the proposed method.

5.
In this paper we introduce a new three-parameter exponential-type distribution. The new distribution is quite flexible and can be used effectively in modeling survival data and reliability problems. It can have constant, decreasing, increasing, upside-down bathtub and bathtub-shaped hazard rate functions, and it generalizes some well-known distributions. We discuss maximum likelihood estimation of the model parameters for both complete and censored samples. Additionally, we formulate a new cure rate survival model by assuming that the number of competing causes of the event of interest has a Poisson distribution and that the time to this event follows the proposed distribution. Maximum likelihood estimation of the parameters of the new cure rate survival model is likewise discussed for complete and censored samples. Two applications to real data illustrate the flexibility of the new model in practice.

6.
In this paper, we consider the four-parameter bivariate generalized exponential distribution proposed by Kundu and Gupta [Bivariate generalized exponential distribution, J. Multivariate Anal. 100 (2009), pp. 581–593] and propose an expectation–maximization algorithm to find the maximum-likelihood estimators of the four parameters under random left censoring. A numerical experiment is carried out to examine the properties of the estimators obtained iteratively.

7.
This paper addresses the problem of identifying groups whose feature-variable means satisfy specified conditions. In this study, we refer to the identified groups as “target clusters” (TCs). To identify TCs, we propose a method based on the normal mixture model (NMM) restricted by a linear combination of means. We provide an expectation–maximization (EM) algorithm to fit the restricted NMM by the maximum likelihood method. The convergence property of the EM algorithm and a reasonable set of initial estimates are presented. We demonstrate the method's usefulness and validity through a simulation study and two well-known data sets. The proposed method provides several types of useful clusters that would be difficult to obtain with conventional clustering or exploratory data analysis methods based on the ordinary NMM. A simple comparison with another target clustering approach shows that the proposed method is promising for identifying TCs.
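As a concrete, deliberately simplified illustration of an NMM restricted by a linear combination of means, the sketch below runs EM for a univariate two-component mixture under the constraint a1*mu1 + a2*mu2 = d, assuming a common known variance so that the restricted M-step for the means has a closed-form Lagrange-multiplier solution. All names and the toy data are illustrative, not the authors' implementation.

```python
import numpy as np
from scipy.stats import norm

def em_restricted_nmm(x, a, d, sigma2=1.0, n_iter=100):
    mu, pi = np.array([x.min(), x.max()]), np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: posterior component probabilities (responsibilities)
        dens = pi * norm.pdf(x[:, None], mu, np.sqrt(sigma2))
        r = dens / dens.sum(axis=1, keepdims=True)
        w = r.sum(axis=0)                       # effective component sizes
        m = (r * x[:, None]).sum(axis=0) / w    # unconstrained mean updates
        # restricted M-step for the means under a1*mu1 + a2*mu2 = d
        lam = (a @ m - d) / (sigma2 * np.sum(a ** 2 / w))
        mu = m - lam * sigma2 * a / w
        pi = w / len(x)
    return mu, pi

# toy usage: true means 1 and 4, restriction "means sum to 5"
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(1.0, 1.0, 150), rng.normal(4.0, 1.0, 150)])
print(em_restricted_nmm(x, a=np.array([1.0, 1.0]), d=5.0))
```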

8.
In this article, we consider a parametric survival model that is appropriate when the population of interest contains long-term survivors or immunes. The model, referred to as the cure rate model, was introduced by Boag [Boag, J.W. (1949). Maximum likelihood estimates of the proportion of patients cured by cancer therapy. J. R. Stat. Soc. Ser. B, 11:15–53] in terms of a mixture model that included a component representing the proportion of immunes and a distribution representing the lifetimes of the susceptible population. We propose a cure rate model based on the generalized exponential distribution that incorporates the effects of risk factors or covariates on the probability of an individual being a long-term survivor. Maximum likelihood estimators of the model parameters are obtained using the expectation-maximisation (EM) algorithm. A graphical method is also provided for assessing the goodness-of-fit of the model. We present an example to illustrate the fit of this model to data examining the effects of different risk factors on relapse time for drug addicts.
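To make the structure concrete, here is a minimal sketch of the observed-data log-likelihood for a mixture cure model of this kind, with a logistic link for the probability of being a long-term survivor and a generalized exponential latency distribution; the parameterization and names are assumptions for illustration, not the authors' exact formulation.

```python
import numpy as np

def ge_cdf(t, alpha, lam):            # generalized exponential CDF
    return (1.0 - np.exp(-lam * t)) ** alpha

def ge_pdf(t, alpha, lam):            # generalized exponential density
    return alpha * lam * np.exp(-lam * t) * (1.0 - np.exp(-lam * t)) ** (alpha - 1)

def mixture_cure_loglik(params, t, delta, X):
    k = X.shape[1]
    gamma = params[:k]                              # covariate effects on cure
    alpha, lam = np.exp(params[k]), np.exp(params[k + 1])   # positive latency params
    pi_cure = 1.0 / (1.0 + np.exp(-(X @ gamma)))    # P(long-term survivor | x)
    Su = 1.0 - ge_cdf(t, alpha, lam)
    fu = ge_pdf(t, alpha, lam)
    # events come from the susceptible group; censored units may be cured
    ll = np.where(delta == 1,
                  np.log((1.0 - pi_cure) * fu + 1e-300),
                  np.log(pi_cure + (1.0 - pi_cure) * Su + 1e-300))
    return ll.sum()
```

In practice one would maximize this function (for example, by passing its negative to scipy.optimize.minimize) or, as in the paper, introduce latent cure indicators and iterate an EM algorithm.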

9.
In this article, we propose mixtures of skew Laplace normal (SLN) distributions to model both skewness and heavy-tailedness in heterogeneous data sets, as an alternative to mixtures of skew Student-t-normal (STN) distributions. We give the expectation–maximization (EM) algorithm for obtaining the maximum likelihood (ML) estimators of the parameters of interest. We also analyze the mixture regression model based on the SLN distribution and provide the ML estimators of its parameters using the EM algorithm. The performance of the proposed mixture model is illustrated by a simulation study and two real data examples.

10.
By combining progressive hybrid censoring with the step-stress partially accelerated lifetime test, we propose an adaptive step-stress partially accelerated lifetime test that allows the number of step-stress levels to change randomly according to the pre-fixed censoring number and time points, thereby greatly reducing the time expenditure and economic cost of the test. Based on the Lindley-distributed tampered failure rate (TFR) model with masked system lifetime data, the BFGS method is introduced into the expectation-maximization (EM) algorithm to obtain the maximum likelihood estimates (MLEs), which overcomes the difficulties of an otherwise intractable maximization in the M-step. Asymptotic confidence intervals for the components' distribution parameters are also derived from the missing information principle. For comparison, Bayesian estimates and highest posterior density (HPD) credible intervals are obtained using adaptive rejection sampling. Furthermore, the reliability of the system and its components is estimated at a specified time under usual and severe operating conditions. Finally, a numerical simulation example is presented to illustrate the performance of the proposed method.

11.
In this paper we study the cure rate survival model involving a competing risks structure with missing categorical covariates. A parametric distribution that can be written as a sequence of one-dimensional conditional distributions is specified for the missing covariates. We consider the missing-at-random situation, so that the probability of a covariate being missing may depend only on the observed covariates. Parameter estimates are obtained using the EM algorithm via the method of weights. Extensive simulation studies are conducted to compare the efficiency of the estimates with and without missing data. As expected, the estimation approach that takes the missing covariates into account is much more efficient, in terms of mean square error, than the complete-case analysis. Effects of increasing the cured fraction and the proportion of censored observations are also reported. We demonstrate the proposed methodology with two real data sets, one involving the length of time to obtain a BS degree in Statistics and the other the time to breast cancer recurrence.

12.
We present a maximum likelihood estimation procedure for the multivariate frailty model. The estimation is based on a Monte Carlo EM algorithm: the expectation step is approximated by averaging over random samples drawn from the posterior distribution of the frailties using rejection sampling, and the maximization step reduces to a standard partial likelihood maximization. We also propose a simple rule, based on the relative change in the parameter estimates, for deciding on the sample size in each iteration and on a stopping time for the algorithm. An important new feature is that absolute convergence of the algorithm is achieved through this sample size determination and an efficient sampling technique. The method is illustrated using a rat carcinogenesis dataset and data on vase lifetimes of cut roses, and for these two examples the estimation results are compared with approximate inference based on penalized partial likelihood. Unlike penalized partial likelihood estimation, the proposed full maximum likelihood estimation method accounts for all the uncertainty when estimating standard errors for the parameters.
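The following is a minimal, hedged sketch of the control flow suggested by the description above: a Monte Carlo E-step whose sample size grows as the iterations approach convergence, with stopping based on the relative change in the parameter estimates. The specific growth rule is an illustrative assumption, and `e_step_samples` and `m_step` are hypothetical placeholders for the rejection sampler of the frailties and the partial-likelihood maximization.

```python
import numpy as np

def mcem(theta0, e_step_samples, m_step, m0=50, tol=1e-4, max_iter=200):
    theta, m = np.asarray(theta0, float), m0
    for it in range(max_iter):
        frailty_draws = e_step_samples(theta, m)   # rejection samples from the frailty posterior
        theta_new = m_step(theta, frailty_draws)   # partial-likelihood maximization
        rel_change = np.max(np.abs(theta_new - theta) / (np.abs(theta) + 1e-8))
        if rel_change < tol:                       # stopping time
            return theta_new, it
        if rel_change < 5 * tol:                   # near convergence: enlarge MC sample
            m = int(1.5 * m)
        theta = theta_new
    return theta, max_iter
```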

13.
The two-component mixture cure rate model is popular in cure rate data analysis, with the proportional hazards and accelerated failure time (AFT) models being the major competitors for modelling the latency component. [Wang, L., Du, P., and Liang, H. (2012), ‘Two-Component Mixture Cure Rate Model with Spline Estimated Nonparametric Components’, Biometrics, 68, 726–735] first proposed a nonparametric mixture cure rate model where the latency component assumes proportional hazards with nonparametric covariate effects in the relative risk. Here we consider a mixture cure rate model where the latency component assumes an AFT structure with nonparametric covariate effects in the acceleration factor. Besides having a more direct physical interpretation than proportional hazards, our model has an additional scalar parameter, which adds complication to both the computational algorithm and the asymptotic theory. We develop a penalised EM algorithm for estimation, together with confidence intervals derived from the Louis formula. Asymptotic convergence rates of the parameter estimates are established. Simulations and an application to a melanoma study show the advantages of the new method.

14.
The modeling and analysis of lifetime data, in which the main endpoints are the times at which an event of interest occurs, is of great interest in medical studies. In such studies, it is common for two or more lifetimes to be associated with the same unit, such as the times to deterioration levels or the times to reaction to a treatment in pairs of organs like lungs, kidneys, eyes or ears. In medical applications, a cure rate may also be present and need to be modeled, yielding lifetime data with long-term survivors. This paper presents a comparative study, under a Bayesian approach, of some existing continuous and discrete bivariate distributions, such as bivariate exponential and bivariate geometric distributions, in the presence of a cure rate, censored data and covariates. For lifetimes related to cured patients, standard mixture cure rate models are assumed in the data analysis. The posterior summaries of interest are obtained using Markov chain Monte Carlo methods. Two real medical data sets are considered to illustrate the proposed methodology.

15.
In this paper, we establish that the usual stochastic, hazard rate, reversed hazard rate, likelihood ratio, dispersive and star orders are all preserved for parallel systems under exponentiated models for the lifetimes of components. We then use multiple-outlier exponentiated gamma models to illustrate this result. Finally, we consider the dual family with exponentiated survival function and establish similar results for series systems. The results obtained here extend some well-known results for series and parallel systems arising from different exponentiated distributions, such as the generalized exponential and exponentiated Weibull, established previously in the literature.
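As a quick numerical sanity check of one instance of these results (the usual stochastic order), note that a parallel system of independent exponentiated components has CDF prod_i F(t)^{a_i} = F(t)^{sum_i a_i}, so a componentwise-larger exponent vector gives a stochastically larger system lifetime. The snippet below verifies this pointwise for an exponential baseline; it is an illustration under assumed parameters, not a proof of the general preservation results.

```python
import numpy as np

t = np.linspace(0.01, 5, 200)
F = 1.0 - np.exp(-t)                        # baseline exponential CDF
a = np.array([1.0, 2.0, 3.0])               # component exponents, a <= b componentwise
b = np.array([2.0, 3.0, 4.0])
S_parallel_a = 1.0 - F ** a.sum()           # system survival under exponents a
S_parallel_b = 1.0 - F ** b.sum()           # system survival under exponents b
assert np.all(S_parallel_b >= S_parallel_a) # usual stochastic order holds pointwise
print(S_parallel_a[:3], S_parallel_b[:3])
```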

16.
Clustered binary data are common in medical research and can be fitted with a logistic regression model with random effects, which belongs to the wider class of generalized linear mixed models. Likelihood-based estimation of the model parameters often has to deal with intractable integration, which has led to several estimation methods for overcoming this difficulty. The penalized quasi-likelihood (PQL) method is popular and computationally efficient in most cases. The expectation–maximization (EM) algorithm yields maximum-likelihood estimates, but requires computing a possibly intractable integral in the E-step, and several variants of the EM algorithm for evaluating the E-step have been introduced. The Monte Carlo EM (MCEM) method approximates the E-step expectation using Monte Carlo samples, while the modified EM (MEM) method approximates it using Laplace's method. All these methods involve several layers of approximation, so the corresponding parameter estimates contain inevitable errors (large or small) induced by the approximation. Understanding and quantifying this discrepancy theoretically is difficult because of the complexity of the approximations in each method, even when the focus is restricted to clustered binary data. As an alternative competing computational method, we also consider a non-parametric maximum-likelihood (NPML) method. We review and compare the PQL, MCEM, MEM and NPML methods for clustered binary data via a simulation study, which will be useful for researchers choosing an estimation method for their analysis.
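To make the contrast between the MCEM and MEM approximations concrete, here is a small sketch that approximates the log integrated likelihood of a single cluster (random intercept, logit link) in the two ways mentioned above: a Monte Carlo average over draws of the random effect, and a Laplace approximation around the mode. This is an illustration under simplified assumptions, not code from the paper.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.special import expit

def cluster_loglik(b, y, eta_fixed, sigma):
    # joint log-density of the cluster's responses and the random intercept b (up to phi's constant)
    p = expit(eta_fixed + b)
    return np.sum(y * np.log(p) + (1 - y) * np.log(1 - p)) - 0.5 * b ** 2 / sigma ** 2

def mc_integrated_loglik(y, eta_fixed, sigma, n_draws=2000, seed=1):
    b = np.random.default_rng(seed).normal(0.0, sigma, n_draws)  # draws from the random-effect prior
    p = expit(eta_fixed[:, None] + b[None, :])
    lik = np.prod(p ** y[:, None] * (1 - p) ** (1 - y[:, None]), axis=0)
    return np.log(lik.mean())                                    # Monte Carlo E-step style average

def laplace_integrated_loglik(y, eta_fixed, sigma):
    neg = lambda b: -cluster_loglik(b, y, eta_fixed, sigma)
    b_hat = minimize_scalar(neg).x                               # mode of the integrand
    h = 1e-3
    curv = (neg(b_hat + h) - 2 * neg(b_hat) + neg(b_hat - h)) / h ** 2   # numerical curvature
    return (-neg(b_hat) + 0.5 * np.log(2 * np.pi / curv)
            - 0.5 * np.log(2 * np.pi * sigma ** 2))              # Laplace approximation

# toy cluster: 5 binary responses with fixed-effect linear predictors, sigma = 1
y = np.array([1, 0, 1, 1, 0])
eta = np.array([0.2, -0.4, 0.1, 0.5, -0.2])
print(mc_integrated_loglik(y, eta, 1.0), laplace_integrated_loglik(y, eta, 1.0))
```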

17.
Recently, the study of system lifetimes in reliability and survival analysis in the presence of several causes of failure (competing risks) has attracted attention in the literature. In this paper, series and parallel systems with exponentially distributed lifetimes for each component are considered. Several causes of failure independently affect the lifetime distributions, and the system failure times are observed under a progressive Type-II censoring scheme. For series systems, the maximum likelihood estimates of the parameters are computed, and confidence intervals for the model parameters are obtained using the Fisher information matrix. For parallel systems, a generalized EM algorithm, which uses the Newton-Raphson algorithm inside the EM iterations, is used to compute the maximum likelihood estimates of the parameters, and the standard errors of these estimates are computed using the supplemented EM algorithm. A simulation study confirms the good performance of the introduced approach.

18.
Models for dealing with survival data in the presence of a cured fraction of individuals have attracted the attention of many researchers and practitioners in recent years. In this paper, we propose a cure rate model under the competing risks scenario in which the number of causes that can lead to the event of interest follows the polylogarithm distribution. The model is flexible in the sense that it encompasses some well-known models, which can be tested using large-sample test statistics applied to nested models. Maximum-likelihood estimation based on the EM algorithm and hypothesis testing are investigated. Results of simulation studies designed to gauge the performance of the estimation method and of two test statistics are reported. The methodology is applied in the analysis of a real data set.

19.
Estimators derived from the expectation-maximization (EM) algorithm are not robust, since they are based on maximization of the likelihood function. We propose an iterative proximal-point algorithm, based on the EM algorithm, that minimizes a divergence criterion between a mixture model and the unknown distribution generating the data. In each iteration, the algorithm estimates the proportions and the parameters of the mixture components in two separate steps. The resulting estimators are generally robust against outliers and misspecification of the model. Convergence properties of the algorithm are studied, and its convergence is discussed for a two-component Weibull mixture, for which a condition on the initialization of the EM algorithm is needed in order for the latter to converge. Simulations on Gaussian and Weibull mixture models using different statistical divergences confirm the validity of our work and the robustness of the resulting estimators against outliers, in comparison with the EM algorithm. An application to a dataset of velocities of galaxies is also presented. The Canadian Journal of Statistics 47: 392–408; 2019 © 2019 Statistical Society of Canada

20.
We propose a Bayesian semiparametric model for survival data with a cure fraction. We explicitly consider a finite cure time in the model, which allows us to separate the cured and the uncured populations. We take a mixture prior of a Markov gamma process and a point mass at zero to model the baseline hazard rate function of the entire population. We focus on estimating the cure threshold after which subjects are considered cured. We can incorporate covariates through a structure similar to the proportional hazards model and allow the cure threshold also to depend on the covariates. For illustration, we undertake simulation studies and a full Bayesian analysis of a bone marrow transplant data set.
