Similar Literature

20 similar documents were found.
1.
Abstract

Comparing two hazard rate functions to evaluate a treatment effect is an important issue in survival analysis. It is quite common that the two hazard rate functions cross each other at one or more unknown time points, reflecting temporal changes in the treatment effect. In certain applications, besides survival data, we also have related longitudinal data available regarding some time-dependent covariates. In such cases, a joint model that accommodates both types of data can allow us to infer the association between the survival and longitudinal data and to assess the treatment effect better. In this paper, we propose a modelling approach for comparing two crossing hazard rate functions by jointly modelling the survival and longitudinal data. The parameters of the proposed joint model are estimated by maximum likelihood using the EM algorithm. Asymptotic properties of the maximum likelihood estimators are studied. To illustrate the virtues of the proposed method, we compare its performance with several existing methods in a simulation study. The proposed method is also demonstrated using a real dataset obtained from an HIV clinical trial.
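For orientation, a typical shared-random-effects joint model of the kind this abstract refers to can be sketched as follows; the notation (subject-specific trajectory m_i(t), association parameter alpha) is illustrative and is not necessarily the authors' exact specification.

```latex
% Longitudinal submodel for the time-dependent covariate of subject i:
y_i(t) = m_i(t) + \varepsilon_i(t), \qquad
m_i(t) = \mathbf{x}_i^{\top}(t)\,\boldsymbol{\beta} + \mathbf{z}_i^{\top}(t)\,\mathbf{b}_i,
\qquad \mathbf{b}_i \sim N(\mathbf{0}, \mathbf{D}),\ \ \varepsilon_i(t) \sim N(0, \sigma^2)

% Survival submodel linking the hazard to the current "true" trajectory:
h_i(t) = h_0(t)\,\exp\{\gamma^{\top}\mathbf{w}_i + \alpha\, m_i(t)\}
```

Crossing hazards between two treatment arms then correspond to the arm-specific hazards implied by such a model intersecting at one or more time points.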

2.
The use of relevance vector machines to flexibly model hazard rate functions is explored. This technique is adapted to survival analysis problems through the partial logistic approach. The method exploits the Bayesian automatic relevance determination procedure to obtain sparse solutions and it incorporates the flexibility of kernel-based models. Example results are presented on literature data from a head-and-neck cancer survival study using Gaussian and spline kernels. Sensitivity analysis is conducted to assess the influence of hyperprior distribution parameters. The proposed method is then contrasted with other flexible hazard regression methods, in particular the HARE model proposed by Kooperberg et al. [16]. A simulation study is conducted to carry out the comparison. The model developed in this paper exhibited good performance in the prediction of hazard rate. The application of this sparse Bayesian technique to a real cancer data set demonstrated that the proposed method can potentially reveal characteristics of the hazards, associated with the dynamics of the studied diseases, which may be missed by existing modeling approaches based on different perspectives on the bias vs. variance balance.
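As a rough sketch of the partial logistic idea in its usual discrete-time form (assumed here; the notation is illustrative), the follow-up period is partitioned into intervals A_1, ..., A_K and the discrete hazard in each interval is modelled through a logistic link applied to a flexible predictor f, which the paper estimates with a relevance vector machine:

```latex
h_k(\mathbf{x}) \;=\; \Pr\bigl(T \in A_k \mid T \ge a_{k-1}, \mathbf{x}\bigr)
\;=\; \frac{1}{1 + \exp\{-f(a_k, \mathbf{x})\}},
\qquad
S_k(\mathbf{x}) \;=\; \prod_{j \le k} \bigl(1 - h_j(\mathbf{x})\bigr)
```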

3.
The phenomenon of crossing hazard rates is common in clinical trials with time-to-event endpoints. Many methods have been proposed for testing equality of hazard functions against a crossing-hazards alternative. However, relatively few approaches are available in the literature for point or interval estimation of the crossing time point. This paper considers the problem of constructing confidence intervals for the first crossing time point of two hazard functions. After reviewing a recent procedure based on Cox proportional hazards modelling with a Box-Cox transformation of the time to event, a nonparametric procedure using a kernel smoothing estimate of the hazard ratio is proposed. Both procedures are evaluated by Monte Carlo simulations and applied to two clinical trial datasets.
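A minimal sketch of the nonparametric ingredient, a kernel-smoothed hazard estimate for each arm and the first grid point at which the two estimates cross, is given below. This is not the authors' exact procedure; the simulated data, Gaussian kernel and bandwidth are illustrative assumptions, and no confidence-interval construction is shown.

```python
import numpy as np

def kernel_hazard(times, events, grid, bandwidth):
    """Kernel (Ramlau-Hansen-type) estimate of the hazard rate.
    times:  observed, possibly right-censored, follow-up times
    events: 1 = event, 0 = censored
    grid:   time points at which to evaluate the estimate
    """
    times = np.asarray(times, float)
    events = np.asarray(events, int)
    order = np.argsort(times)
    times, events = times[order], events[order]
    n = len(times)
    at_risk = n - np.arange(n)              # risk-set size just before each ordered time
    increments = events / at_risk           # Nelson-Aalen increments d_i / Y(t_i)
    u = (grid[:, None] - times[None, :]) / bandwidth
    weights = np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)   # Gaussian kernel
    return (weights * increments[None, :]).sum(axis=1) / bandwidth

def first_crossing(grid, h1, h2):
    """Return the first grid point where the sign of h1 - h2 changes."""
    sign_change = np.where(np.diff(np.sign(h1 - h2)) != 0)[0]
    return grid[sign_change[0] + 1] if sign_change.size else None

# Hypothetical two-arm data whose hazards cross (illustration only)
rng = np.random.default_rng(0)
t1 = rng.weibull(0.8, 200) * 2.0            # decreasing hazard
t2 = rng.weibull(1.8, 200) * 2.0            # increasing hazard
c = rng.exponential(4.0, 200)               # independent censoring
grid = np.linspace(0.1, 3.0, 100)
h1 = kernel_hazard(np.minimum(t1, c), (t1 <= c).astype(int), grid, bandwidth=0.4)
h2 = kernel_hazard(np.minimum(t2, c), (t2 <= c).astype(int), grid, bandwidth=0.4)
print("estimated first crossing time:", first_crossing(grid, h1, h2))
```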

4.
This paper discusses the goodness-of-fit test for the proportional odds model for K-sample interval-censored failure time data, which frequently occur in, for example, periodic follow-up survival studies. The proportional odds model allows the ratio of two hazard functions to be monotone and to converge to one, and it provides an important tool for the modelling of survival data. To test the model, a procedure is proposed that generalizes the method given in Dauxois and Kirmani [Dauxois JY, Kirmani SNUA (2003) Biometrika 90:913–922]. The asymptotic distribution of the procedure is established and its properties are evaluated by simulation studies.
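The "converges to one" feature can be made explicit. Under the usual proportional odds specification (standard notation, assumed here), the hazard ratio moves monotonically from exp(beta'Z) at time zero to 1 as t grows:

```latex
\frac{F(t \mid Z)}{S(t \mid Z)} \;=\; e^{\beta^{\top} Z}\,\frac{F_0(t)}{S_0(t)}
\qquad\Longrightarrow\qquad
\frac{h(t \mid Z)}{h_0(t)} \;=\; \frac{e^{\beta^{\top} Z}}{S_0(t) + e^{\beta^{\top} Z}\,F_0(t)}
\;\longrightarrow\; 1 \quad (t \to \infty)
```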

5.
In competing risks studies, subjects may fail from any one of k causes, and comparing any two competing risks in the presence of covariate effects is very important in medical studies. In this paper, we develop tests for comparing cause-specific hazard rates and cumulative incidence functions at specified covariate levels under the additive risk model, based on a weighted difference of estimates of cumulative cause-specific hazard rates. Motivated by McKeague et al. (2001), we construct simultaneous confidence bands for the difference of two conditional cumulative incidence functions as a useful graphical tool. In addition, we conduct a simulation study, which shows that the proposed procedure has good finite sample performance. A melanoma data set from a clinical trial is used for illustration.
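For context, one common additive risk specification for the cause-specific hazards and the resulting cumulative incidence function is sketched below; the paper's exact model (for example, time-varying versus constant coefficients) may differ.

```latex
% Cause-specific hazard for cause k at covariate level Z (illustrative form):
\lambda_k(t \mid Z) \;=\; \lambda_{k0}(t) + \boldsymbol{\beta}_k^{\top} Z

% Cumulative incidence function for cause k, with S the overall survival function:
F_k(t \mid Z) \;=\; \int_0^t S(u^- \mid Z)\,\lambda_k(u \mid Z)\,du,
\qquad
S(t \mid Z) \;=\; \exp\Bigl\{-\textstyle\sum_{j} \int_0^t \lambda_j(u \mid Z)\,du\Bigr\}
```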

6.
Abstract

In this paper we find the maximum likelihood estimates (MLEs) of the hazard rate and mean residual life functions (MRLF) of the Pareto distribution, their asymptotic non-degenerate distribution, exact distribution and moments. We also discuss the uniformly minimum variance unbiased estimators (UMVUEs) of the hazard rate function and MRLF. Finally, two numerical examples, one with simulated data and one with a real data set, are presented to illustrate the proposed estimates.
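For concreteness, under the classical Pareto Type I parameterisation with scale sigma and shape alpha (an assumption, since the abstract does not spell out the parameterisation), the quantities being estimated and their MLEs take simple closed forms:

```latex
% Pareto(\alpha, \sigma):  S(x) = (\sigma/x)^{\alpha},  x \ge \sigma
h(x) \;=\; \frac{f(x)}{S(x)} \;=\; \frac{\alpha}{x},
\qquad
m(x) \;=\; E[X - x \mid X > x] \;=\; \frac{x}{\alpha - 1} \quad (\alpha > 1)

% Maximum likelihood estimators from a sample X_1, \dots, X_n:
\hat{\sigma} \;=\; X_{(1)},
\qquad
\hat{\alpha} \;=\; \frac{n}{\sum_{i=1}^{n} \ln\!\bigl(X_i / \hat{\sigma}\bigr)}
```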

7.
Recently, authors have studied inequalities involving expectations of selected functions, viz. failure rate, mean residual life, aging intensity function, and log-odds rate, which are defined for left truncated random variables in reliability theory to characterize some well-known distributions. However, there has been growing interest in the study of these functions in reversed time (X ≤ x, instead of X > x) and their applications. In the present work we consider the reversed hazard rate, expected inactivity time, and reversed aging intensity function to deal with right truncated random variables and characterize a few statistical distributions.
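The two reversed-time quantities named here have standard definitions, reproduced below for reference (absolutely continuous lifetime X with density f and distribution function F):

```latex
\tilde{r}(x) \;=\; \frac{f(x)}{F(x)},
\qquad
\tilde{m}(x) \;=\; E[\,x - X \mid X \le x\,] \;=\; \frac{\int_0^{x} F(u)\,du}{F(x)}
```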

8.
In this paper, the estimation of the parameters, reliability and hazard functions of an inverted exponentiated half logistic distribution (IEHLD) from progressive Type II censored data is considered. The Bayes estimates under asymmetric and symmetric loss functions, such as the squared error, general entropy and LINEX loss functions, are provided. The Bayes estimates of the IEHLD parameters, reliability and hazard functions are also obtained under balanced loss functions. Since the Bayes estimates cannot be obtained in closed form, the Lindley approximation method and an importance sampling procedure are used to compute them. Furthermore, the asymptotic normality of the maximum likelihood estimates is used to obtain approximate confidence intervals. The highest posterior density credible intervals of the parameters are computed based on the importance sampling procedure. Simulations are performed to assess the performance of the proposed estimates. For illustrative purposes, two data sets are analyzed.

9.
Recently, Domma et al. [An extension of Azzalini's method, J. Comput. Appl. Math. 278 (2015), pp. 37–47] proposed an extension of Azzalini's method. This method is attractive due to its flexibility and ease of application. Most of the weighted Weibull models introduced so far have a monotonic hazard rate function, which limits their applicability. Our aim is therefore to build a new weighted Weibull distribution with both monotonic and non-monotonic hazard rate functions. A new weighted Weibull distribution, the so-called generalized weighted Weibull (GWW) distribution, is introduced using the method proposed in Domma et al. [13]. The GWW distribution admits decreasing, increasing, upside-down bathtub, N-shaped and M-shaped hazard rates, and its statistical properties are easy to derive. Finally, we apply the GWW model to a real data set and also provide a simulation study.
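Azzalini's original construction and a generic weighted analogue of the kind used to build weighted Weibull models are sketched below for orientation; the exact GWW density proposed in the paper is not reproduced here, and K is simply the normalising constant of the illustrative form.

```latex
% Azzalini's skewing mechanism for a symmetric density f_0 with cdf F_0:
f(x) \;=\; 2\, f_0(x)\, F_0(\lambda x)

% Generic weighted analogue with Weibull density f_W and cdf F_W (illustrative):
f(x) \;=\; \frac{1}{K}\, f_W(x)\, F_W(\lambda x),
\qquad
K \;=\; \int_0^{\infty} f_W(u)\, F_W(\lambda u)\,du
```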

10.
We propose a new procedure for combining multiple tests in samples of right-censored observations. The new method is based on multiple constrained censored empirical likelihood where the constraints are formulated as linear functionals of the cumulative hazard functions. We prove a version of Wilks' theorem for the multiple constrained censored empirical likelihood ratio, which provides a simple reference distribution for the test statistic of our proposed method. A useful application of the proposed method is, for example, examining the survival experience of different populations by combining different weighted log-rank tests. Real data examples are given using the log-rank and Gehan-Wilcoxon tests. In a simulation study of two-sample survival data, we compare the proposed method of combining tests to previously developed procedures. The results demonstrate that, in addition to its computational simplicity, the combined test performs comparably to, and in some situations more reliably than, previously developed procedures. Statistical software is available in the R package 'emplik'.
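The constrained empirical likelihood machinery itself (available in the R package 'emplik') is not reproduced here, but the individual weighted log-rank statistics that such a combined test draws on can be sketched in a few lines. The simulated data, weight choices and standardisation below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def weighted_logrank(time, event, group, weight="logrank"):
    """Two-sample weighted log-rank statistic for right-censored data.

    time:   follow-up times
    event:  1 = event observed, 0 = censored
    group:  0/1 group labels
    weight: "logrank" (w_j = 1) or "gehan" (w_j = number at risk, Gehan-Wilcoxon)
    Returns the standardised statistic Z = U / sqrt(Var(U)).
    """
    time = np.asarray(time, float)
    event = np.asarray(event, int)
    group = np.asarray(group, int)
    U, V = 0.0, 0.0
    for t in np.unique(time[event == 1]):          # distinct event times
        at_risk = time >= t
        n, n1 = at_risk.sum(), (at_risk & (group == 1)).sum()
        d = ((time == t) & (event == 1)).sum()
        d1 = ((time == t) & (event == 1) & (group == 1)).sum()
        w = n if weight == "gehan" else 1.0
        U += w * (d1 - d * n1 / n)                 # observed minus expected in group 1
        if n > 1:
            V += w**2 * d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)
    return U / np.sqrt(V)

# Hypothetical two-arm data (illustration only)
rng = np.random.default_rng(1)
t = np.concatenate([rng.exponential(1.0, 100), rng.exponential(1.5, 100)])
c = rng.exponential(3.0, 200)
time, event = np.minimum(t, c), (t <= c).astype(int)
group = np.repeat([0, 1], 100)
print("log-rank Z:      ", weighted_logrank(time, event, group, "logrank"))
print("Gehan-Wilcoxon Z:", weighted_logrank(time, event, group, "gehan"))
```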

11.
The case-cohort design is widely used as a means of reducing the cost in large cohort studies, especially when the disease rate is low and covariate measurements may be expensive, and has been discussed by many authors. In this paper, we discuss regression analysis of case-cohort studies that produce interval-censored failure time data with dependent censoring, a situation for which there does not seem to exist an established approach. For inference, a sieve inverse probability weighting estimation procedure is developed with the use of Bernstein polynomials to approximate the unknown baseline cumulative hazard functions. The proposed estimators are shown to be consistent and the asymptotic normality of the resulting regression parameter estimators is established. A simulation study is conducted to assess the finite sample properties of the proposed approach and indicates that it works well in practical situations. The proposed method is applied to an HIV/AIDS case-cohort study that motivated this investigation.
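For reference, a common way to build the Bernstein-polynomial sieve for a baseline cumulative hazard on an interval [l, u] is sketched below; the degree m and the exact monotonicity reparameterisation used in the paper may differ.

```latex
\Lambda_0(t) \;\approx\; \sum_{k=0}^{m} \phi_k\, B_k(t; m, l, u),
\qquad
B_k(t; m, l, u) \;=\; \binom{m}{k}\Bigl(\tfrac{t-l}{u-l}\Bigr)^{k}\Bigl(1 - \tfrac{t-l}{u-l}\Bigr)^{m-k}

% Monotonicity of \Lambda_0 is guaranteed by the ordering constraint
% 0 \le \phi_0 \le \phi_1 \le \dots \le \phi_m.
```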

12.
In 2008, Marsan and Lengliné presented a nonparametric way to estimate the triggering function of a Hawkes process. Their method requires an iterative and computationally intensive procedure which ultimately produces only approximate maximum likelihood estimates (MLEs) whose asymptotic properties are poorly understood. Here, we note a mathematical curiosity that allows one to compute, directly and extremely rapidly, exact MLEs of the nonparametric triggering function. The method here requires that the number q of intervals on which the nonparametric estimate is sought equals the number n of observed points. The resulting estimates have very high variance but may be smoothed to form more stable estimates. The performance and computational efficiency of the proposed method are verified in two disparate, highly challenging simulation scenarios: first, to estimate the triggering functions, with simulation-based 95% confidence bands, for earthquakes and their aftershocks in Loma Prieta, California, and second, to characterise triggering in confirmed cases of plague in the United States over the last century. In both cases, the proposed estimator can be used to describe the rate of contagion of the processes in detail, and the computational efficiency of the estimator facilitates the construction of simulation-based confidence intervals.
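For readers unfamiliar with Hawkes processes, the objects involved are the conditional intensity and the triggering function g, with g taken to be piecewise constant in the nonparametric setting described here (notation is illustrative):

```latex
\lambda(t) \;=\; \mu \;+\; \sum_{i:\; t_i < t} g(t - t_i),
\qquad
g(u) \;=\; \sum_{k=1}^{q} \beta_k\, \mathbf{1}\{u \in (u_{k-1}, u_k]\}
```

The paper's observation concerns exact MLEs of (mu, beta_1, ..., beta_q) in the special case where the number of intervals q equals the number of observed points n.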

13.
In this article, we deal with a two-parameter exponentiated half-logistic distribution. We consider the estimation of the unknown parameters, the associated reliability function and the hazard rate function under progressive Type II censoring. Maximum likelihood estimates (MLEs) are proposed for the unknown quantities. Bayes estimates are derived with respect to the squared error, LINEX and entropy loss functions. Approximate explicit expressions for all Bayes estimates are obtained using the Lindley method. We also use an importance sampling scheme to compute the Bayes estimates. Markov chain Monte Carlo samples are further used to produce credible intervals for the unknown parameters. Asymptotic confidence intervals are constructed using the normality property of the MLEs. For comparison purposes, bootstrap-p and bootstrap-t confidence intervals are also constructed. A comprehensive numerical study is performed to compare the proposed estimates. Finally, a real-life data set is analysed to illustrate the proposed methods of estimation.
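For reference, the Bayes estimators under the three loss functions named in this and the related abstracts above take the following standard forms, with a and p the usual loss-shape parameters:

```latex
% Squared error loss: posterior mean
\hat{\theta}_{SE} \;=\; E(\theta \mid \text{data})

% LINEX loss  L(\Delta) = e^{a\Delta} - a\Delta - 1,  \Delta = \hat{\theta} - \theta:
\hat{\theta}_{LINEX} \;=\; -\frac{1}{a}\,\ln E\bigl(e^{-a\theta} \mid \text{data}\bigr)

% General entropy loss  L \propto (\hat{\theta}/\theta)^{p} - p\,\ln(\hat{\theta}/\theta) - 1:
\hat{\theta}_{GE} \;=\; \bigl[E(\theta^{-p} \mid \text{data})\bigr]^{-1/p}
```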

14.
In this paper, we study the properties of a special class of frailty models when the frailty is common to several failure times. These models are closely linked to Archimedean copula models. We establish a useful formula for the cumulative baseline hazard functions and develop a new estimator of the cumulative baseline hazard function in bivariate frailty regression models. Based on the proposed estimator, we present a graphical model-checking procedure. We fit the model to a leukemia data set and conclude with some discussion.
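The link between shared-frailty models and Archimedean copulas that the abstract alludes to can be summarised in standard notation as follows; the paper's own estimator is not reproduced here.

```latex
% Frailty W with Laplace transform L(s) = E[e^{-sW}]; conditional on W the failure
% times are independent with cumulative hazards W\Lambda_1(t_1) and W\Lambda_2(t_2):
S(t_1, t_2) \;=\; L\bigl(\Lambda_1(t_1) + \Lambda_2(t_2)\bigr),
\qquad
S_j(t) \;=\; L\bigl(\Lambda_j(t)\bigr)

% Hence an Archimedean copula with generator L^{-1}, and a cumulative-hazard formula:
S(t_1, t_2) \;=\; L\bigl(L^{-1}(S_1(t_1)) + L^{-1}(S_2(t_2))\bigr),
\qquad
\Lambda_j(t) \;=\; L^{-1}\bigl(S_j(t)\bigr)
```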

15.
In this paper, we study ordering properties of lifetimes of parallel systems with two independent heterogeneous exponential components in terms of the likelihood ratio order (reversed hazard rate order) and the hazard rate order (stochastic order). We establish, among other results, that the weak majorization order between two hazard rate vectors is equivalent to the likelihood ratio order (reversed hazard rate order) between the lifetimes of two parallel systems, and that the p-larger order between two hazard rate vectors is equivalent to the hazard rate order (stochastic order) between the lifetimes of two parallel systems. Moreover, we extend these results to proportional hazard rate models. The results derived here strengthen and generalize some results known in the literature.
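The objects being ordered are worth writing down once. For a parallel system of two independent exponential components with hazard-rate vector (lambda_1, lambda_2), the system fails when the last component fails, so by a standard calculation:

```latex
F(t) \;=\; \bigl(1 - e^{-\lambda_1 t}\bigr)\bigl(1 - e^{-\lambda_2 t}\bigr),
\qquad
h(t) \;=\; \frac{\lambda_1 e^{-\lambda_1 t} + \lambda_2 e^{-\lambda_2 t}
           - (\lambda_1 + \lambda_2)\, e^{-(\lambda_1 + \lambda_2) t}}
          {e^{-\lambda_1 t} + e^{-\lambda_2 t} - e^{-(\lambda_1 + \lambda_2) t}}
```

The orderings in this and the related abstract below compare two such systems through majorization-type conditions on their hazard-rate vectors.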

16.
The proportional reversed hazards model explains the multiplicative effect of covariates on the baseline reversed hazard rate function of lifetimes. In the present study, we introduce a proportional cause-specific reversed hazards model. The proposed regression model facilitates the analysis of failure time data with multiple causes of failure under left censoring. We estimate the regression parameters using a partial likelihood approach. We provide Breslow-type estimators for the cumulative cause-specific reversed hazard rate functions. Asymptotic properties of the estimators are discussed. Simulation studies are conducted to assess their performance. We illustrate the applicability of the proposed model using a real data set.
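For reference, the proportional reversed hazards model referred to here is usually written as follows (standard notation), which is equivalent to raising the baseline distribution function to a covariate-dependent power:

```latex
\tilde{\lambda}(t \mid Z) \;=\; \tilde{\lambda}_0(t)\, e^{\beta^{\top} Z},
\qquad \tilde{\lambda}(t) = \frac{f(t)}{F(t)}
\qquad\Longleftrightarrow\qquad
F(t \mid Z) \;=\; \bigl[F_0(t)\bigr]^{\exp(\beta^{\top} Z)}
```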

17.
We consider the problem of estimating the unknown parameters, reliability function and hazard function of a two-parameter bathtub-shaped distribution on the basis of a progressive Type II censored sample. The maximum likelihood estimators and Bayes estimators are derived for the two unknown parameters, the reliability function and the hazard function. The Bayes estimators are obtained under the squared error, LINEX and entropy loss functions. Approximate Bayes estimators under these loss functions are also obtained using the Lindley approximation method. Some numerical comparisons are made among the various proposed estimators in terms of their mean square error values, and some specific recommendations are given. Finally, two data sets are analyzed to illustrate the proposed methods.

18.
In this paper we compare the hazard rate functions of two parallel systems, each of which consists of two independent components with exponential distribution functions. The paper gives various conditions under which there exists a hazard rate ordering between the two parallel systems. It is also shown that some of these conditions are both sufficient and necessary. In particular, it is proven that if the vector consisting of the two hazard rates of the two exponential components in one parallel system weakly supmajorizes the counterpart of the other parallel system, then the first parallel system is greater than the second parallel system in the hazard rate ordering. This paper further compares the hazard rate functions of two parallel systems when both systems have components following a certain bivariate exponential distribution.

19.
The two-parameter Gompertz distribution has recently been introduced as a lifetime model for reliability inference. In this paper, the Gompertz distribution is proposed for the baseline lifetimes of components in a composite system. In this composite system, the failure of a component induces increased load on the surviving components and thus increases the component hazard rates via a power-trend process. Point estimates of the composite system parameters are obtained by the method of maximum likelihood. Interval estimates of the baseline survival function are obtained by using the maximum-likelihood estimator via a bootstrap percentile method. Two parametric bootstrap procedures are proposed to test whether the hazard rate function changes with the number of failed components. Intensive simulations are carried out to evaluate the performance of the proposed estimation procedure.
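One common parameterisation of the two-parameter Gompertz baseline (an assumption; the paper may parameterise it differently) is:

```latex
h_0(t) \;=\; \theta\, e^{\gamma t},
\qquad
S_0(t) \;=\; \exp\Bigl\{-\tfrac{\theta}{\gamma}\bigl(e^{\gamma t} - 1\bigr)\Bigr\},
\qquad t > 0,\ \theta > 0,\ \gamma > 0
```

In the load-sharing setting of the abstract, each component failure then inflates the hazard of the surviving components according to the power-trend process.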

20.
This article considers a k-level step-stress accelerated life test (ALT) on series system products, where independent Weibull-distributed lifetimes are assumed for the components. Due to cost considerations or environmental restrictions, the causes of system failures are masked and type-I censored observations may occur in the collected data. A Bayesian approach combined with auxiliary variables is developed for estimating the parameters of the model. Further, the reliability and hazard rate functions of the system and components are estimated at a specified time under the use stress level. The proposed method is illustrated through a numerical example based on two priors and various masking probabilities.
