Similar Documents
20 similar documents retrieved (search time: 31 ms)
1.
Previously, we developed a modelling framework which classifies individuals with respect to their length of stay (LOS) in the transient states of a continuous-time Markov model with a single absorbing state; phase-type models are used for each class of the Markov model. We here add costs and obtain results for moments of total costs in (0, t] for an individual, for a cohort arriving at time zero, and when arrivals are Poisson. Based on stroke patient data from the Belfast City Hospital, we use the overall modelling framework to obtain results for total cost in a given time interval.
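The cost moments here build on the transient state-occupancy probabilities of the Markov model. As a rough numerical sketch (not the authors' phase-type machinery), the expected cost accumulated in (0, t] with per-state cost rates c is E[C(t)] = ∫₀ᵗ p(s)·c ds, where p(s) solves the Kolmogorov forward equation; the generator and cost rates below are invented toy values:

```python
import numpy as np

def expected_cost(Q, p0, c, t, steps=20000):
    """Approximate E[C(t)] = integral_0^t p(s) @ c ds by forward-Euler
    steps on the Kolmogorov forward equation dp/ds = p @ Q."""
    dt = t / steps
    p = np.array(p0, dtype=float)
    total = 0.0
    for _ in range(steps):
        total += (p @ c) * dt      # cost accrued in this time slice
        p = p + dt * (p @ Q)       # advance the state distribution
    return total

# Toy model: two transient states accruing cost at rate 1, one absorbing state.
Q = np.array([[-1.0, 1.0, 0.0],
              [0.0, -2.0, 2.0],
              [0.0, 0.0, 0.0]])
p0 = [1.0, 0.0, 0.0]
c = np.array([1.0, 1.0, 0.0])
print(expected_cost(Q, p0, c, 5.0))
```

For this chain the exact value is (1 − e⁻⁵) + (1 − e⁻⁵) − (1 − e⁻¹⁰)/2 ≈ 1.487, i.e. essentially the mean total sojourn time 1 + 1/2 since almost all mass is absorbed by t = 5.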

2.
Odile Pons 《Statistics》2013,47(4):273-293
A semi-Markov model with covariates is proposed for a multi-state process with a finite number of states such that the transition probabilities between the states and the distribution functions of the duration times between the occurrence of two states depend on a discrete covariate. The hazard rates for the time elapsed between two successive states depend on the covariate through a proportional hazards model involving a set of regression parameters, while the transition probabilities depend on the covariate in an unspecified way. We propose estimators for these parameters and for the cumulative hazard functions of the sojourn times. A difficulty comes from the fact that when a sojourn time in a state is right-censored, the next state is unknown. We prove that our estimators are consistent and asymptotically Gaussian under the model constraints.

3.
In multistate survival analysis, the sojourn of a patient through various clinical states is shown to correspond to the diffusion of 1 C of electrical charge through an electrical network. The essential comparison has differentials of probability for the patient to correspond to differentials of charge, and it equates clinical states to electrical nodes. Indeed, if the death state of the patient corresponds to the sink node of the circuit, then the transient current that would be seen on an oscilloscope as the sink output is a plot of the probability density for the survival time of the patient. This electrical circuit analogy is further explored by considering the simplest possible survival model with two clinical states, alive and dead (sink), that incorporates censoring and truncation. The sink output seen on an oscilloscope is a plot of the Kaplan–Meier mass function. Thus, the Kaplan–Meier estimator finds motivation from the dynamics of current flow, as a fundamental physical law, rather than as a nonparametric maximum likelihood estimate (MLE). Generalization to competing risks settings with multiple death states (sinks) leads to cause‐specific Kaplan–Meier submass functions as outputs at sink nodes. With covariates present, the electrical analogy provides for an intuitive understanding of partial likelihood and various baseline hazard estimates often used with the proportional hazards model.
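The quantity recovered at the sink is the ordinary Kaplan–Meier mass function. For comparison with the circuit construction, the conventional product-limit computation is short (a standard sketch, not the paper's method; data and the distinct-times assumption are illustrative):

```python
def kaplan_meier(times, events):
    """Product-limit estimate; events[i] = 1 for death, 0 for censoring.
    Assumes distinct observation times for brevity."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk, s, curve = len(times), 1.0, []
    for i in order:
        if events[i] == 1:            # a death drops the survival curve
            s *= (at_risk - 1) / at_risk
            curve.append((times[i], s))
        at_risk -= 1                  # dead or censored: leaves the risk set
    return curve

# Deaths at t = 1, 3, 4; one censoring at t = 2.
print(kaplan_meier([1, 2, 3, 4], [1, 0, 1, 1]))   # [(1, 0.75), (3, 0.375), (4, 0.0)]
```

The successive drops 1 → 0.75 → 0.375 → 0 are exactly the submasses that would appear as current pulses at the sink in the paper's analogy.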

4.
Single cohort stage‐frequency data are considered when assessing the stage reached by individuals through destructive sampling. For this type of data, when all hazard rates are assumed constant and equal, Laplace transform methods have been applied in the past to estimate the parameters in each stage‐duration distribution and the overall hazard rates. If hazard rates are not all equal, estimating stage‐duration parameters using Laplace transform methods becomes complex. In this paper, two new models are proposed to estimate stage‐dependent maturation parameters using Laplace transform methods where non‐trivial hazard rates apply. The first model encompasses hazard rates that are constant within each stage but vary between stages. The second model encompasses time‐dependent hazard rates within stages. Moreover, this paper introduces a method for estimating the hazard rate in each stage for the stage‐wise constant hazard rates model. This work presents methods that could be used in specific types of laboratory studies, but the main motivation is to explore the relationships between stage maturation parameters that, in future work, could be exploited in applying Bayesian approaches. The application of the methodology in each model is evaluated using simulated data in order to illustrate the structure of these models.

5.
A cohort of 300 women with breast cancer who were submitted to surgery is analysed using a non-homogeneous Markov process. Three states are considered: no relapse, relapse and death. As relapse times change over time, we have extended previous approaches for a time-homogeneous model to a non-homogeneous multistate process. The trends of the hazard rate functions of transitions between states increase and then decrease, showing that a changepoint can be considered. Piecewise Weibull distributions are introduced as transition intensity functions. Covariates corresponding to treatments are incorporated in the model multiplicatively via these functions. The likelihood function is built for a general model with k changepoints and applied to the data set; the parameters are estimated, and life-table and transition probabilities for treatments in different periods of time are given. The survival probability functions for different treatments are plotted and compared with the corresponding function for the homogeneous model. The survival functions for the various cohorts submitted for treatment are fitted to the empirical survival functions.

6.
We consider the competing-risks problem without making any assumption concerning the independence of the risks. Maximum-likelihood estimates of the cause-specific hazard rates are obtained under the condition that their ratio is monotone. We also consider the likelihood-ratio test for testing the proportionality of two cause-specific hazard rates against the alternative that the ratio of these two hazard rates is monotonic. This testing problem is equivalent to testing independence against likelihood-ratio dependence of the time to failure and the cause of failure in the competing-risks setup. We allow for random censoring on the right. The asymptotic null distribution of the test statistic is obtained and is found to be of the chi-bar-square type. The problem is extended to the case of more than two risks. A numerical example is given to illustrate the procedure.

7.
The model of independent competing risks provides no information for the assessment of competing failure modes if the failure mechanisms underlying these modes are coupled. Models for dependent competing risks in the literature can be distinguished on the basis of the functional behaviour of the conditional probability of failure due to a particular failure mode given that the failure time exceeds a fixed time, as a function of time. There is an interesting link between monotonicity of such conditional probability and dependence between failure time and failure mode, via crude hazard rates. In this paper, we propose tests for testing the dependence between failure time and failure mode using the crude hazards and using the conditional probabilities mentioned above. We establish the equivalence between the two approaches and provide an asymptotically efficient weight function under a sequence of local alternatives. The tests are applied to simulated data and to mortality follow-up data.

8.
This research focuses on the estimation of tumor incidence rates from long-term animal studies which incorporate interim sacrifices. A nonparametric stochastic model is described with transition rates between states corresponding to the tumor incidence rate, the overall death rate, and the death rate for tumor-free animals. Exact analytic solutions for the maximum likelihood estimators of the hazard rates are presented, and their application to data from a long-term animal study is illustrated by an example. Unlike many common methods for estimation and comparison of tumor incidence rates among treatment groups, the estimators derived in this paper require no assumptions regarding tumor lethality or treatment lethality. The small sample operating characteristics of these estimators are evaluated using Monte Carlo simulation studies.

9.
The National Cancer Institute (NCI) suggests a sudden reduction in prostate cancer mortality rates, likely due to highly successful treatments and screening methods for early diagnosis. We are interested in understanding the impact of medical breakthroughs, treatments, or interventions, on the survival experience for a population. For this purpose, estimating the underlying hazard function, with possible time change points, would be of substantial interest, as it will provide a general picture of the survival trend and when this trend is disrupted. Increasing attention has been given to testing the assumption of a constant failure rate against a failure rate that changes at a single point in time. We expand the set of alternatives to allow for the consideration of multiple change-points, and propose a model selection algorithm using sequential testing for the piecewise constant hazard model. These methods are data driven and allow us to estimate not only the number of change points in the hazard function but where those changes occur. Such an analysis allows for better understanding of how changing medical practice affects the survival experience for a patient population. We test for change points in prostate cancer mortality rates using the NCI Surveillance, Epidemiology, and End Results dataset.
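Given a fixed set of candidate change points, the estimation step for a piecewise constant hazard is the classical occurrence/exposure rate: within each interval, events divided by total time at risk. The sketch below shows only that MLE step (the paper's contribution, sequential testing to select the number of change points, is not reproduced; data are invented):

```python
def piecewise_hazard_mle(times, events, cuts):
    """Occurrence/exposure rates for a piecewise constant hazard with
    fixed cut points; events[i] = 1 for a death, 0 for censoring."""
    edges = [0.0] + list(cuts) + [float("inf")]
    rates = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        deaths = sum(1 for t, e in zip(times, events) if e == 1 and lo < t <= hi)
        exposure = sum(min(t, hi) - lo for t in times if t > lo)
        rates.append(deaths / exposure if exposure > 0 else float("nan"))
    return rates

# Four deaths, one cut at t = 2: 2 deaths / 6.0 exposure = 1/3 before the cut,
# then 2 deaths / 1.5 exposure = 4/3 after it.
print(piecewise_hazard_mle([0.5, 1.5, 2.5, 3.0], [1, 1, 1, 1], [2.0]))
```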

10.
ABSTRACT

In this article, we develop a new method, called regenerative randomization, for the transient analysis of continuous time Markov models with absorbing states. The method has the same good properties as standard randomization: numerical stability, well-controlled computation error, and ability to specify the computation error in advance. The method has a benign behavior for large t and is significantly less costly than standard randomization for large enough models and large enough t. For a class of models, class C, including typical failure/repair reliability models with exponential failure and repair time distributions and repair in every state with failed components, stronger theoretical results are available assessing the efficiency of the method in terms of “visible” model characteristics. A large example belonging to that class is used to illustrate the performance of the method and to show that it can indeed be much faster than standard randomization.
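For reference, standard randomization (uniformization), the baseline the paper improves upon, computes p(t) = Σₙ e^(−Λt)(Λt)ⁿ/n! · p₀Pⁿ with P = I + Q/Λ and Λ ≥ maxᵢ |qᵢᵢ|. A minimal sketch of that baseline, assuming at least one non-absorbing state so Λ > 0:

```python
import math
import numpy as np

def transient_probs(Q, p0, t, tol=1e-12):
    """Standard randomization: p(t) = sum_n Pois(n; Lam*t) * p0 @ P^n."""
    Q = np.asarray(Q, dtype=float)
    Lam = max(-Q.diagonal())            # uniformization rate >= every exit rate
    P = np.eye(len(Q)) + Q / Lam        # DTMC observed at Poisson(Lam) epochs
    term = np.asarray(p0, dtype=float)
    weight = math.exp(-Lam * t)         # Poisson(0; Lam*t) mass
    acc, p, n = weight, weight * term, 0
    while 1.0 - acc > tol:              # stop once the Poisson tail mass < tol
        n += 1
        term = term @ P
        weight *= Lam * t / n
        acc += weight
        p = p + weight * term
    return p

# Single exponential(1) sojourn before absorption: p(t) = (exp(-t), 1 - exp(-t)).
print(transient_probs([[-1.0, 1.0], [0.0, 0.0]], [1.0, 0.0], 1.0))
```

The truncation point grows roughly like Λt, which is exactly the cost for large t that regenerative randomization is designed to reduce.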

11.
Xiong Cai  Yiying Zhang 《Statistics》2017,51(3):615-626
In this paper, we compare the hazard rate functions of the second-order statistics arising from two sets of independent multiple-outlier proportional hazard rates (PHR) samples. It is proved that the submajorization order between the sample size vectors together with the supermajorization order between the hazard rate vectors imply the hazard rate ordering between the corresponding second-order statistics from multiple-outlier PHR random variables. The results established here provide theoretical guidance both for the winner's price for the bid in the second-price reverse auction in auction theory and fail-safe system design in reliability. Some numerical examples are also provided for illustration.

12.
The hazard function plays an important role in cancer patient survival studies, as it quantifies the instantaneous risk of death of a patient at any given time. Often in cancer clinical trials, unimodal hazard functions are observed, and it is of interest to detect (estimate) the turning point (mode) of the hazard function, as this may be an important measure in treatment strategies for cancer patients. Moreover, when patient cure is a possibility, estimating cure rates at different stages of cancer, in addition to their proportions, may provide a better summary of the effects of stages on survival rates. Therefore, the main objective of this paper is to consider the problem of estimating the mode of the hazard function of patients at different stages of cervical cancer in the presence of long-term survivors. To this end, a mixture cure rate model is proposed using the log-logistic distribution. The model is conveniently parameterized through the mode of the hazard function, in which cancer stages can affect both the cured fraction and the mode. In addition, we discuss aspects of model inference through the maximum likelihood estimation method. A Monte Carlo simulation study assesses the coverage probability of asymptotic confidence intervals.
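For the standard log-logistic hazard h(t) = (β/α)(t/α)^(β−1) / (1 + (t/α)^β), the turning point has a closed form when β > 1: t* = α(β − 1)^(1/β); this is the kind of mode the model is parameterized through (the paper's cure-rate parameterization may differ in detail). A quick numerical check of the formula:

```python
import numpy as np

def loglogistic_hazard(t, alpha, beta):
    """h(t) = (beta/alpha) * (t/alpha)**(beta-1) / (1 + (t/alpha)**beta)."""
    return (beta / alpha) * (t / alpha) ** (beta - 1) / (1.0 + (t / alpha) ** beta)

def hazard_mode(alpha, beta):
    """Turning point of the unimodal hazard; valid for beta > 1."""
    return alpha * (beta - 1) ** (1.0 / beta)

alpha, beta = 2.0, 3.0
grid = np.linspace(1e-6, 10.0, 200001)
numeric_mode = grid[np.argmax(loglogistic_hazard(grid, alpha, beta))]
print(numeric_mode, hazard_mode(alpha, beta))   # both close to 2 * 2**(1/3)
```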

13.
Abstract.  Hazard rate estimation is an alternative to density estimation for positive variables that is of interest when variables are times to event. In particular, it is here shown that hazard rate estimation is useful for seismic hazard assessment. This paper suggests a simple, but flexible, Bayesian method for non-parametric hazard rate estimation, based on building the prior hazard rate as the convolution mixture of a Gaussian kernel with an exponential jump-size compound Poisson process. Conditions are given for a compound Poisson process prior to be well-defined and to select smooth hazard rates, an elicitation procedure is devised to assign a constant prior expected hazard rate while controlling prior variability, and a Markov chain Monte Carlo approximation of the posterior distribution is obtained. Finally, the suggested method is validated in a simulation study, and some Italian seismic event data are analysed.

14.
The Weibull, log-logistic and log-normal distributions are extensively used to model time-to-event data. The Weibull family accommodates only monotone hazard rates, whereas the log-logistic and log-normal are widely used to model unimodal hazard functions. The increasing availability of lifetime data with a wide range of characteristics motivates us to develop more flexible models that accommodate both monotone and nonmonotone hazard functions. One such model is the exponentiated Weibull distribution, which not only accommodates monotone hazard functions but also allows for unimodal and bathtub shape hazard rates. This distribution has demonstrated considerable potential in univariate analysis of time-to-event data. However, the primary focus of many studies is rather on understanding the relationship between the time to the occurrence of an event and one or more covariates. This leads to a consideration of regression models that can be formulated in different ways in survival analysis. One such strategy involves formulating models for the accelerated failure time family of distributions. The most commonly used distributions serving this purpose are the Weibull, log-logistic and log-normal distributions. In this study, we show that the exponentiated Weibull distribution is closed under the accelerated failure time family. We then formulate a regression model based on the exponentiated Weibull distribution, and develop large sample theory for statistical inference. We also describe a Bayesian approach for inference. Two comparative studies based on real and simulated data sets reveal that the exponentiated Weibull regression can be valuable in adequately describing different types of time-to-event data.
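One sanity check that can be coded directly: with cdf F(t) = [1 − exp(−(t/α)^k)]^θ, setting θ = 1 must collapse the exponentiated Weibull hazard to the plain Weibull hazard (k/α)(t/α)^(k−1). A sketch under that standard parameterization (parameter names here are generic, not necessarily the paper's notation):

```python
import math

def ew_hazard(t, alpha, k, theta):
    """Hazard of the exponentiated Weibull: F(t) = (1 - exp(-(t/alpha)**k))**theta."""
    G = 1.0 - math.exp(-((t / alpha) ** k))              # base Weibull cdf
    base_pdf = (1.0 - G) * (k / alpha) * (t / alpha) ** (k - 1)
    f = theta * G ** (theta - 1) * base_pdf              # density of F = G**theta
    return f / (1.0 - G ** theta)

# With theta = 1 this equals the Weibull hazard (k/alpha) * (t/alpha)**(k-1).
print(ew_hazard(2.0, 1.5, 2.0, 1.0), (2.0 / 1.5) * (2.0 / 1.5))
```

Varying θ away from 1 is what buys the unimodal and bathtub shapes the abstract mentions.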

15.
In this paper, a class of tests is developed for comparing the cause-specific hazard rates of m competing risks simultaneously in K (≥ 2) groups. The data available for a unit are the failure time of the unit along with the identifier of the risk claiming the failure. In practice, the failure time data are generally right censored. The tests are based on the difference between the weighted averages of the cause-specific hazard rates corresponding to each risk. No assumption regarding the dependence of the competing risks is made. It is shown that the proposed test statistic has asymptotically chi-squared distribution. The proposed test is shown to be optimal for a specific type of local alternatives. The choice of weight function is also discussed. A simulation study is carried out using multivariate Gumbel distribution to compare the optimal weight function with a proposed weight function which is to be used in practice. Also, the proposed test is applied to real data on the termination of an intrauterine device. An erratum to this article can be found at

16.
The phenomenon of crossing hazard rates is common in clinical trials with time to event endpoints. Many methods have been proposed for testing equality of hazard functions against a crossing hazards alternative. However, there have been relatively few approaches available in the literature for point or interval estimation of the crossing time point. The problem of constructing confidence intervals for the first crossing time point of two hazard functions is considered in this paper. After reviewing a recent procedure based on Cox proportional hazard modeling with Box-Cox transformation of the time to event, a nonparametric procedure using the kernel smoothing estimate of the hazard ratio is proposed. The proposed procedure and the one based on Cox proportional hazard modeling with Box-Cox transformation of the time to event are both evaluated by Monte Carlo simulations and applied to two clinical trial datasets.

17.
Models for monotone trends in hazard rates for grouped survival data in stratified populations are introduced, and simple closed form score statistics for testing the significance of these trends are presented. The test statistics for some of the models under study are shown to be independent of the assumed form of the function which relates the hazard rates to the sets of monotone scores assigned to the time intervals. The procedure is applied to test monotone trends in the recovery rates of erythematous response among skin cancer patients and controls that have been irradiated with an ultraviolet challenge.

18.
In this article, a stock-forecasting model is developed to analyze the stock price variation of the Taiwanese company HTC. The main difference from previous articles is that this study uses ten years of recent HTC data to build a Markov transition matrix. Instead of trying to predict the stock price variation through the traditional approach to the HTC stock problem, we integrate two types of Markov chain that are used in different ways. One is a regular Markov chain, and the other is an absorbing Markov chain. Through a regular Markov chain, we can efficiently obtain important information such as what happens in the long run or whether the distribution of the states tends to stabilize over time. Next, we use an artificial-variable technique to create an absorbing Markov chain. The absorbing Markov chain then provides information about the period of increases before the HTC stock arrives at the decreasing state. We thus provide investors with information on how long the HTC stock will keep increasing before its price begins to fall, which is extremely important information to them.
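The absorbing-chain computation the abstract alludes to is the standard fundamental-matrix calculation: with transient-to-transient block Q, the matrix N = (I − Q)⁻¹ gives expected visit counts, and N·1 gives expected steps before absorption. A toy sketch with invented transition probabilities (not the HTC estimates):

```python
import numpy as np

# Transient states {up, flat}; "down" is made absorbing.  The probabilities
# below are illustrative only, not estimates from the HTC data.
Q = np.array([[0.6, 0.2],    # up   -> (up, flat); remaining 0.2 goes to "down"
              [0.3, 0.4]])   # flat -> (up, flat); remaining 0.3 goes to "down"
N = np.linalg.inv(np.eye(2) - Q)   # fundamental matrix: expected visit counts
steps = N @ np.ones(2)             # expected steps before hitting "down"
print(steps)                       # about [4.44, 3.89]
```

Read off this way, a stock currently in the "up" state is expected to spend about 4.4 periods among the non-decreasing states before its first fall, which is the kind of holding-time summary the article offers investors.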

19.
Summary For technological applications it can be useful to identify some simple physical mechanisms, which, on the basis of the available knowledge of the production process, may suggest the most appropriate approach to statistical control of the random quantities of interest. For this purpose the notion of rupture point is introduced firstly. A rupture point is characterized by m randomly arising out-of-control states, assumed to be mutually exclusive and stochastically independent. Shewhart's control charts seem to represent the natural statistical tool for controlling a rupture point; however it is shown that they are fully justified only when the hazard rates attached to the causes of failure are constant. Otherwise, typically in the presence of time-increasing hazard rates, Shewhart's control charts should be completed by a preventive intervention rule (preventive maintenance). In the second place, the notion of dynamic instability point is introduced, which is specifically characterized by assuming that the random quantity of interest is ruled by a stochastic differential equation with constant coefficients. By discretization, developed according to a possibly new approach, it is shown that the former model reduces to an equation error model, which is among the simplest used in adaptive control, and thus particularly easy to deal with in regard to parameter estimation and the definition of the optimum control rule.

20.
The aim of this paper is to show the flexibility and capacity of penalized spline smoothing as an estimation routine for modelling duration time data. We analyse the unemployment behaviour in Germany between 2000 and 2004 using a massive database from the German Federal Employment Agency. To investigate dynamic covariate effects and differences between competing job markets depending on the distance between former and recent working place, a functional duration time model with competing risks is used. It is built upon a competing hazard function where some of the smooth covariate effects are allowed to vary with unemployment duration. The focus of our analysis is on contrasting the spatial, economic and individual covariate effects of the competing job markets and on analysing their general influence on the unemployed's re-employment probabilities. As a result of our analyses, we reveal differences concerning gender, age and education. We also discover an effect between the newly formed and the old West German states. Moreover, the spatial pattern between the considered job markets differs.
