Similar articles
20 similar articles found (search time: 31 ms)
1.
Bayesian and likelihood approaches to on-line detection of change points in time series are discussed and applied to the analysis of biomedical data. Using a linear dynamic model, the Bayesian analysis outputs the conditional posterior probability of a change at time t − 1, given the data up to time t and the status of changes that occurred before time t − 1. The likelihood method is based on a change-point regression model and tests the null hypothesis that there is no change-point.
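As a rough illustration of the Bayesian side of this idea, the sketch below computes P(change at t − 1 | data up to t) under a deliberately simplified Gaussian level-shift model with conjugate priors; the model, prior settings, and function names are assumptions made for illustration, not the authors' dynamic linear model.

```python
import numpy as np

def gaussian_block_logml(y, sigma=1.0, tau=10.0):
    """Log marginal likelihood of a block y_i ~ N(mu, sigma^2), mu ~ N(0, tau^2),
    with mu integrated out (closed form)."""
    y = np.asarray(y, dtype=float)
    n, s1, s2 = len(y), y.sum(), (y ** 2).sum()
    return (-0.5 * n * np.log(2 * np.pi * sigma**2)
            - 0.5 * np.log(1 + n * tau**2 / sigma**2)
            - s2 / (2 * sigma**2)
            + tau**2 * s1**2 / (2 * sigma**2 * (sigma**2 + n * tau**2)))

def change_prob_at_t_minus_1(y_upto_t, sigma=1.0, tau=10.0, prior_change=0.05):
    """P(change at t-1 | data up to t): 'one common level' vs 'new level from t-1',
    assuming no earlier change (illustrative simplification)."""
    y = np.asarray(y_upto_t, dtype=float)
    log_m0 = gaussian_block_logml(y, sigma, tau)                # no change
    log_m1 = (gaussian_block_logml(y[:-2], sigma, tau)          # old level up to t-2
              + gaussian_block_logml(y[-2:], sigma, tau))       # new level from t-1
    log_odds = np.log(prior_change) - np.log(1 - prior_change) + log_m1 - log_m0
    return 1.0 / (1.0 + np.exp(-log_odds))

rng = np.random.default_rng(1)
y = np.concatenate([rng.normal(0, 1, 30), rng.normal(3, 1, 10)])  # level shift after 30 points
print([round(change_prob_at_t_minus_1(y[:t]), 3) for t in range(25, 41, 5)])
```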

2.
Often the dependence in multivariate survival data is modeled through an individual-level effect called the frailty. Due to its mathematical simplicity, the gamma distribution is often used as the frailty distribution in hazard modeling. However, it is well known that the gamma frailty distribution has many drawbacks; for example, it weakens the effect of covariates. In addition, in the presence of a multilevel model, the overall frailty comes from several levels. To overcome such drawbacks, more heavy-tailed distributions are needed for the frailty in order to incorporate extra variability. In this article, we develop a class of log-skew-t distributions for the frailty. This class includes the log-normal distribution along with many other heavy-tailed distributions, such as the log-Cauchy and log-t, as special cases.

Conditional on the frailty, the survival times are assumed to be independent with a proportional hazards structure. The modeling process is then completed by assuming multilevel frailty effects. Instead of imposing a strict parameterization of the baseline hazard function, we consider the partial likelihood approach and thus leave the baseline function unspecified. By eliminating the baseline hazard, its pre-specification and the associated computation are simplified considerably.
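A minimal simulation sketch of the kind of data this model targets: clustered survival times generated under proportional hazards with a shared log-t frailty (one heavy-tailed member of the log-skew-t class, with the skewness set to zero), compared with the classical gamma frailty. All parameter values and names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate_cluster_data(n_clusters=200, cluster_size=4, beta=0.5,
                          frailty="log_t", df=3, sd=0.5):
    """Clustered survival times with conditional hazard  w_i * exp(beta * x)
    (unit exponential baseline); the frailty w_i is shared within a cluster."""
    rows = []
    for i in range(n_clusters):
        if frailty == "gamma":                        # classical, light-tailed choice
            w = rng.gamma(shape=2.0, scale=0.5)
        else:                                         # log-t: much heavier right tail
            w = np.exp(sd * rng.standard_t(df))
        x = rng.binomial(1, 0.5, size=cluster_size)   # a binary covariate
        rate = w * np.exp(beta * x)
        t = rng.exponential(1.0 / rate)               # event times given the frailty
        c = rng.exponential(2.0, size=cluster_size)   # independent censoring
        rows.append(np.column_stack([np.full(cluster_size, i),
                                     np.minimum(t, c), (t <= c).astype(int), x]))
    return np.vstack(rows)                            # columns: cluster, time, event, x

for kind in ("gamma", "log_t"):
    dat = simulate_cluster_data(frailty=kind)
    print(kind, "- event rate:", round(dat[:, 2].mean(), 3),
          " variance of observed times:", round(dat[:, 1].var(), 3))
```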

3.
The hazard function plays an important role in survival analysis and reliability, since it quantifies the instantaneous failure rate of an individual at a given time point t, given that this individual has not failed before t. In some applications, abrupt changes in the hazard function are observed, and it is of interest to detect the location of such a change. In this paper, we consider testing for the existence of a change in the parameters of an exponential regression model, based on a sample of right-censored survival times and the corresponding covariates. Likelihood ratio type tests are proposed and non-asymptotic bounds for the type II error probability are obtained. When the tests lead to acceptance of a change, estimators for the location of the change are proposed. Non-asymptotic upper bounds on the underestimation and overestimation probabilities are obtained. A short simulation study illustrates these results.
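The sketch below illustrates the structure of such a likelihood-ratio scan in the simplest censored exponential case without covariates; the regression setting and the non-asymptotic bounds of the paper are not reproduced, and all names and values are illustrative.

```python
import numpy as np

def exp_loglik(times, events):
    """Maximised exponential log-likelihood under right censoring:
    lambda_hat = d / T,  loglik = d * log(d / T) - d."""
    d, T = events.sum(), times.sum()
    return -np.inf if T == 0 else (0.0 if d == 0 else d * np.log(d / T) - d)

def lr_change_scan(times, events, order):
    """Scan candidate change locations along the given ordering (e.g. calendar
    order of enrolment) and return the maximal likelihood-ratio statistic."""
    t, e = times[order], events[order]
    ll0 = exp_loglik(t, e)                              # no-change fit
    best, n = -np.inf, len(t)
    for k in range(5, n - 5):                           # keep segments non-trivial
        ll1 = exp_loglik(t[:k], e[:k]) + exp_loglik(t[k:], e[k:])
        best = max(best, 2 * (ll1 - ll0))
    return best

rng = np.random.default_rng(0)
n = 200
true_t = np.concatenate([rng.exponential(1.0, 100), rng.exponential(0.4, 100)])
cens = rng.exponential(3.0, n)
times, events = np.minimum(true_t, cens), (true_t <= cens).astype(int)
print("LR statistic:", round(lr_change_scan(times, events, np.arange(n)), 2))
```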

4.
5.
Previous research on prostate cancer survival trends in the United States National Cancer Institute's Surveillance, Epidemiology, and End Results database has indicated a potential change-point in the age of diagnosis of prostate cancer around age 50. Identifying a change-point value in prostate cancer survival and cure could have important policy and health care management implications. Statistical analysis of these data has to address two complicating features: (1) change-point models are not smooth functions and so present computational and theoretical difficulties; and (2) models for prostate cancer survival need to account for the fact that many men diagnosed with prostate cancer can be effectively cured of their disease with early treatment. We develop a cure survival model that allows for change-point effects in covariates in order to investigate a potential change-point in the age of diagnosis of prostate cancer. Our results do not indicate that age under 50 is associated with an increased hazard of death from prostate cancer.
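A hedged sketch of a mixture cure likelihood with a hinge (change-point) term in age at 50, using an exponential latency distribution as a stand-in for the authors' cure survival model; the parameterisation, function names, and simulated data are purely illustrative.

```python
import numpy as np
from scipy.optimize import minimize

def neg_loglik(par, t, delta, age):
    """Mixture cure model: pi(age) = P(susceptible); uncured subjects never fail.
    pi includes a hinge term (age - 50)_- so the slope may change below age 50."""
    g0, g1, g2, loglam = par
    hinge = np.minimum(age - 50.0, 0.0)                 # active only for age < 50
    pi = 1.0 / (1.0 + np.exp(-(g0 + g1 * (age - 50.0) + g2 * hinge)))
    lam = np.exp(loglam)
    f_u = lam * np.exp(-lam * t)                        # latency density (event)
    S_u = np.exp(-lam * t)                              # latency survival (censored)
    lik = np.where(delta == 1, pi * f_u, (1 - pi) + pi * S_u)
    return -np.sum(np.log(np.clip(lik, 1e-300, None)))

rng = np.random.default_rng(3)
n = 2000
age = rng.uniform(40, 80, n)
pi_true = 1 / (1 + np.exp(-(-1.0 + 0.03 * (age - 50))))   # no true hinge effect
susceptible = rng.binomial(1, pi_true)
latent = np.where(susceptible == 1, rng.exponential(5.0, n), np.inf)
cens = rng.uniform(0, 15, n)
t, delta = np.minimum(latent, cens), (latent <= cens).astype(int)
fit = minimize(neg_loglik, x0=[0.0, 0.0, 0.0, -1.0], args=(t, delta, age))
print("estimated hinge coefficient (age < 50 effect):", round(fit.x[2], 3))
```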

6.
We propose a semiparametric approach to detect the existence and estimate the location of a statistical change-point in a nonlinear multivariate time series contaminated with an additive noise component. In particular, we consider a p-dimensional stochastic process of independent multivariate normal observations in which the mean function varies smoothly except at a single change-point. Our approach involves conducting a Bayesian analysis on the empirical detail coefficients of the original time series after a wavelet transform. If the mean function of our time series can be expressed as a multivariate step function, we find our Bayesian-wavelet method performs comparably to classical parametric methods such as maximum likelihood estimation. The advantage of our multivariate change-point method is that it applies to a much larger class of mean functions requiring only general smoothness conditions.
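As a toy version of the idea, the sketch below takes level-1 Haar detail coefficients of a univariate series and computes, for each coefficient, the posterior probability that its mean is non-zero under a simple spike-and-slab prior; the multivariate wavelet machinery of the paper is not reproduced, and all prior settings are assumptions.

```python
import numpy as np

def haar_details(y):
    """Level-1 Haar detail coefficients d_k = (y_{2k} - y_{2k+1}) / sqrt(2).
    A mean jump between positions 2k and 2k+1 shows up as a large |d_k|
    (a jump aligned with a pair boundary needs coarser levels)."""
    y = np.asarray(y, dtype=float)
    y = y[: len(y) // 2 * 2]
    return (y[0::2] - y[1::2]) / np.sqrt(2.0)

def posterior_nonzero(d, sigma=1.0, tau=5.0, prior=0.05):
    """P(coefficient mean != 0 | d) for d ~ N(theta, sigma^2), where theta = 0
    with probability 1 - prior and theta ~ N(0, tau^2) with probability prior."""
    log_bf = (0.5 * np.log(sigma**2 / (sigma**2 + tau**2))
              + d**2 * tau**2 / (2 * sigma**2 * (sigma**2 + tau**2)))
    log_odds = np.log(prior / (1 - prior)) + log_bf
    return 1 / (1 + np.exp(-log_odds))

rng = np.random.default_rng(5)
y = np.concatenate([rng.normal(0, 1, 63), rng.normal(6, 1, 65)])   # jump at index 63
p = posterior_nonzero(haar_details(y))
print("most suspicious coefficient:", int(np.argmax(p)),
      "posterior probability:", round(float(p.max()), 3))
```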

7.
In the development of many diseases there are often associated variables which continuously measure the progress of an individual towards the final expression of the disease (failure). Such variables are stochastic processes, here called marker processes, and, at a given point in time, they may provide information about the current hazard and hence about the remaining time to failure. Here we consider a simple additive model for the relationship between the hazard function at time t and the history of the marker process up until time t, and develop some basic calculations based on this model. Interest is focused on statistical applications of markers related to estimation of the survival distribution of time to failure, including (i) the use of markers as surrogate responses for failure with censored data, and (ii) the use of markers as predictors of the time elapsed since onset of a survival process in prevalent individuals. Particular attention is directed to potential gains in efficiency achieved by using marker process information.
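For the simple additive form λ(t | history) = λ0 + θZ(t), the conditional survival given a known marker path is S(t) = exp(−λ0 t − θ∫₀ᵗ Z(u) du). The sketch below evaluates this by numerical integration; the parameter values and marker paths are illustrative only.

```python
import numpy as np

def survival_given_marker(t_grid, z_path, lam0=0.05, theta=0.02):
    """Additive model: hazard(t) = lam0 + theta * Z(t), with Z known on a grid.
    Returns S(t) = exp(-lam0 * t - theta * integral_0^t Z(u) du) on the grid."""
    cum_z = np.concatenate([[0.0],
                            np.cumsum(0.5 * (z_path[1:] + z_path[:-1])
                                      * np.diff(t_grid))])   # trapezoid rule
    return np.exp(-lam0 * t_grid - theta * cum_z)

t = np.linspace(0, 10, 101)
z_slow = 1.0 + 0.2 * t            # marker rising slowly
z_fast = 1.0 + 2.0 * t            # marker rising fast -> higher hazard, lower survival
print("S(10), slow marker:", round(survival_given_marker(t, z_slow)[-1], 3))
print("S(10), fast marker:", round(survival_given_marker(t, z_fast)[-1], 3))
```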

8.
Suppose the probability model for failure time data, subject to censoring, is specified by the hazard function λ(t)exp(βᵀx), where x is a vector of covariates. Analytical difficulties involved in finding the optimal design are avoided by assuming that λ is completely specified and by using D-optimality based on the information matrix for β. Optimal designs are found to depend on β, but some results of practical consequence are obtained. It is found that censoring does not affect the choice of design appreciably when βᵀx ≥ 0 for all points of the feasible region, but may have an appreciable effect when βᵢxᵢ ≤ 0 for all i and all points in the feasible experimental region. The nature of the effect is discussed in detail for the cases of one and two parameters. It is argued that in practical biomedical situations the optimal design is almost always the same as for uncensored data.
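A small numerical illustration of the D-optimality comparison for the special case of exponential lifetimes with a common censoring time c, where the per-subject Fisher information for β reduces to (1 − exp(−μᵢc)) xᵢxᵢᵀ with μᵢ = λ₀exp(βᵀxᵢ); the candidate designs and parameter values are made up.

```python
import numpy as np

def info_matrix(X, beta, lam0=1.0, c=2.0):
    """Fisher information for beta with hazard lam0 * exp(beta'x) (exponential
    lifetimes) and fixed censoring at c:
    I(beta) = sum_i (1 - exp(-mu_i * c)) * x_i x_i',  mu_i = lam0 * exp(beta'x_i)."""
    X = np.atleast_2d(X).astype(float)
    mu = lam0 * np.exp(X @ beta)
    w = 1.0 - np.exp(-mu * c)             # probability of observing a failure
    return (X * w[:, None]).T @ X

def d_criterion(X, beta):
    return np.linalg.det(info_matrix(X, beta))

beta = np.array([0.7])                                   # design depends on assumed beta
design_a = np.repeat([[-1.0], [1.0]], 10, axis=0)        # ten runs at -1, ten at +1
design_b = np.repeat([[0.0], [1.0]], 10, axis=0)         # ten runs at 0, ten at +1
print("det I, design A:", round(d_criterion(design_a, beta), 3))
print("det I, design B:", round(d_criterion(design_b, beta), 3))
```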

9.
In this article, we consider a new regression model for counting processes under a proportional hazards assumption. This model is motivated by the need to understand the evolution of the booking process of a railway company. The main novelty of the approach consists in assuming that the baseline hazard function is piecewise constant, with unknown times of jump (these jump times are estimated from the data as model parameters). Hence, the parameters of the model can be separated into two types: parameters that measure the influence of the covariates, and parameters of a multiple change-point model for the baseline. Cox's semiparametric regression can be seen as a limit case of our model. We develop an iterative procedure to estimate the different parameters, and a test procedure that allows change-point detection in the baseline. Our technique is supported by simulation studies and a real data analysis, which show that our model can be a reasonable alternative to Cox's regression model, particularly in the presence of tied event times.
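The sketch below writes down the log-likelihood of a proportional hazards model with a piecewise-constant baseline for a fixed set of candidate jump times; in the article the jump times themselves are model parameters, which would add an outer search over their locations. Names and values are illustrative.

```python
import numpy as np

def pch_loglik(alphas, beta, cuts, time, event, x):
    """Proportional hazards with piecewise-constant baseline:
    lambda(t | x) = alpha_j * exp(beta * x) on (cuts[j], cuts[j+1]].
    'cuts' must start at 0 and end beyond max(time)."""
    alphas = np.asarray(alphas, dtype=float)
    lp = beta * x
    lower, upper = cuts[:-1], cuts[1:]
    expo = np.clip(time[:, None], lower, upper) - lower    # exposure in each interval
    cumhaz = np.exp(lp) * (expo @ alphas)                  # cumulative hazard per subject
    j = np.searchsorted(cuts, time, side="left") - 1       # interval containing the time
    return np.sum(event * (np.log(alphas[j]) + lp) - cumhaz)

rng = np.random.default_rng(2)
n = 500
x = rng.binomial(1, 0.5, n)
t_true = rng.exponential(1.0 / np.exp(0.8 * x))            # true baseline rate is 1
c = rng.uniform(0, 3, n)
time, event = np.minimum(t_true, c), (t_true <= c).astype(int)
cuts = np.array([0.0, 0.5, 1.5, time.max() + 1e-9])        # candidate jump times
print("log-likelihood at alphas = (1, 1, 1), beta = 0.8:",
      round(pch_loglik([1.0, 1.0, 1.0], 0.8, cuts, time, event, x), 1))
```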

10.
For time-to-event outcomes, current methods for sample size determination are based on the proportional hazards model. However, if the proportionality assumption fails to capture the relationship between the event time and the covariates, the proportional hazards model is not suitable for analyzing survival data. The accelerated failure time model is an alternative for dealing with survival data. In this article, we address the design of a multi-regional phase III clinical trial in which the relationship between the event time and the treatment effect follows the accelerated failure time model. The log-rank test is employed to deal with heterogeneous effect sizes among regions. The test statistic for the overall treatment effect is used to determine the total sample size for the multi-regional trial, and the consistent trend is used to rationalize the partition of the sample size across regions.

11.
The common choices of frailty distribution in lifetime data models include the Gamma and Inverse Gaussian distributions. We present diagnostic plots for these distributions when frailty operates in a proportional hazards framework. Firstly, we present plots based on the form of the unconditional survival function when the baseline hazard is assumed to be Weibull. Secondly, we base a plot on a closure property that applies for any baseline hazard, namely, that the frailty distribution among survivors at time t has the same form as the original distribution, with the same shape parameter but a different scale parameter. We estimate the shape parameter at different values of t and examine whether it is constant, that is, whether the plotted values form a straight line parallel to the time axis. We provide simulation results assuming a Weibull baseline hazard and an example to illustrate the methods.
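A quick simulation check of the closure property in the gamma case: among survivors at time t the frailty is again gamma with the same shape, so the moment-based shape estimate mean²/variance should stay roughly constant in t. The Weibull baseline and all parameter values below are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(11)
n, k, theta = 200_000, 2.0, 0.5                      # gamma frailty: shape k, scale theta
shape_wb, scale_wb = 1.5, 1.0                        # Weibull baseline hazard

w = rng.gamma(k, theta, n)                           # frailties
u = rng.uniform(size=n)
# Draw T | w by inverting S(t | w) = exp(-w * (t / scale_wb)**shape_wb)
t = scale_wb * (-np.log(u) / w) ** (1.0 / shape_wb)

for t0 in (0.5, 1.0, 2.0, 4.0):
    w_alive = w[t > t0]                              # frailties among survivors at t0
    shape_hat = w_alive.mean() ** 2 / w_alive.var()  # moment-based shape estimate
    print(f"t = {t0}:  estimated shape = {shape_hat:.3f}  (true shape = {k})")
```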

12.
This article extends a random preventive maintenance scheme, called the repair alert model, to the case where environmental variables affect system lifetimes. It can be used to implement age-dependent maintenance policies on engineering devices. In other words, consider a device that works on a job and is subject to failure at a random time X, while the maintenance crew can avoid the failure by a possible replacement at some random time Z. The new model is flexible enough to include covariates with both fixed and random effects. The problem of estimating the parameters is also investigated in detail. Here, the observations are in the form of random signs censoring data (RSCD) with covariates. This article therefore generalizes the statistical inferences derived in the earlier literature for RSCD without covariates. To do this, it is assumed that the system lifetime distribution belongs to the log-location-scale family of distributions. A real dataset is also analyzed on the basis of the results obtained.

13.
A regression model with a possible structural change and with a small number of measurements is considered. A priori information about the shape of the regression function is used to formulate the model as a linear regression model with inequality constraints, and a likelihood ratio test for the presence of a change-point is constructed. The exact null distribution of the test statistic is given. Consistency of the test is proved when the noise level goes to zero. Numerical approximations to the powers against various alternatives are given and compared with the powers of the k-linear-r-ahead recursive residuals tests and CUSUM tests. The performance of four different estimators of the change-point is studied in a Monte Carlo experiment. An application of the procedures to some real data is also presented.

14.
Satten et al. [Satten, G. A., Datta, S., Robins, J. M. (2001). Estimating the marginal survival function in the presence of time dependent covariates. Statist. Probab. Lett. 54: 397–403] proposed an estimator [denoted by Ŝ(t)] of the survival function of failure times that belongs to the class of survival function estimators proposed by Robins [Robins, J. M. (1993). Information recovery and bias adjustment in proportional hazards regression analysis of randomized trials using surrogate markers. In: Proceedings of the American Statistical Association, Biopharmaceutical Section. Alexandria, VA: ASA, pp. 24–33]. The estimator is appropriate when data are subject to dependent censoring. In this article, it is demonstrated that the estimator Ŝ(t) can be extended to estimate the survival function when data are subject to dependent censoring and left truncation. In addition, we propose an alternative estimator of the survival function [denoted by Ŝw(t)] that is represented as an inverse-probability-of-censoring weighted average in the sense of Satten and Datta [Satten, G. A., Datta, S. (2001). The Kaplan–Meier estimator as an inverse-probability-of-censoring weighted average. Amer. Statist. 55: 207–210]. Simulation results show that when truncation is not severe the mean squared error of Ŝ(t) is smaller than that of Ŝw(t), except when censoring is light. However, when truncation is severe, Ŝw(t) has the advantage of less bias and the situation can be reversed.
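A sketch of the inverse-probability-of-censoring-weighted representation Ŝw(t) = 1 − n⁻¹ Σᵢ δᵢ 1{Tᵢ ≤ t} / K̂(Tᵢ−), with K̂ the Kaplan–Meier estimator of the censoring distribution, in the independent-censoring case only; the dependent-censoring and left-truncation extensions of the article are not reproduced, and all names are illustrative.

```python
import numpy as np

def km(time, event):
    """Kaplan-Meier estimator; returns a step-function evaluator S(t)."""
    order = np.argsort(time)
    t, d = time[order], event[order].astype(float)
    at_risk = np.arange(len(t), 0, -1)
    surv = np.cumprod(1.0 - d / at_risk)
    def S(x):
        idx = np.searchsorted(t, x, side="right") - 1
        return np.where(idx < 0, 1.0, surv[np.clip(idx, 0, None)])
    return S

def ipcw_survival(time, event, t_eval):
    """S_w(t) = 1 - (1/n) * sum_i delta_i * 1{T_i <= t} / K(T_i-),
    with K the Kaplan-Meier estimator of the censoring distribution."""
    K = km(time, 1 - event)                 # treat censorings as the 'events' of K
    k_at_minus = K(time - 1e-10)            # left limit K(T_i-)
    w = np.where(event == 1, 1.0 / k_at_minus, 0.0)
    return np.array([1.0 - np.mean(w * (time <= s)) for s in t_eval])

rng = np.random.default_rng(4)
n = 1000
t_true, c = rng.exponential(1.0, n), rng.exponential(2.0, n)
time, event = np.minimum(t_true, c), (t_true <= c).astype(int)
for s in (0.5, 1.0, 2.0):
    print(f"t = {s}: IPCW estimate {ipcw_survival(time, event, [s])[0]:.3f}"
          f"  true {np.exp(-s):.3f}")
```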

15.
In this article, we develop a Bayesian variable selection method for choosing covariates in the Poisson change-point regression model with both discrete and continuous candidate covariates. Ranging from a null model with no selected covariates to a full model including all covariates, the method searches the entire model space, estimates posterior inclusion probabilities of the covariates, and obtains model-averaged estimates of the covariate coefficients, while simultaneously estimating a time-varying baseline rate driven by the change-points. For posterior computation, a Metropolis–Hastings within partially collapsed Gibbs sampler is developed to efficiently fit the Poisson change-point regression model with variable selection. We illustrate the proposed method using simulated and real datasets.

16.
In some clinical, environmental, or economic studies, researchers are interested in a semi-continuous outcome variable that takes the value zero with a discrete probability and has a continuous distribution for the non-zero values. Due to the measuring mechanism, it is not always possible to fully observe some outcomes, and only an upper bound is recorded. We call such data left-censored and observe only the maximum of the outcome and an independent censoring variable, together with a censoring indicator. In this article, we introduce a mixture semi-parametric regression model. We consider a parametric model to investigate the influence of covariates on the discrete probability of the value zero. For the non-zero part of the outcome, a semi-parametric Cox regression model is used to study the conditional hazard function. The different parameters in this mixture model are estimated by a likelihood method, in which the infinite-dimensional baseline hazard function is estimated by a step function. We establish the identifiability of the model and the consistency of the estimators of the different parameters. We study the finite-sample behaviour of the estimators through a simulation study and illustrate the model on a practical data example.

17.
We propose a test for state dependence in binary panel data with individual covariates. To this end, we rely on a quadratic exponential model in which the association between the response variables is accounted for differently than in more standard formulations. The level of association is measured by a single parameter that may be estimated by a Conditional Maximum Likelihood (CML) approach. Under the dynamic logit model, the conditional estimator of this parameter converges to zero when the hypothesis of no state dependence is true. Therefore, it is possible to implement a t-test for this hypothesis which is very simple to perform and attains the nominal significance level under several structures of the individual covariates. Through an extensive simulation study, we find that our test has good finite-sample properties and is more robust to the presence of (autocorrelated) covariates in the model specification than other existing testing procedures for state dependence. The proposed approach is illustrated by two empirical applications: the first is based on data from the Panel Study of Income Dynamics and concerns employment and fertility; the second is based on the Health and Retirement Study and concerns self-reported health status.

18.
In the study of the stochastic behaviour of the lifetime of an element as a function of its length, it is often observed that the failure time (or lifetime) decreases as the length increases. In probabilistic terms, this idea can be expressed as follows. Let T be the lifetime of a specimen of length x; the survival function, which gives the probability that an element of length x survives until time t, is then S_T(t, x) = P(T > t/α(x)), where α(x) is a monotonically decreasing function. In particular, it is often assumed that T has a Weibull distribution. In this paper, we propose a generalization of this Weibull model by assuming that the distribution of T is generalized gamma (GG). Since the GG model contains the Weibull, gamma, and lognormal models as special and limiting cases, a GG regression model is an appropriate tool for describing the size effect on the lifetime and for selecting among the embedded models. Maximum likelihood estimates are obtained for the GG regression model with α(x) = cx^b. As a special case, this provides an alternative to the usual approach to estimation for the GG distribution, which involves reparametrization. Related parametric inference issues are addressed and illustrated using two experimental data sets. Some discussion of censored data is also provided.
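A hedged sketch of a rough fit of lifetimes whose generalized gamma scale is α(x) = cx^b (b < 0 gives shorter lifetimes for longer specimens), using scipy.stats.gengamma; scipy's (a, c) shape parameterisation is assumed here and may differ from the paper's, the data are simulated, and censoring is ignored.

```python
import numpy as np
from scipy.stats import gengamma
from scipy.optimize import minimize

def neg_loglik(par, t, x):
    """Generalized gamma lifetimes with size-dependent scale alpha(x) = c0 * x**b.
    'a' and 'k' are the two GG shape parameters in scipy's (a, c) parameterisation."""
    log_c0, b, log_a, log_k = par
    scale = np.exp(log_c0) * x ** b
    return -np.sum(gengamma.logpdf(t, a=np.exp(log_a), c=np.exp(log_k), scale=scale))

rng = np.random.default_rng(9)
n = 1000
x = rng.uniform(1.0, 10.0, n)                          # specimen lengths
true_scale = 2.0 * x ** (-0.5)                         # lifetime scale shrinks with length
t = gengamma.rvs(a=1.5, c=1.2, scale=true_scale, size=n, random_state=rng)

fit = minimize(neg_loglik, x0=[0.0, 0.0, 0.0, 0.0], args=(t, x),
               method="Nelder-Mead", options={"maxiter": 5000})
print("estimated c:", round(float(np.exp(fit.x[0])), 2),
      " estimated b:", round(float(fit.x[1]), 2))
```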

19.
Graphical representation of survival curves is often used to illustrate associations between exposures and time-to-event outcomes. However, when exposures are time-dependent, the calculation of survival probabilities is not straightforward. Our aim was to develop a method to estimate time-dependent survival probabilities and represent them graphically. Cox models with time-dependent indicators representing state changes were fitted, and survival probabilities were plotted using pre-specified times of state change. Time-varying hazard ratios for the state change were also explored. The method was applied to data from the Adult-to-Adult Living Donor Liver Transplantation Cohort Study (A2ALL). Survival curves showing a ‘split’ at a pre-specified time t allow a qualitative comparison of survival probabilities between patients with similar baseline covariates who do and do not experience a state change at time t. Interactions with time since the state change can be represented visually to reflect hazard ratios that change over time. The A2ALL results showed differences in survival probabilities among patients who did not receive a transplant, received a living donor transplant, or received a deceased donor transplant. These graphical representations of survival curves with time-dependent indicators improve upon previous methods and allow for clinically meaningful interpretation.
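A small illustration of the ‘split’ curve: under a Cox model with cumulative baseline Λ0(t) and a time-dependent indicator switching on at t0, S(t) = exp(−Λ0(t)) before t0 and exp(−Λ0(t0) − e^β[Λ0(t) − Λ0(t0)]) afterwards. The baseline and coefficient below are made-up numbers, not A2ALL estimates.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")               # write to file; no display needed
import matplotlib.pyplot as plt

def split_survival(t_grid, t0, beta, lam0=0.1):
    """Survival under a Cox model with cumulative baseline Lambda0(t) = lam0 * t and
    a time-dependent indicator z(t) = 1{t >= t0} (state change at t0)."""
    cum_no_change = lam0 * t_grid
    cum_change = np.where(t_grid < t0, lam0 * t_grid,
                          lam0 * t0 + np.exp(beta) * lam0 * (t_grid - t0))
    return np.exp(-cum_no_change), np.exp(-cum_change)

t = np.linspace(0, 10, 201)
s_no_change, s_change = split_survival(t, t0=3.0, beta=-0.7)   # beneficial state change
plt.plot(t, s_no_change, label="no state change")
plt.plot(t, s_change, label="state change at t = 3")
plt.xlabel("time"); plt.ylabel("survival probability"); plt.legend()
plt.savefig("split_survival.png")   # curves coincide before t0, then split
```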

20.
Imperfect repair models are a class of stochastic models for recurrent phenomena. This article focuses on the Block, Borges, and Savits (1985) age-dependent minimal repair model (the BBS model), in which a system that fails at time t undergoes one of two types of repair: with probability p(t), a perfect repair is performed, or with probability 1 − p(t), a minimal repair is performed. The goodness-of-fit problem of interest concerns the initial distribution of the failure ages. In particular, interest is in testing the null hypothesis that the hazard rate function of the time to first event occurrence, λ(·), is equal to a prespecified hazard rate function λ0(·). This paper extends the class of hazard-based smooth goodness-of-fit tests introduced in Peña (1998a) to the case where data accrual is from a BBS model. The goodness-of-fit tests are score tests derived by reformulating Neyman's idea of smooth tests in terms of hazard functions. Omnibus as well as directional tests are developed, and simulation results are presented to illustrate the sensitivities of the proposed tests against certain types of alternatives.
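A simulation sketch of the BBS mechanism with a Weibull initial hazard: a failure at age a is perfectly repaired (age reset to zero) with probability p(a) and minimally repaired otherwise. The choice of p(·) and the Weibull parameters are illustrative, and the smooth goodness-of-fit tests themselves are not implemented here.

```python
import numpy as np

rng = np.random.default_rng(8)

def weibull_cumhaz(t, shape=2.0, scale=1.0):
    return (t / scale) ** shape

def simulate_bbs(horizon=20.0, shape=2.0, scale=1.0,
                 p=lambda age: min(1.0, 0.2 * age)):
    """BBS model: failures occur with a Weibull hazard in the system's current age;
    a failure at age a triggers a perfect repair with probability p(a), otherwise a
    minimal repair (the age keeps running)."""
    cal_time, age = 0.0, 0.0
    events = []                                    # (calendar time, repair type)
    while True:
        e = rng.exponential(1.0)                   # next failure: invert the cum. hazard
        new_age = scale * (weibull_cumhaz(age, shape, scale) + e) ** (1.0 / shape)
        cal_time += new_age - age
        if cal_time > horizon:
            break
        age = new_age
        if rng.uniform() < p(age):
            events.append((cal_time, "perfect"))
            age = 0.0                              # perfect repair: good as new
        else:
            events.append((cal_time, "minimal"))   # minimal repair: age unchanged
    return events

ev = simulate_bbs()
print("failures:", len(ev),
      " perfect repairs:", sum(1 for _, kind in ev if kind == "perfect"))
```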
