Similar Documents
20 similar documents found.
1.
In life testing and survival analyses which involve the use of expensive equipment, the cost of continuing an experiment until all the items on test have failed can be quite high. In these situations it is reasonable to make a statistical test when a pre-specified percentile, e.g., the median of the control group, has been observed. This article adapts some existing procedures for complete samples to randomly censored data. The results of Lo and Singh (1985), who extended the Bahadur representation of quantiles to the censored case, enable us to use the methods of Gastwirth (1968) and Hettmansperger (1973), which were based on Bahadur's result, to extend the procedures of Mathisen (1943), Gart (1963) and Slivka (1970). The large-sample efficiency of the control median test is the same as that of Brookmeyer and Crowley's (1982) extension of the usual median test. For the two-sample shift problem with observations following the double-exponential law, the median remains the optimum percentile to use until the censoring becomes quite heavy. On the other hand, in the two-sample scale parameter problem for data from an exponential distribution, the percentile (the 80th in the uncensored case) yielding the asymptotically most powerful test in the family of control percentile tests is no longer optimum. The effect becomes noticeable when 25% or more of the data are censored.
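For orientation, the uncensored control median (Mathisen) test that these procedures generalize can be sketched in a few lines; the large-sample variance used below follows from the Bahadur representation as in Gastwirth (1968). This is an illustrative complete-sample sketch with our own function and variable names, not the censored procedure developed in the article.

```python
import numpy as np
from scipy import stats

def control_median_test(control, treatment):
    """Uncensored control median (Mathisen) test.

    Counts how many treatment observations fall at or below the
    control-sample median and compares the count with its large-sample
    null distribution (mean n/2, variance n/4 + n**2/(4*m)).
    """
    control = np.asarray(control, dtype=float)
    treatment = np.asarray(treatment, dtype=float)
    m, n = len(control), len(treatment)
    theta_hat = np.median(control)              # control-group median
    count = np.sum(treatment <= theta_hat)      # Mathisen's statistic
    mean0 = n / 2.0
    var0 = n / 4.0 + n**2 / (4.0 * m)           # Bahadur-representation variance
    z = (count - mean0) / np.sqrt(var0)
    p_value = 2.0 * stats.norm.sf(abs(z))
    return z, p_value

# Example under a location shift (double-exponential data)
rng = np.random.default_rng(0)
x = rng.laplace(loc=0.0, scale=1.0, size=100)   # control
y = rng.laplace(loc=0.7, scale=1.0, size=100)   # treatment
print(control_median_test(x, y))
```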

2.
A proof is provided to show that Gehan's 1965 generalization of the two-sample Wilcoxon test lies outside the class of efficient score procedures for right censored data (Prentice 1978).
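For reference, Gehan's statistic itself can be computed by pairwise scoring of right-censored observations; the following minimal sketch (names are ours, and the permutation variance is omitted) illustrates the statistic rather than the efficient-score formulation discussed here.

```python
import numpy as np

def gehan_statistic(t1, d1, t2, d2):
    """Gehan's generalized Wilcoxon statistic for right-censored data.

    t*, d* are times and event indicators (1 = observed failure,
    0 = right censored).  Each pair (i from sample 1, j from sample 2)
    scores +1 if observation i is definitely larger than j, -1 if it is
    definitely smaller, and 0 if the ordering is ambiguous.
    """
    t1, d1 = np.asarray(t1, float), np.asarray(d1, int)
    t2, d2 = np.asarray(t2, float), np.asarray(d2, int)
    u = 0
    for ti, di in zip(t1, d1):
        for tj, dj in zip(t2, d2):
            if (ti > tj and dj == 1) or (ti == tj and dj == 1 and di == 0):
                u += 1          # i survived beyond j's observed failure
            elif (ti < tj and di == 1) or (ti == tj and di == 1 and dj == 0):
                u -= 1          # j survived beyond i's observed failure
    return u

# Toy example: times and censoring indicators for two groups
print(gehan_statistic([2, 5, 7, 9], [1, 1, 0, 1], [3, 4, 8, 8], [1, 0, 1, 1]))
```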

3.
In this article, we analyze interval censored failure time data with competing risks. A new estimator for the cumulative incidence function is derived using an approximate likelihood, and a test statistic to compare two samples is then obtained by extending Sun's test statistic. Small sample properties of the proposed methods are examined through simulations, and a cohort dataset of AIDS patients is analyzed as a real example.
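The interval-censored estimator of the article is not reproduced here, but the right-censored cumulative incidence (Aalen-Johansen) estimator it generalizes can be sketched as follows; this is the standard estimator under an assumed right-censoring setup, with illustrative function names.

```python
import numpy as np

def cumulative_incidence(time, cause, event_time):
    """Aalen-Johansen cumulative incidence for right-censored competing risks.

    time  : observed times
    cause : 0 = censored, 1, 2, ... = failure cause
    Returns CIF_k(t) = sum over event times s <= t of S(s-) * d_k(s) / n(s),
    where S is the all-cause Kaplan-Meier estimator.
    """
    time = np.asarray(time, float)
    cause = np.asarray(cause, int)
    causes = sorted(c for c in set(cause) if c != 0)
    grid = np.unique(time[cause != 0])
    surv = 1.0                                    # all-cause KM value S(s-)
    cif = {k: 0.0 for k in causes}
    history = []
    for s in grid:
        at_risk = np.sum(time >= s)
        d_all = np.sum((time == s) & (cause != 0))
        for k in causes:
            d_k = np.sum((time == s) & (cause == k))
            cif[k] += surv * d_k / at_risk        # increment uses S(s-)
        surv *= 1.0 - d_all / at_risk             # update KM after time s
        history.append((s, dict(cif)))
    result = {k: 0.0 for k in causes}
    for s, vals in history:                       # CIFs at the requested time
        if s <= event_time:
            result = vals
    return result

t = [2, 3, 3, 5, 6, 8, 9]
c = [1, 2, 0, 1, 0, 2, 1]
print(cumulative_incidence(t, c, event_time=6))
```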

4.
In this article, we propose a new goodness-of-fit test for Type I or Type II censored samples from a completely specified distribution. This test is a generalization of Michael's test for censored data, which is based on the empirical distribution and a variance stabilizing transformation. Using Monte Carlo methods, the distributions of the test statistics are analyzed under the null hypothesis, and tables of their quantiles are provided. The power of the proposed test is studied and compared, also by simulation, to that of other well-known tests; the proposed test is more powerful in most of the cases considered. Acceptance regions for the PP, QQ, and Michael's stabilized probability plots are derived, which enable one to visualize which data contribute to the decision of rejecting the null hypothesis. Finally, an application in quality control is presented as an illustration.
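One natural way to adapt Michael's variance-stabilized statistic to Type II right censoring, with null quantiles obtained by Monte Carlo as in the article, is sketched below; this is an assumed simplified form (restricting the maximum to the observed order statistics), not necessarily the exact generalization proposed by the authors.

```python
import numpy as np
from scipy import stats

def michael_stat(u_sorted, n):
    """Michael's variance-stabilized distance for the first r order statistics.

    u_sorted : F0 evaluated at the r smallest order statistics (Type II
               right censoring); for complete samples r = n.
    """
    r = len(u_sorted)
    i = np.arange(1, r + 1)
    s = (2 / np.pi) * np.arcsin(np.sqrt((i - 0.5) / n))   # stabilized plotting positions
    g = (2 / np.pi) * np.arcsin(np.sqrt(u_sorted))        # stabilized probabilities
    return np.max(np.abs(g - s))

def mc_critical_value(n, r, alpha=0.05, reps=20000, seed=1):
    """Null quantile of the statistic by Monte Carlo (F0 completely specified)."""
    rng = np.random.default_rng(seed)
    sims = [michael_stat(np.sort(rng.uniform(size=n))[:r], n) for _ in range(reps)]
    return np.quantile(sims, 1 - alpha)

# Test the r smallest of n observations against a standard normal F0
n, r = 40, 25
rng = np.random.default_rng(2)
x = np.sort(rng.normal(size=n))[:r]
d = michael_stat(stats.norm.cdf(x), n)
print(d, d > mc_critical_value(n, r))
```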

5.
This article analyzes a small censored data set to demonstrate the potential dangers of using statistical computing packages without understanding the details of statistical methods. The data, consisting of censored response times with heavy ties in one time point, were analyzed with a Cox regression model utilizing SAS PHREG and BMDP2L procedures. The p values, reported from both SAS PHREG and BMDP2L procedures, for testing the equality of two treatments vary considerably. This article illustrates that (1) the Breslow likelihood used in both BMDP2L and SAS PHREG procedures is too conservative and can have a critical effect on an extreme data set, (2) Wald's test in the SAS PHREG procedure may yield absurd results from most likelihood models, and (3) BMDP2L needs to include more than just the Breslow likelihood in future development.
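To make the ties issue concrete, a hand-rolled Breslow partial log-likelihood for a single binary treatment covariate is sketched below; with many tied event times it can differ noticeably from the exact or Efron likelihoods, which is the behaviour discussed in the article. This is an illustration only, not the SAS PHREG or BMDP2L implementation, and the data are made up.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def breslow_loglik(beta, time, event, z):
    """Breslow partial log-likelihood for a Cox model with one covariate z."""
    time, event, z = map(np.asarray, (time, event, z))
    ll = 0.0
    for s in np.unique(time[event == 1]):
        dead = (time == s) & (event == 1)
        risk = time >= s                           # risk set at time s
        d = dead.sum()                             # number of tied deaths at s
        ll += beta * z[dead].sum() - d * np.log(np.sum(np.exp(beta * z[risk])))
    return ll

# Heavily tied toy data: treatment indicator z, several deaths at time 3
time  = [3, 3, 3, 3, 5, 6, 7, 8, 9, 10]
event = [1, 1, 1, 1, 0, 1, 0, 1, 1, 0]
z     = [1, 1, 1, 0, 0, 0, 1, 0, 1, 0]

res = minimize_scalar(lambda b: -breslow_loglik(b, time, event, z),
                      bounds=(-5, 5), method="bounded")
print("Breslow estimate of the log hazard ratio:", res.x)
```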

6.
It is known that the maximum likelihood method does not provide explicit estimators for the mean and standard deviation of the normal distribution based on Type II censored samples. In this paper we present a simple method of deriving explicit estimators by approximating the likelihood equations appropriately. We obtain the variances and covariance of these estimators. We also show that these estimators are almost as efficient as the maximum likelihood (ML) estimators and just as efficient as the best linear unbiased (BLU) and the modified maximum likelihood (MML) estimators. Finally, we illustrate this method of estimation by applying it to Gupta's and Darwin's data.
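For comparison, the exact likelihood that the explicit estimators approximate can always be maximized numerically; the sketch below does this for a Type II right-censored normal sample and is not the approximate explicit estimator derived in the paper.

```python
import numpy as np
from scipy import stats
from scipy.optimize import minimize

def censored_normal_mle(x_obs, n):
    """Numerical MLE of (mu, sigma) from the r smallest of n normal observations."""
    x_obs = np.sort(np.asarray(x_obs, float))
    r, x_r = len(x_obs), x_obs[-1]

    def negloglik(theta):
        mu, log_sigma = theta
        sigma = np.exp(log_sigma)                           # keep sigma positive
        ll = stats.norm.logpdf(x_obs, mu, sigma).sum()      # observed part
        ll += (n - r) * stats.norm.logsf(x_r, mu, sigma)    # censored tail
        return -ll

    start = [x_obs.mean(), np.log(x_obs.std(ddof=1))]
    res = minimize(negloglik, start, method="Nelder-Mead")
    return res.x[0], np.exp(res.x[1])

rng = np.random.default_rng(3)
full = np.sort(rng.normal(10, 2, size=30))
print(censored_normal_mle(full[:20], n=30))   # observe only the 20 smallest of 30
```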

7.
In this paper, we consider some problems of estimation and reconstruction based on middle censored competing risks data. It is assumed that the latent failure times are independent and exponentially distributed with different parameters, and that the censoring mechanism is independent. The maximum likelihood estimators (MLEs) of the unknown parameters are obtained. We then use the asymptotic distribution of the MLEs to construct approximate confidence intervals. Based on gamma priors, Lindley's approximation method is applied to obtain the Bayesian estimates of the unknown parameters under the squared error loss function. Since credible intervals cannot be constructed in this way, we propose and implement the Gibbs sampling technique to obtain them. Several point reconstructors for the failure times of censored units are provided. Finally, a Monte Carlo simulation study is carried out to evaluate the performance of the different methods, and a data set is analysed to illustrate the proposed procedures.

8.
We propose a new goodness-of-fit test for normal and lognormal distributions with unknown parameters and type-II censored data. This test is a generalization of Michael's test for censored samples, which is based on the empirical distribution and a variance stabilizing transformation. We estimate the parameters of the model by using maximum likelihood and Gupta's methods. The quantiles of the distribution of the test statistic under the null hypothesis are obtained through Monte Carlo simulations. The power of the proposed test is estimated and compared to that of the Kolmogorov–Smirnov test also using simulations. The new test is more powerful than the Kolmogorov–Smirnov test in most of the studied cases. Acceptance regions for the PP, QQ and Michael's stabilized probability plots are derived, making it possible to visualize which data contribute to the decision of rejecting the null hypothesis. Finally, an illustrative example is presented.

9.
In obstetrics and gynecology, knowledge about how women's features are associated with childbirth is important. This leads to establishing guidelines and can help managers to describe the dynamics of pregnant women's hospital stays. Then, time is a variable of great importance and can be described by survival models. An issue that should be considered in the modeling is the inclusion of women for whom the duration of labor cannot be observed due to fetal death, generating a proportion of times equal to zero. Additionally, another proportion of women's time may be censored due to some intervention. The aim of this paper was to present the Log-Normal zero-inflated cure regression model and to evaluate likelihood-based parameter estimation by a simulation study. In general, the inference procedures showed a better performance for larger samples and low proportions of zero inflation and cure. To exemplify how this model can be an important tool for investigating the course of the childbirth process, we considered the Better Outcomes in Labor Difficulty project dataset and showed that parity and educational level are associated with the main outcomes. We acknowledge the World Health Organization for granting us permission to use the dataset.

10.
We develop a simple approach to finding the Fisher information matrix (FIM) for a single pair of an order statistic and its concomitant, and for Type II right, left, and doubly censored samples from an arbitrary bivariate distribution. We use it to determine explicit expressions for the FIM for the three parameters of Downton's bivariate exponential distribution for single pairs and Type II censored samples. We evaluate the FIM in censored samples for finite sample sizes and determine its limiting form as the sample size increases. We discuss implications of our findings for inference and experimental design using small and large censored samples and for ranked-set samples from this distribution.

11.
In this article, we develop a formal goodness-of-fit testing procedure for one-shot device testing data, in which each observation in the sample is either left censored or right censored. Such data are also called current status data. We provide an algorithm for calculating the nonparametric maximum likelihood estimate (NPMLE) of the unknown lifetime distribution based on such data. Then, we consider four different test statistics that can be used for testing the goodness-of-fit of the accelerated failure time (AFT) model by the use of samples of residuals: a chi-square-type statistic based on the difference between the empirical and expected numbers of failures at each inspection time; two other statistics based on the difference between the NPMLE of the lifetime distribution obtained from one-shot device testing data and the distribution specified under the null hypothesis; and, as a final statistic, one based on White's idea of comparing two estimators of the Fisher information (FI). We then compare these tests in terms of power and draw some conclusions. Finally, we present an example to illustrate the proposed tests.
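For current status data, the NPMLE of the lifetime distribution at the inspection times is the isotonic regression of the failure indicators on the inspection times; the pool-adjacent-violators sketch below is one standard way to compute it (the article's algorithm may be organized differently), with illustrative names and toy data.

```python
import numpy as np

def npmle_current_status(inspection_time, failed):
    """NPMLE of F at the inspection times for current status data.

    failed[i] = 1 if unit i had already failed when inspected at
    inspection_time[i], else 0.  The NPMLE is the non-decreasing
    (isotonic) fit to the indicators, found by pool-adjacent-violators.
    """
    order = np.argsort(inspection_time)
    t = np.asarray(inspection_time, float)[order]
    y = np.asarray(failed, float)[order]

    blocks = []                                   # each block holds [sum, count]
    for yi in y:
        blocks.append([yi, 1])
        # merge adjacent blocks while their means are decreasing
        while len(blocks) > 1 and blocks[-2][0] / blocks[-2][1] > blocks[-1][0] / blocks[-1][1]:
            s, c = blocks.pop()
            blocks[-1][0] += s
            blocks[-1][1] += c
    fhat = np.concatenate([[s / c] * c for s, c in blocks])
    return t, fhat

t = [1, 2, 3, 4, 5, 6, 7, 8]
d = [0, 1, 0, 0, 1, 1, 0, 1]
print(npmle_current_status(t, d))
```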

12.
We propose a new procedure for combining multiple tests in samples of right-censored observations. The new method is based on multiple constrained censored empirical likelihood, where the constraints are formulated as linear functionals of the cumulative hazard functions. We prove a version of Wilks’ theorem for the multiple constrained censored empirical likelihood ratio, which provides a simple reference distribution for the test statistic of our proposed method. A useful application of the proposed method is, for example, examining the survival experience of different populations by combining different weighted log-rank tests. Real data examples are given using the log-rank and Gehan-Wilcoxon tests. In a simulation study of two-sample survival data, we compare the proposed method of combining tests to previously developed procedures. The results demonstrate that, in addition to its computational simplicity, the combined test performs comparably to, and in some situations more reliably than, previously developed procedures. Statistical software is available in the R package ‘emplik’.
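The component tests being combined are weighted log-rank statistics; a compact sketch of that family (weight 1 gives the log-rank test, weight equal to the number at risk gives the Gehan-Wilcoxon test) follows. It only computes the individual standardized statistics, not the constrained empirical likelihood combination implemented in 'emplik', and the data are made up.

```python
import numpy as np

def weighted_logrank(time, event, group, weight="logrank"):
    """Weighted log-rank statistic comparing group 1 with group 0.

    weight = "logrank" uses w(s) = 1; weight = "gehan" uses w(s) = number
    at risk, which reproduces the Gehan-Wilcoxon test.
    Returns the standardized statistic Z.
    """
    time, event, group = map(np.asarray, (time, event, group))
    num, var = 0.0, 0.0
    for s in np.unique(time[event == 1]):
        risk = time >= s
        n, n1 = risk.sum(), (risk & (group == 1)).sum()
        d = ((time == s) & (event == 1)).sum()
        d1 = ((time == s) & (event == 1) & (group == 1)).sum()
        w = n if weight == "gehan" else 1.0
        num += w * (d1 - d * n1 / n)              # observed minus expected, group 1
        if n > 1:                                 # hypergeometric variance at s
            var += w**2 * d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)
    return num / np.sqrt(var)

time  = [3, 4, 4, 6, 8, 9, 10, 12, 14, 15]
event = [1, 1, 0, 1, 1, 0, 1, 1, 0, 1]
group = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]
print(weighted_logrank(time, event, group, "logrank"),
      weighted_logrank(time, event, group, "gehan"))
```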

13.
This paper compares methods of estimation for the parameters of a Pareto distribution of the first kind to determine which method provides the better estimates when the observations are censored. The unweighted least squares (LS) and the maximum likelihood estimates (MLE) are presented for both censored and uncensored data. The MLEs are obtained using two methods. In the first, called the ML method, it is shown that the log-likelihood is maximized when the scale parameter is the minimum sample value. In the second method, called the modified ML (MML) method, the estimates are found by utilizing the maximum likelihood value of the shape parameter in terms of the scale parameter and the equation for the mean of the first order statistic as a function of both parameters. Since censored data often occur in applications, we study two types of censoring for their effects on the methods of estimation: Type II censoring and multiple random censoring. In this study we consider different sample sizes and several values of the true shape and scale parameters.

Comparisons are made in terms of bias and the mean squared error of the estimates. We propose that the LS method be generally preferred over the ML and MML methods for estimating the Pareto parameter γ for all sample sizes, all values of the parameter and for both complete and censored samples. In many cases, however, the ML estimates are comparable in their efficiency, so that either estimator can effectively be used. For estimating the parameter α, the LS method is also generally preferred for smaller values of the parameter (α ≤ 4). For larger values of the parameter, and for censored samples, the MML method appears superior to the other methods, with a slight advantage over the LS method. For larger values of the parameter α, for censored samples and all methods, underestimation can be a problem.
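In the complete-sample case the likelihood-based estimates collapse to simple closed forms, and the LS estimates come from regressing the log empirical survival function on log x; the sketch below illustrates these uncensored estimators with generic parameter names (not necessarily the article's γ and α).

```python
import numpy as np

def pareto_ml(x):
    """Complete-sample ML estimates for a Pareto (type I) distribution.

    The likelihood is maximized by taking the scale equal to the sample
    minimum; the shape estimate then has a closed form.
    """
    x = np.asarray(x, float)
    scale = x.min()
    shape = len(x) / np.sum(np.log(x / scale))
    return shape, scale

def pareto_ls(x):
    """Least-squares estimates from the linearized survival function:
    log S(x) = shape*log(scale) - shape*log(x)."""
    x = np.sort(np.asarray(x, float))
    n = len(x)
    surv = 1.0 - (np.arange(1, n + 1) - 0.5) / n      # plotting positions
    slope, intercept = np.polyfit(np.log(x), np.log(surv), 1)
    shape = -slope
    scale = np.exp(intercept / shape)
    return shape, scale

rng = np.random.default_rng(4)
sample = 2.0 * (1.0 + rng.pareto(3.0, size=200))      # Pareto I: shape 3, scale 2
print(pareto_ml(sample))
print(pareto_ls(sample))
```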

14.
In this article, we consider some problems of estimation and prediction when progressive Type-I interval censored competing risks data come from the proportional hazards family. The maximum likelihood estimators of the unknown parameters are obtained. Based on gamma priors, Lindley's approximation and importance sampling methods are applied to obtain Bayesian estimators under squared error and linear-exponential loss functions. Several classical and Bayesian point predictors of the censored units are provided. Also, acceptance sampling plans based on given producer's and consumer's risks are considered. Finally, a Monte Carlo simulation study is carried out to evaluate the performance of the different methods.

15.
In this paper we address the problem of estimating a vector of regression parameters in the Weibull censored regression model. Our main objective is to provide natural adaptive estimators that significantly improve upon the classical procedures in the situation where some of the predictors may or may not be associated with the response. In the context of two competing Weibull censored regression models (full model and candidate submodel), we consider an adaptive shrinkage estimation strategy that shrinks the full model maximum likelihood estimate in the direction of the submodel maximum likelihood estimate. We develop the properties of these estimators using the notion of asymptotic distributional risk. The shrinkage estimators are shown to have higher efficiency than the classical estimators for a wide class of models. Further, we consider a LASSO-type estimation strategy and compare its relative performance with that of the shrinkage estimators. Monte Carlo simulations reveal that when the true model is close to the candidate submodel, the shrinkage strategy performs better than the LASSO strategy when, and only when, there are many inactive predictors in the model. The shrinkage and LASSO strategies are applied to a real data set from the Veterans' Administration (VA) lung cancer study to illustrate the usefulness of the procedures in practice.

16.
The purpose of this paper is to present a semi-parametric estimation of a survival function when analyzing incomplete and doubly censored data. Under the assumption that the chance of censoring is not related to the individual's survivorship, we propose a consistent estimation of survival. The derived estimator treats the uncensored observations nonparametrically and uses parametric models for both right and left censored data. Some asymptotic properties and simulation studies are also presented in order to analyze the behavior of the proposed estimator.

17.
In this paper, we derive the maximum likelihood estimators of the parameters of a Laplace distribution based on general Type-II censored samples. The resulting explicit MLEs turn out to be simple linear functions of the order statistics. We then examine the asymptotic variance of the estimates by calculating the elements of the Fisher information matrix.
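In the complete-sample special case the Laplace MLEs are the sample median and the mean absolute deviation about it, both simple functions of the order statistics; a minimal sketch of that special case (not the general Type-II censored expressions derived in the paper) follows.

```python
import numpy as np

def laplace_mle(x):
    """Complete-sample MLEs for the Laplace (double exponential) distribution:
    location = sample median, scale = mean absolute deviation about the median."""
    x = np.asarray(x, float)
    mu_hat = np.median(x)
    b_hat = np.mean(np.abs(x - mu_hat))
    return mu_hat, b_hat

rng = np.random.default_rng(5)
print(laplace_mle(rng.laplace(loc=1.0, scale=2.0, size=500)))
```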

18.
Recently, progressively Type II censored samples have attracted attention in the study and analysis of life-testing data. Here we propose an indirect approach for computing the Fisher information (FI) in progressively Type II censored samples that simplifies the calculations. Some recurrence relations for the FI in progressively Type II censored samples are derived that facilitate the FI computation using the proposed decomposition. The paper presents a standard recurrence relation that reduces the computation of the FI in progressively Type II censored samples to a sum of the FI in collections of order statistics (OS). We compute the FI in collections of progressively Type II censored samples for some known distributions.
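Numerical checks of such FI expressions require simulated progressively Type II censored samples; the uniform-spacings construction commonly attributed to Balakrishnan and Sandhu (1995) is sketched below as an illustration, with our own function names, for an arbitrary continuous lifetime distribution specified by its quantile function.

```python
import numpy as np
from scipy import stats

def progressive_type2_sample(n, scheme, ppf, rng):
    """Generate a progressively Type II censored sample.

    n      : total number of units on test
    scheme : censoring numbers (R_1, ..., R_m) with m + sum(R) = n
    ppf    : quantile function of the lifetime distribution
    Independent uniforms are transformed so that the resulting
    U_(1) < ... < U_(m) have the progressively censored joint
    distribution, then mapped through ppf.
    """
    scheme = np.asarray(scheme, int)
    m = len(scheme)
    assert m + scheme.sum() == n
    w = rng.uniform(size=m)
    # gamma_i = i + R_m + R_{m-1} + ... + R_{m-i+1}
    gammas = np.arange(1, m + 1) + np.cumsum(scheme[::-1])
    v = w ** (1.0 / gammas)
    u = 1.0 - np.cumprod(v[::-1])    # U_(i) = 1 - V_m * V_{m-1} * ... * V_{m-i+1}
    return ppf(u)

rng = np.random.default_rng(6)
sample = progressive_type2_sample(
    n=15, scheme=[2, 0, 3, 0, 5], ppf=stats.expon(scale=10).ppf, rng=rng)
print(sample)
```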

19.
Nonparametric maximum likelihood estimation of bivariate survival probabilities is developed for interval censored survival data. We restrict our attention to the situation where response times within pairs are not distinguishable, and the univariate survival distribution is the same for any individual within any pair. Campbell's (1981) model is modified to incorporate this restriction. Existence and uniqueness of maximum likelihood estimators are discussed. This methodology is illustrated with a bivariate life table analysis of an angioplasty study where each patient undergoes two procedures.

20.
The currently existing estimation methods and goodness-of-fit tests for the Cox model mainly deal with right censored data, and they do not extend directly to other, more complicated types of censored data, such as doubly censored data, interval censored data, partly interval-censored data, bivariate right censored data, etc. In this article, we apply the empirical likelihood approach to the Cox model with complete data, derive the semiparametric maximum likelihood estimators (SPMLE) for the Cox regression parameter and the baseline distribution function, and establish the asymptotic consistency of the SPMLE. Via the functional plug-in method, these results are extended in a unified approach to doubly censored data, partly interval-censored data, and bivariate data under univariate or bivariate right censoring. For these types of censored data, the estimation procedures developed here naturally lead to Kolmogorov-Smirnov goodness-of-fit tests for the Cox model. Some simulation results are presented.
