1.
Let (X, Y) be a random vector, where Y denotes the variable of interest, possibly subject to random right censoring, and X is a covariate. We construct confidence intervals and bands for the conditional survival and quantile function of Y given X using a non-parametric likelihood ratio approach. This approach was introduced by Thomas & Grunkemeier (1975), who estimated confidence intervals of survival probabilities based on right-censored data. The method is appealing for several reasons: it always produces intervals inside [0, 1], it does not involve variance estimation, and it can produce asymmetric intervals. Asymptotic results for the confidence intervals and bands are obtained, as well as simulation results in which the performance of the likelihood ratio intervals and bands is compared with that of the normal approximation method. We also propose a bandwidth selection procedure based on the bootstrap and apply the technique to a real data set.
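To make the likelihood-ratio construction concrete, here is a minimal sketch (assumed, not the authors' code) of the Thomas & Grunkemeier (1975) interval for a single survival probability S(t0) from right-censored data, without the covariate and bandwidth machinery discussed above; function names and the simulated data are illustrative.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import chi2

def risk_table(time, event, t0):
    """Distinct event times <= t0 with event counts d and risk-set sizes n."""
    time, event = np.asarray(time, float), np.asarray(event, int)
    ts = np.unique(time[(event == 1) & (time <= t0)])
    d = np.array([np.sum((time == t) & (event == 1)) for t in ts])
    n = np.array([np.sum(time >= t) for t in ts])
    return d, n

def neg2_log_ratio(p, d, n):
    """-2 log likelihood ratio for the hypothesis S(t0) = p (assumes d < n at each event time)."""
    constraint = lambda lam: np.prod(1.0 - d / (n + lam)) - p
    lam = brentq(constraint, np.max(d - n) + 1e-8, 1e9)   # constrained hazards d/(n+lam)
    loglik = lambda h: np.sum(d * np.log(h) + (n - d) * np.log(1.0 - h))
    return -2.0 * (loglik(d / (n + lam)) - loglik(d / n))

def tg_interval(time, event, t0, level=0.95):
    d, n = risk_table(time, event, t0)
    s_km = np.prod(1.0 - d / n)                           # Kaplan-Meier point estimate
    g = lambda p: neg2_log_ratio(p, d, n) - chi2.ppf(level, df=1)
    return s_km, (brentq(g, 1e-6, s_km - 1e-6), brentq(g, s_km + 1e-6, 1 - 1e-6))

rng = np.random.default_rng(0)
t, c = rng.exponential(2.0, 200), rng.exponential(3.0, 200)
print(tg_interval(np.minimum(t, c), (t <= c).astype(int), t0=1.0))
```

The interval is obtained by inverting the statistic against a χ²(1) quantile on either side of the Kaplan-Meier estimate, which is what makes the resulting limits asymmetric and range-respecting.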
2.
Stute (1993, Consistent estimation under random censorship when covariables are present. Journal of Multivariate Analysis 45, 89–103) proposed a new method to estimate regression models with a censored response variable using least squares and showed the consistency and asymptotic normality of his estimator. This article proposes a new bootstrap-based methodology that improves the performance of the asymptotic interval estimation for small sample sizes. To this end, we compare the behavior of Stute's asymptotic confidence interval with that of several confidence intervals based on bootstrap resampling techniques. In order to build these confidence intervals, we propose a new bootstrap resampling method adapted to censored regression models. We use simulations to study how much the proposed bootstrap-based confidence intervals improve on the asymptotic proposal. Simulation results indicate that, for the new proposals, coverage percentages are closer to the nominal values and, in addition, the intervals are narrower.
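As a point of reference, the sketch below implements Stute's Kaplan-Meier-weighted least-squares estimator and wraps it in a plain pairs bootstrap percentile interval; the adapted resampling scheme proposed in the article is not reproduced, and all names and the simulated data are illustrative.

```python
import numpy as np

def stute_wls(y, delta, X):
    """Stute (1993): weighted least squares with Kaplan-Meier jumps as weights.
    y: observed (possibly censored) responses, delta: 1 = uncensored, X: design matrix."""
    order = np.argsort(y, kind="stable")
    y, delta, X = y[order], delta[order], X[order]
    n = len(y)
    i = np.arange(1, n + 1)
    factors = ((n - i) / (n - i + 1.0)) ** delta          # Kaplan-Meier survival factors
    w = delta / (n - i + 1.0) * np.concatenate(([1.0], np.cumprod(factors)[:-1]))
    XtW = (X * w[:, None]).T
    return np.linalg.solve(XtW @ X, XtW @ y)

def pairs_bootstrap_ci(y, delta, X, n_boot=500, level=0.95, seed=0):
    """Naive pairs bootstrap: resample (y, delta, x) triples and refit."""
    rng = np.random.default_rng(seed)
    draws = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(y), len(y))
        draws.append(stute_wls(y[idx], delta[idx], X[idx]))
    return np.quantile(draws, [(1 - level) / 2, (1 + level) / 2], axis=0)

rng = np.random.default_rng(1)
x = rng.uniform(0, 2, 150)
t_true = 1.0 + 2.0 * x + rng.normal(0, 0.5, 150)          # latent uncensored response
c = rng.uniform(1, 8, 150)                                # censoring times
y, delta = np.minimum(t_true, c), (t_true <= c).astype(int)
X = np.column_stack([np.ones_like(x), x])
print(stute_wls(y, delta, X))
print(pairs_bootstrap_ci(y, delta, X))
```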
3.
Singly and Doubly Censored Current Status Data: Estimation, Asymptotics and Regression
Mark J. van der Laan, Peter J. Bickel & Nicholas P. Jewell 《Scandinavian Journal of Statistics》1997,24(3):289-307
In biostatistical applications interest often focuses on the estimation of the distribution of the time between two consecutive events. If the initial event time is observed and the subsequent event time is only known to be larger or smaller than an observed point in time, then the data are described by the well-understood singly censored current status model, also known as interval-censored data, case I. Jewell et al. (1994) extended this current status model by allowing the initial time to be unobserved, with its distribution over an observed interval [A, B] known; the data are referred to as doubly censored current status data. This model has applications in AIDS partner studies. If the initial time is known to be uniformly distributed, the model reduces to a submodel of the current status model with the same asymptotic information bounds as in the current status model, but the distribution of interest is essentially the derivative of the distribution of interest in the current status model. As a consequence, the non-parametric maximum likelihood estimator is inconsistent. Moreover, this submodel contains only smooth heavy-tailed distributions for which no moments exist. In this paper, we discuss the connection between the singly censored current status model and the doubly censored current status model (for the uniform initial time) in detail and explain the difficulties in estimation which arise in the doubly censored case. We propose a regularized MLE corresponding to the current status model. We prove rate results and efficiency of smooth functionals of the regularized MLE, and present a generally applicable efficient method for estimation of regression parameters which does not rely on the existence of moments. We also discuss extending these ideas to a non-uniform distribution for the initial time.
4.
The main purpose of this paper is to introduce first a new family of empirical test statistics for testing a simple null hypothesis when the vector of parameters of interest is defined through a specific set of unbiased estimating functions. This family of test statistics is based on a distance between two probability vectors, with the first probability vector obtained by maximizing the empirical likelihood (EL) on the vector of parameters, and the second vector defined from the fixed vector of parameters under the simple null hypothesis. The distance considered for this purpose is the phi-divergence measure. The asymptotic distribution is then derived for this family of test statistics. The proposed methodology is illustrated through the well-known data of Newcomb's measurements on the passage time for light. A simulation study is carried out to compare its performance with that of the EL ratio test when confidence intervals are constructed based on the respective statistics for small sample sizes. The results suggest that the ‘empirical modified likelihood ratio test statistic’ provides a competitive alternative to the EL ratio test statistic, and is also more robust than the EL ratio test statistic in the presence of contamination in the data. Finally, we propose empirical phi-divergence test statistics for testing a composite null hypothesis and present some asymptotic as well as simulation results for evaluating the performance of these test procedures.
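To fix ideas, here is a toy sketch of an empirical phi-divergence statistic in the simplest setting, a single mean with estimating function g(x, θ) = x − θ: the EL probability vector under the null is computed via the usual Lagrange-multiplier root, and a Cressie-Read divergence from the uniform weights is scaled for comparison with a χ² quantile. The Cressie-Read index and the calibration shown here are illustrative, not the paper's general estimating-function construction; the usual EL ratio statistic arises as a limiting member of this family.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import chi2

def el_weights(x, theta0):
    """EL probabilities maximizing prod(p_i) subject to sum p_i (x_i - theta0) = 0."""
    d = np.asarray(x, float) - theta0
    score = lambda lam: np.sum(d / (1.0 + lam * d))
    lam = brentq(score, -1.0 / d.max() + 1e-10, -1.0 / d.min() - 1e-10)
    return 1.0 / (len(d) * (1.0 + lam * d))

def phi_divergence_stat(x, theta0, cr_index=1.0):
    """Cressie-Read divergence between the constrained EL weights and the uniform 1/n."""
    p = el_weights(x, theta0)
    u = np.full_like(p, 1.0 / len(p))
    div = np.sum(p * ((p / u) ** cr_index - 1.0)) / (cr_index * (cr_index + 1.0))
    return 2.0 * len(p) * div            # compare with chi2(1) quantiles (illustrative calibration)

x = np.random.default_rng(3).normal(0.2, 1.0, 80)
print(phi_divergence_stat(x, theta0=0.0), chi2.ppf(0.95, df=1))
```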
5.
Liugen Xue 《Scandinavian Journal of Statistics》2009,36(4):671-685
A kernel regression imputation method for missing response data is developed. A class of bias-corrected empirical log-likelihood ratios for the response mean is defined. It is shown that any member of our class of ratios is asymptotically chi-squared, and the corresponding empirical likelihood confidence interval for the response mean is constructed. Our ratios share some of the desired features of the existing methods: they are self-scale invariant and no plug-in estimators for the adjustment factor and asymptotic variance are needed; when estimating the non-parametric function in the model, undersmoothing to ensure root-n consistency of the estimator for the parameter is avoided. Since the range of bandwidths contains the optimal bandwidth for estimating the regression function, the existing data-driven algorithm is valid for selecting an optimal bandwidth. We also study the normal approximation-based method. A simulation study is undertaken to compare the empirical likelihood with the normal approximation method in terms of coverage accuracies and average lengths of confidence intervals.
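As a complement, the following sketch shows only the kernel regression imputation step for a missing response under MAR, with a Gaussian kernel and a scalar covariate; it is not the bias-corrected EL ratio itself, and the bandwidth arguments discussed above are not reproduced.

```python
import numpy as np

def nw_fit(x_eval, x_obs, y_obs, h):
    """Nadaraya-Watson estimate of E[Y | X = x] with a Gaussian kernel."""
    k = np.exp(-0.5 * ((np.asarray(x_eval)[:, None] - x_obs[None, :]) / h) ** 2)
    return (k @ y_obs) / k.sum(axis=1)

def imputed_response_mean(x, y, observed, h):
    """Replace missing responses by kernel regression fits, then average."""
    y_filled = np.array(y, dtype=float)
    y_filled[~observed] = nw_fit(x[~observed], x[observed], y[observed], h)
    return y_filled.mean()

rng = np.random.default_rng(2)
x = rng.uniform(-1, 1, 300)
y = np.sin(np.pi * x) + rng.normal(0, 0.3, 300)
observed = rng.random(300) < 1.0 / (1.0 + np.exp(-1.5 - x))   # MAR missingness
print(imputed_response_mean(x, y, observed, h=0.2))
```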
6.
In this article, we discuss the construction of confidence intervals for distribution functions under negatively associated samples. It is shown that the blockwise empirical likelihood (EL) ratio statistic for a distribution function has an asymptotic χ²-type distribution. This result is used to obtain an EL-based confidence interval for the distribution function.
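A generic sketch of the ingredient described above: indicators 1{X ≤ x0} are averaged over non-overlapping blocks and Owen-type empirical likelihood is applied to the block means. The block length and the χ² calibration used here are illustrative, not the article's specific choices for negatively associated data.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import chi2

def el_log_ratio(z, mu):
    """-2 log EL ratio for the mean of z equal to mu (Owen's empirical likelihood)."""
    z = np.asarray(z, float)
    if not (z.min() < mu < z.max()):
        return np.inf                              # mu outside the convex hull
    score = lambda lam: np.sum((z - mu) / (1.0 + lam * (z - mu)))
    lam = brentq(score, -1.0 / (z.max() - mu) + 1e-10, 1.0 / (mu - z.min()) - 1e-10)
    return 2.0 * np.sum(np.log1p(lam * (z - mu)))

def blockwise_el_stat(x, x0, theta, block_len):
    """EL statistic for F(x0) = theta, built from block means of 1{X <= x0}."""
    ind = (np.asarray(x) <= x0).astype(float)
    m = len(ind) // block_len
    blocks = ind[: m * block_len].reshape(m, block_len).mean(axis=1)
    return el_log_ratio(blocks, theta)

x = np.random.default_rng(4).normal(size=400)
print(blockwise_el_stat(x, x0=0.0, theta=0.5, block_len=10), chi2.ppf(0.95, df=1))
```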
7.
Pao-sheng Shen 《Communications in Statistics - Simulation and Computation》2013,42(4):531-543
Double censoring arises when T represents an outcome variable that can only be accurately measured within a certain range, [L, U], where L and U are the left- and right-censoring variables, respectively. When L is always observed, we consider the empirical likelihood inference for linear transformation models, based on the martingale-type estimating equation proposed by Chen et al. (2002). It is demonstrated that both the approach of Lu and Liang (2006) and that of Yu et al. (2011) can be extended to doubly censored data. Simulation studies are conducted to investigate the performance of the empirical likelihood ratio methods.
8.
It is well known that the nonparametric maximum likelihood estimator (NPMLE) may severely underestimate the survival function with left-truncated data. Based on the Nelson estimator (for right-censored data) and self-consistency, we suggest a nonparametric estimator of the survival function, the iterative Nelson estimator (INE), for arbitrarily truncated and censored data, where only few nonparametric estimators are available. By simulation we show that the INE does well in overcoming the underestimation of the survival function by the NPMLE for left-truncated and interval-censored data. An interesting application of the INE is as a diagnostic tool for other estimators, such as the monotone MLE or parametric MLEs. The methodology is illustrated by application to two real-world problems: the Channing House and the Massachusetts Health Care Panel Study data sets.
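For orientation, the sketch below shows the generic self-consistency (Turnbull-type) update for interval-censored data that estimators of this kind iterate on; it is not the iterative Nelson estimator itself, and truncation is not handled here. The support grid and toy data are illustrative.

```python
import numpy as np

def self_consistent_masses(left, right, support, n_iter=1000, tol=1e-8):
    """Probability masses on `support` for data known only to lie in [left, right]."""
    left, right = np.asarray(left, float), np.asarray(right, float)
    support = np.asarray(support, float)
    A = (support[None, :] >= left[:, None]) & (support[None, :] <= right[:, None])
    p = np.full(len(support), 1.0 / len(support))
    for _ in range(n_iter):
        denom = A @ p                                    # current mass of each observed interval
        p_new = (A / denom[:, None] * p).mean(axis=0)    # self-consistency (EM) update
        if np.max(np.abs(p_new - p)) < tol:
            break
        p = p_new
    return p

rng = np.random.default_rng(5)
t = rng.exponential(1.0, 200)
left = np.maximum(t - rng.uniform(0, 0.5, 200), 0.0)     # observed censoring interval [left, right]
right = t + rng.uniform(0, 0.5, 200)
support = np.unique(np.concatenate([left, right]))
p = self_consistent_masses(left, right, support)
print(np.sum(p[support <= 1.0]))                         # crude estimate of F(1.0)
```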
9.
Suppose that we have a nonparametric regression model Y = m(X) + ε with X ∈ R^p, where X is a random design variable and is observed completely, and Y is the response variable with some Y-values missing at random. Based on the “complete” data sets for Y obtained by nonparametric regression imputation and by inverse probability weighted imputation, two estimators of the regression function m(x0) for fixed x0 ∈ R^p are proposed. Asymptotic normality of the two estimators is established and used to construct normal approximation-based confidence intervals for m(x0). We also construct an empirical likelihood (EL) statistic for m(x0), with a limiting χ² distribution with one degree of freedom, which is used to construct an EL confidence interval for m(x0).
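A schematic version of the inverse-probability-weighted idea for the regression function itself is sketched below: the missingness propensity is estimated by kernel regression of the observation indicator on X, and complete cases are reweighted in a local-constant fit at x0. The kernels, bandwidths, and exact weighting are illustrative and not the article's construction.

```python
import numpy as np

def nw(x0, x, y, h, w=None):
    """(Weighted) Nadaraya-Watson estimate of E[Y | X = x0] with a Gaussian kernel."""
    w = np.ones_like(x) if w is None else w
    k = w * np.exp(-0.5 * ((x - x0) / h) ** 2)
    return np.sum(k * y) / np.sum(k)

def ipw_m_hat(x0, x, y, observed, h_prop, h_reg):
    """Local-constant estimate of m(x0) from complete cases, reweighted by 1/pi_hat."""
    pi_hat = np.array([nw(xi, x, observed.astype(float), h_prop) for xi in x])
    keep = observed
    return nw(x0, x[keep], y[keep], h_reg, w=1.0 / pi_hat[keep])

rng = np.random.default_rng(6)
x = rng.uniform(-1, 1, 400)
y = 1.0 + x ** 2 + rng.normal(0, 0.2, 400)
observed = rng.random(400) < 1.0 / (1.0 + np.exp(-(1.0 + x)))   # MAR missingness
print(ipw_m_hat(0.5, x, y, observed, h_prop=0.3, h_reg=0.2))    # true m(0.5) = 1.25
```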
10.
In this article, several methods to make inferences about the parameters of a finite mixture of distributions in the context of centrally censored data with partial identification are considered. These methods are an adaptation of the work of Contreras-Cristán, Gutiérrez-Peña, and O'Reilly (2003), which dealt with right censoring. The first method focuses on an asymptotic approximation to a suitably simplified likelihood using some latent quantities; the second method is based on the expectation-maximization (EM) algorithm. Both methods make explicit use of latent variables and provide computationally efficient procedures compared to non-Bayesian methods that deal directly with the full likelihood of the mixture, appealing to its asymptotic approximation. The third method, from a Bayesian perspective, uses data augmentation to work with an uncensored sample. This last method is related to a recently proposed Bayesian method in Baker, Mengersen, and Davis (2005). The three adapted methods are shown to provide similar inferential answers, thus offering alternative analyses.
11.
Recently, a least absolute deviations (LAD) estimator for median regression models with doubly censored data was proposed and its asymptotic normality was established. However, inference on the regression parameter vector based on this asymptotic normality is impractical, because the asymptotic covariance matrix is difficult to estimate reliably: it involves the conditional densities of the error terms. In this article, three methods, based on the bootstrap, random weighting, and empirical likelihood, respectively, none of which requires density estimation, are proposed for making inference in doubly censored median regression models. Simulations are also done to assess the performance of the proposed methods.
12.
We discuss the maximum likelihood estimates (MLEs) of the parameters of the log-gamma distribution based on progressively Type-II censored samples. We use the profile likelihood approach to tackle the estimation of the shape parameter κ. We derive approximate maximum likelihood estimators of the parameters μ and σ and use them as initial values in the determination of the MLEs through the Newton–Raphson method. Next, we discuss the EM algorithm and propose a modified EM algorithm for the determination of the MLEs. A simulation study is conducted to evaluate the bias and mean square error of these estimators and to examine their behavior as the progressive censoring scheme and the shape parameter vary. We also discuss the interval estimation of the parameters μ and σ and show that the intervals based on the asymptotic normality of the MLEs have very poor coverage probabilities for small values of m. Finally, we present two examples to illustrate all the methods of inference discussed in this paper.
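The profile-likelihood idea can be sketched generically as below, using scipy's loggamma family as a stand-in for the log-gamma model and direct numerical maximization instead of the approximate-MLE, Newton–Raphson, and modified EM machinery of the article; the progressive Type-II censoring enters only through the survival terms for the withdrawn units. All names and the grid are illustrative.

```python
import numpy as np
from scipy import optimize, stats

def neg_loglik(theta, kappa, x, r):
    """Progressively Type-II censored log-likelihood: x are observed failure
    times and r[i] surviving units are withdrawn (censored) at x[i]."""
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)
    dens = stats.loggamma.logpdf(x, kappa, loc=mu, scale=sigma)
    surv = stats.loggamma.logsf(x[r > 0], kappa, loc=mu, scale=sigma)
    return -(np.sum(dens) + np.sum(r[r > 0] * surv))

def profile_neg_loglik(kappa, x, r):
    """Profile out (mu, sigma) for a fixed shape parameter kappa."""
    start = [np.mean(x), np.log(np.std(x) + 1e-6)]
    return optimize.minimize(neg_loglik, start, args=(kappa, x, r),
                             method="Nelder-Mead").fun

def profile_mle(x, r, kappa_grid):
    """Grid search over kappa; e.g. profile_mle(x_obs, r_scheme, np.linspace(0.5, 8, 30))."""
    values = [profile_neg_loglik(k, x, r) for k in kappa_grid]
    return kappa_grid[int(np.argmin(values))]
```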
13.
B. Singh 《Lifetime Data Analysis》2002,8(1):69-88
Inferences concerning exponential distributions are considered from a sampling-theory viewpoint when the data are randomly right censored and the censored values are missing. Both one-sample and m-sample (m ≥ 2) problems are considered. Likelihood functions are obtained for situations in which the censoring mechanism is informative, which leads to natural and intuitively appealing estimators of the unknown proportions of censored observations. For testing hypotheses about the unknown parameters, three well-known test statistics, namely the likelihood ratio, score, and Wald-type tests, are considered.
14.
《Communications in Statistics - Theory and Methods》2013,42(4):749-774
In this article, two methods are proposed to make inferences about the parameters of a finite mixture of distributions in the context of partially identifiable censored data. The first method focuses on a mixture of location and scale models and relies on an asymptotic approximation to a suitably constructed augmented likelihood; the second method provides a full Bayesian analysis of the mixture based on a Gibbs sampler. Both methods make explicit use of latent variables and provide computationally efficient procedures compared to other methods which deal directly with the likelihood of the mixture. This may be crucial if the number of components in the mixture is not small. Our proposals are illustrated on a classical example on failure times for communication devices first studied by Mendenhall and Hader (Mendenhall, W., Hader, R. J. (1958). Estimation of parameters of mixed exponentially distributed failure time distributions from censored life test data. Biometrika 45:504–520.). In addition, we study the coverage of the confidence intervals obtained from each of the methods by means of a small simulation exercise.
15.
In some applications, the failure time of interest is the time from an originating event to a failure event, while both event times are interval censored. We propose fitting Cox proportional hazards models to this type of data using a spline-based sieve maximum marginal likelihood, where the time to the originating event is integrated out in the empirical likelihood function of the failure time of interest. This greatly reduces the complexity of the objective function compared with the fully semiparametric likelihood. The dependence of the time of interest on the time to the originating event is induced by including the latter as a covariate in the proportional hazards model for the failure time of interest. The use of splines results in a higher rate of convergence of the estimator of the baseline hazard function compared with the usual non-parametric estimator. The computation of the estimator is facilitated by a multiple imputation approach. Asymptotic theory is established and a simulation study is conducted to assess its finite sample performance. The method is also applied to analyzing a real data set on AIDS incubation time.
16.
We show that under reasonable conditions the nonparametric maximum likelihood estimate (NPMLE) of the distribution function from left-truncated and case 1 interval-censored data is inconsistent, in contrast to the consistency properties of the NPMLE from only left-truncated data or only interval-censored data. However, the conditional NPMLE is shown to be consistent. Numerical examples are provided to illustrate their finite sample properties.
17.
Binhuan Wang 《Communications in Statistics - Theory and Methods》2014,43(15):3248-3268
In a continuous-scale diagnostic test, the receiver operating characteristic (ROC) curve is useful for evaluating the range of the sensitivity at the cut-off point that yields a desired specificity. Many current studies on inference for the ROC curve focus on the complete-data case. In this paper, an imputation-based profile empirical likelihood ratio for the sensitivity, which is free of bandwidth selection, is defined and shown to be asymptotically distributed as a scaled chi-square. Two new confidence intervals are proposed for the sensitivity with missing data. Simulation studies are conducted to evaluate the finite sample performance of the proposed intervals in terms of coverage probability. A real example is used to illustrate the new methods.
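For the complete-data case only, the quantity being studied can be computed and bootstrapped as in the sketch below; the imputation-based profile empirical likelihood for missing data is not reproduced, and a simple percentile bootstrap stands in for the proposed intervals. Names and the simulated scores are illustrative.

```python
import numpy as np

def sensitivity_at_specificity(healthy, diseased, specificity=0.9):
    """Sensitivity at the cut-off giving the target specificity (larger score = more diseased)."""
    cutoff = np.quantile(healthy, specificity)     # P(healthy score <= cutoff) = specificity
    return np.mean(diseased > cutoff)

def bootstrap_sensitivity_ci(healthy, diseased, specificity=0.9,
                             n_boot=2000, level=0.95, seed=0):
    rng = np.random.default_rng(seed)
    draws = [sensitivity_at_specificity(rng.choice(healthy, healthy.size),
                                        rng.choice(diseased, diseased.size),
                                        specificity)
             for _ in range(n_boot)]
    return np.quantile(draws, [(1 - level) / 2, (1 + level) / 2])

rng = np.random.default_rng(7)
healthy, diseased = rng.normal(0, 1, 200), rng.normal(1.2, 1, 150)
print(sensitivity_at_specificity(healthy, diseased))
print(bootstrap_sensitivity_ci(healthy, diseased))
```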
18.
This paper examines modeling and inference questions for experiments in which different subsets of a set of k possibly dependent components are tested in r different environments. In each environment, the failure times of the set of components on test are assumed to be governed by a particular type of multivariate exponential (MVE) distribution. For any given component tested in several environments, it is assumed that its marginal failure rate varies from one environment to another via a change of scale between the environments, resulting in a joint MVE model which links in a natural way the applicable MVE distributions describing component behavior in each fixed environment. This study thus extends the work of Proschan and Sullo (1976) to multiple environments and the work of Kvam and Samaniego (1993) to dependent data. The problem of estimating model parameters via the method of maximum likelihood is examined in detail. First, necessary and sufficient conditions for the identifiability of model parameters are established. We then treat the derivation of the MLE via a numerically augmented application of the EM algorithm. The feasibility of the estimation method is demonstrated in an example in which the likelihood ratio test of the hypothesis of equal component failure rates within any given environment is carried out.
19.
Birdal Şenoğlu 《Journal of Applied Statistics》2007,34(2):141-151
Estimators of the parameters are derived using the method of modified maximum likelihood (MML) estimation when the distribution of the covariate X and the error e are both non-normal in a simple analysis of covariance (ANCOVA) model. We show that our estimators are efficient. We also develop a test statistic for testing a linear contrast and show that it is robust. We give a real-life example.
20.
The maximum likelihood estimates (MLEs) of the parameters of a two-parameter lognormal distribution with left truncation and right censoring are developed through the expectation-maximization (EM) algorithm. For comparative purposes, the MLEs are also obtained by the Newton–Raphson method. The asymptotic variance-covariance matrix of the MLEs is obtained by using the missing information principle, under the EM framework. Then, using the asymptotic normality of the MLEs, asymptotic confidence intervals for the parameters are constructed. Asymptotic confidence intervals are also obtained using the variance of the MLEs estimated from the observed information matrix, and by using a parametric bootstrap technique. The different confidence intervals are then compared in terms of coverage probabilities through a Monte Carlo simulation study. A prediction problem concerning the future lifetime of a right-censored unit is also considered. A numerical example is given to illustrate all the inferential methods developed here.
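A minimal sketch of the likelihood being maximized (left truncation handled by conditioning, right censoring by survival terms), using direct numerical optimization rather than the EM steps, missing-information variance, or bootstrap machinery described above; variable names, starting values, and the simulated data are illustrative.

```python
import numpy as np
from scipy import optimize, stats

def neg_loglik(theta, y, delta, tau):
    """Left-truncated (at tau), right-censored lognormal log-likelihood.
    y: observed times, delta: 1 = failure observed, 0 = right censored."""
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)
    z = (np.log(y) - mu) / sigma
    zt = (np.log(tau) - mu) / sigma
    contrib = np.where(delta == 1,
                       stats.norm.logpdf(z) - np.log(sigma) - np.log(y),  # lognormal density term
                       stats.norm.logsf(z))                               # survival term for censored units
    return -(np.sum(contrib) - np.sum(stats.norm.logsf(zt)))              # condition on Y > tau

def fit_lognormal(y, delta, tau):
    start = [np.mean(np.log(y)), np.log(np.std(np.log(y)) + 1e-6)]
    res = optimize.minimize(neg_loglik, start, args=(y, delta, tau), method="Nelder-Mead")
    return res.x[0], np.exp(res.x[1])                                     # (mu_hat, sigma_hat)

rng = np.random.default_rng(8)
t = rng.lognormal(mean=1.0, sigma=0.5, size=2000)
tau = rng.uniform(0.5, 2.0, size=2000)
keep = t > tau                                            # left truncation: units with t <= tau are never seen
t, tau = t[keep], tau[keep]
c = tau + rng.exponential(4.0, size=t.size)               # right-censoring times after entry
y, delta = np.minimum(t, c), (t <= c).astype(int)
print(fit_lognormal(y, delta, tau))
```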