Similar documents
20 similar documents found (search took 15 ms)
1.
In this paper, we consider the maximum likelihood and Bayes estimation of the scale parameter of the half-logistic distribution based on a multiply Type II censored sample. However, neither the maximum likelihood estimator (MLE) nor the Bayes estimator of the scale parameter exists in an explicit form. We consider a simple method of deriving an explicit estimator by approximating the likelihood function, and discuss the asymptotic variances of the MLE and the approximate MLE. An approximation based on the Laplace method (Tierney & Kadane, 1986) is used to obtain the Bayes estimator. Monte Carlo simulation is used to compare the MLE, approximate MLE, and Bayes estimates of the scale parameter.
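As a minimal illustration of why numerical work is needed here, the following sketch computes the MLE of the half-logistic scale from a complete (uncensored) sample by maximizing the likelihood numerically; the multiply censored likelihood of the paper would add censoring terms. Sample size and the true scale are illustrative choices.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Simulate a complete sample from a half-logistic distribution with scale
# sigma: |X| where X is logistic(0, sigma).
rng = np.random.default_rng(0)
sigma_true = 2.0
x = np.abs(rng.logistic(loc=0.0, scale=sigma_true, size=2000))

def neg_log_lik(s, x):
    """Negative log-likelihood of the half-logistic scale s.
    pdf: f(x; s) = 2 exp(-x/s) / (s (1 + exp(-x/s))^2), x > 0."""
    z = x / s
    return -np.sum(np.log(2.0) - z - np.log(s) - 2.0 * np.log1p(np.exp(-z)))

res = minimize_scalar(neg_log_lik, bounds=(0.05, 50.0), args=(x,), method="bounded")
sigma_hat = res.x
print(f"MLE of sigma: {sigma_hat:.3f} (true value {sigma_true})")
```

The likelihood equation has no closed-form root, which is exactly what motivates the paper's explicit approximate MLE.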

2.
The non-central chi-squared distribution plays a vital role in statistical testing procedures, and estimation of the non-centrality parameter provides valuable information for the power calculation of the associated test. We are interested in the statistical inference properties of the non-centrality parameter estimate based on one observation (usually a summary statistic) from a truncated chi-squared distribution. This work is motivated by the application of the flexible two-stage design in case–control studies, where the sample size needed for the second stage of a two-stage study can be determined adaptively from the results of the first stage. We first study the moment estimate for the truncated distribution and establish its existence, uniqueness, inadmissibility, and convergence properties. We then define a new class of estimates that includes the moment estimate as a special case, and recommend one member of this class that outperforms the moment estimate in a wide range of scenarios. We also present two methods for constructing confidence intervals. Simulation studies are conducted to evaluate the performance of the proposed point and interval estimates.
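For intuition, a sketch of the moment estimate in the simpler, *untruncated* case: one observation x from chi-squared(df = k, non-centrality = lam) has mean k + lam, so the moment estimate is max(x − k, 0). The paper's truncated-distribution version modifies this; the values of k and lam below are illustrative.

```python
import numpy as np

# Moment estimate of the non-centrality parameter from a single
# chi-squared observation: lam_hat = max(x - k, 0).
rng = np.random.default_rng(1)
k, lam = 1, 5.0
x = rng.noncentral_chisquare(df=k, nonc=lam, size=20000)
lam_hat = np.maximum(x - k, 0.0)   # moment estimate for each simulated draw
print(f"mean moment estimate: {lam_hat.mean():.3f} (true lambda {lam})")
```

The truncation at zero makes the estimate slightly biased upward, one of the properties the paper analyzes formally.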

3.
It is shown that Strawderman's [1974. Minimax estimation of powers of the variance of a normal population under squared error loss. Ann. Statist. 2, 190–198] technique for estimating the variance of a normal distribution can be extended to estimating a general scale parameter in the presence of a nuisance parameter. Employing standard monotone likelihood ratio-type conditions, a new class of improved estimators for this scale parameter is derived under quadratic loss. By imposing an additional condition, a broader class of improved estimators is obtained. The dominating procedures are analogous in form to those of Strawderman [1974]. Application of the general results to the exponential distribution yields new sufficient conditions, other than those of Brewster and Zidek [1974. Improving on equivariant estimators. Ann. Statist. 2, 21–38] and Kubokawa [1994. A unified approach to improving equivariant estimators. Ann. Statist. 22, 290–299], for improving the best affine equivariant estimator of the scale parameter. A class of estimators satisfying the new conditions is constructed. The results shed new light on Strawderman's [1974] technique.

4.
In this paper, we make use of an algorithm of Huffer & Lin (2001) in order to develop exact prediction intervals for failure times from one-parameter and two-parameter exponential distributions based on doubly Type-II censored samples. We show that this method yields the same results as those of Lawless (1971, 1977) and Likeš (1974) in the case when the available sample is Type-II right censored. We present a computational algorithm for the determination of the exact percentage points of the pivotal quantities used in the construction of these prediction intervals. We also present tables of these percentage points for the prediction of the ℓth order statistic in a sample of size n for both one- and two-parameter exponential distributions, assuming that the available sample is doubly Type-II censored. Finally, we present two examples to illustrate the methods of inference developed here.
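A simulation sketch of the right-censored special case, assuming the one-parameter exponential: with the sample Type-II censored at the r-th failure, the pivot W = (X_(l) − X_(r)) / theta_hat is scale-free, so its percentage points can be tabulated once and reused. The paper derives these points exactly; here Monte Carlo stands in for the exact computation, and n, r, l are illustrative.

```python
import numpy as np

# Monte Carlo percentage points of the pivot for predicting the l-th order
# statistic of an exponential sample Type-II right censored at the r-th
# failure. theta_hat is the usual censored-sample MLE of the scale.
rng = np.random.default_rng(2)
n, r, l, reps = 20, 10, 15, 100_000
x = np.sort(rng.exponential(1.0, size=(reps, n)), axis=1)  # scale-free: theta = 1
theta_hat = (x[:, :r].sum(axis=1) + (n - r) * x[:, r - 1]) / r
w = (x[:, l - 1] - x[:, r - 1]) / theta_hat
lo, hi = np.quantile(w, [0.025, 0.975])
print(f"95% prediction interval: [x_(r) + {lo:.3f} * theta_hat, x_(r) + {hi:.3f} * theta_hat]")
```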

5.
This paper studies the estimation in the proportional odds model based on randomly truncated data. The proposed estimators for the regression coefficients include a class of minimum distance estimators defined through weighted empirical odds function. We have investigated the asymptotic properties like the consistency and the limiting distribution of the proposed estimators under mild conditions. The finite sample properties were investigated through simulation study making comparison of some of the estimators in the class. We conclude with an illustration of our proposed method to a well-known AIDS data.  相似文献   

6.
In this paper, we consider the problem of testing the equality of two distributions when both samples are progressively Type-II censored. We discuss the following two statistics: one based on the Wilcoxon-type rank-sum precedence test, and the second based on the Kaplan–Meier estimator of the cumulative distribution function. The exact null distributions of these test statistics are derived and are then used to generate critical values and the corresponding exact levels of significance for different combinations of sample sizes and progressive censoring schemes. We also discuss their non-null distributions under Lehmann alternatives. A power study of the proposed tests is carried out under Lehmann alternatives as well as under location-shift alternatives through Monte Carlo simulations. Through this power study, it is shown that the Wilcoxon-type rank-sum precedence test performs the best.
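To convey the flavor of a precedence-type statistic, a sketch for the simplest (uncensored) setting: P_r counts the X-sample failures preceding the r-th Y-sample failure. Under the null of equal distributions it is distribution-free, so uniforms suffice for simulation; the paper instead derives exact null distributions for its progressively censored rank statistics. Sample sizes and r are illustrative.

```python
import numpy as np

# Null distribution of the precedence statistic P_r = #{X_i < Y_(r)}.
rng = np.random.default_rng(3)
n1, n2, r, reps = 15, 15, 5, 50_000
xs = rng.random((reps, n1))
ys = np.sort(rng.random((reps, n2)), axis=1)
p_r = (xs < ys[:, r - 1, None]).sum(axis=1)
# small P_r supports the alternative that the X population fails later
crit = np.quantile(p_r, 0.05)
print("lower 5% critical value of P_r:", crit)
```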

7.
8.
9.
In this paper, we make use of an algorithm of Huffer and Lin (2000) in order to develop exact interval estimation for the scale parameter of an exponential distribution based on doubly Type-II censored samples. We also evaluate the accuracy of a chi-square approximation proposed by Balakrishnan and Gupta (1998). We present the MAPLE program for the determination of the exact percentage points of the pivotal quantity based on the best linear unbiased estimator. Finally, we present a couple of examples to illustrate the method of inference developed here.
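For contrast with the doubly censored case treated in the paper, under Type-II *right* censoring the interval is exact and elementary: 2T/theta follows a chi-squared distribution with 2r degrees of freedom, where T sums the r observed failures plus (n − r) copies of the largest one. A sketch, with n, r, and the true scale as illustrative assumptions:

```python
import numpy as np
from scipy.stats import chi2

# Exact CI for the exponential scale theta under Type-II right censoring,
# from the pivot 2*T/theta ~ chi-squared(2r).
rng = np.random.default_rng(4)
n, r, theta_true, alpha = 25, 15, 3.0, 0.05
x = np.sort(rng.exponential(theta_true, size=n))
t = x[:r].sum() + (n - r) * x[r - 1]          # total time on test
lo = 2 * t / chi2.ppf(1 - alpha / 2, 2 * r)
hi = 2 * t / chi2.ppf(alpha / 2, 2 * r)
print(f"exact 95% CI for theta: ({lo:.3f}, {hi:.3f})")
```

Under double censoring no such simple pivot is available, which is why the paper needs the Huffer–Lin algorithm and evaluates a chi-square approximation.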

10.
We propose a new procedure for combining multiple tests in samples of right-censored observations. The new method is based on multiple constrained censored empirical likelihood, where the constraints are formulated as linear functionals of the cumulative hazard functions. We prove a version of Wilks' theorem for the multiple constrained censored empirical likelihood ratio, which provides a simple reference distribution for the test statistic of our proposed method. A useful application of the proposed method is, for example, examining the survival experience of different populations by combining different weighted log-rank tests. Real data examples are given using the log-rank and Gehan-Wilcoxon tests. In a simulation study of two-sample survival data, we compare the proposed method of combining tests to previously developed procedures. The results demonstrate that, in addition to its computational simplicity, the combined test performs comparably to, and in some situations more reliably than, previously developed procedures. Statistical software is available in the R package 'emplik'.

11.
The Shapiro–Wilk statistic and its modifications are widely used test statistics for normality, based on regression and correlation. The statistics for complete data can be easily generalized to censored data. In this paper, the distribution theory for the modified Shapiro–Wilk statistic is investigated when it is generalized to Type II right censored data. As a result, it is shown that the limit distribution of the statistic is representable as the integral of a Brownian bridge. A power comparison with another procedure is also performed.
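The complete-data statistic is available off the shelf; the censored generalization studied in the paper, which restricts the regression statistic to the uncensored order statistics, is not in scipy. A quick complete-data sketch:

```python
import numpy as np
from scipy.stats import shapiro

# Shapiro-Wilk normality test on a normal and a skewed sample.
rng = np.random.default_rng(5)
normal_sample = rng.normal(size=100)
skewed_sample = rng.exponential(size=100)
w_n, p_n = shapiro(normal_sample)
w_e, p_e = shapiro(skewed_sample)
print(f"normal sample:      W = {w_n:.3f}, p = {p_n:.3f}")
print(f"exponential sample: W = {w_e:.3f}, p = {p_e:.3f}")
```

Values of W near 1 support normality; the exponential sample is decisively rejected.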

12.
13.
In a recent paper in this journal, Lee, Kapadia and Brock (1980) developed maximum likelihood (ML) methods for estimating the scale parameter of the Rayleigh distribution from doubly censored samples. They reported convergence difficulties in attempting to solve the nonlinear likelihood equation (LE) numerically. To mitigate these difficulties, they employed approximations to simplify the LE, but found that the solution of the resulting simplified equation can give rise to parameter estimates of erratic accuracy. We show that the use of approximations to simplify the LE is unnecessary. In fact, under a suitable parametric transformation, the log-likelihood function is strictly concave, and the ML estimate always exists and is unique and finite. Furthermore, the LE is easy to solve numerically. A numerical example is given to illustrate the computations involved.
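A sketch of the direct numerical approach the abstract advocates: maximize the doubly censored Rayleigh log-likelihood over eta = log(sigma) (a transformation under which the problem is well behaved), with no approximation to the likelihood equation. The censoring numbers and true scale are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Doubly Type-II censored Rayleigh sample: the r1 smallest and r2 largest
# order statistics are censored; only the middle ones are observed.
rng = np.random.default_rng(6)
n, r1, r2, sigma_true = 100, 10, 10, 2.0
x = np.sort(rng.rayleigh(sigma_true, size=n))
obs = x[r1:n - r2]

def neg_log_lik(eta):
    """Censored log-likelihood in eta = log(sigma).
    Rayleigh: f = (x/s^2) exp(-x^2/(2 s^2)), F = 1 - exp(-x^2/(2 s^2))."""
    s2 = np.exp(2.0 * eta)
    log_f = np.log(obs) - np.log(s2) - obs**2 / (2.0 * s2)
    log_F_left = np.log1p(-np.exp(-obs[0]**2 / (2.0 * s2)))   # r1 left-censored
    log_S_right = -obs[-1]**2 / (2.0 * s2)                    # r2 right-censored
    return -(r1 * log_F_left + log_f.sum() + r2 * log_S_right)

res = minimize_scalar(neg_log_lik, bounds=(-3.0, 3.0), method="bounded")
sigma_hat = np.exp(res.x)
print(f"ML estimate of sigma: {sigma_hat:.3f} (true value {sigma_true})")
```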

14.
15.
We study moderate deviations for the maximum likelihood estimation of some inhomogeneous diffusions. The moderate deviation principle with explicit rate functions is obtained. Moreover, we apply our result to parameter estimation in α-Wiener bridges.

16.
A modified large-sample (MLS) approach and a generalized confidence interval (GCI) approach are proposed for constructing confidence intervals for intraclass correlation coefficients. Two particular intraclass correlation coefficients are considered in a reliability study. Both subjects and raters are assumed to be random effects in a balanced two-factor design, which includes subject-by-rater interaction. Computer simulation is used to compare the coverage probabilities of the proposed MLS approach (GiTTCH) and GCI approaches with the method of Leiva and Graybill [1986. Confidence intervals for variance components in the balanced two-way model with interaction. Comm. Statist. Simulation Comput. 15, 301–322]. The competing approaches are illustrated with data from a gauge repeatability and reproducibility study. The GiTTCH method maintains at least the stated confidence level for interrater reliability; for intrarater reliability, its coverage is accurate in several circumstances but can be liberal in others. The GCI approach provides reasonable coverage for lower confidence bounds on interrater reliability, but its corresponding upper bounds are too liberal, and for intrarater reliability it is not recommended because the lower bound coverage is liberal. Comparing the overall performance of the three methods across a wide array of scenarios, the proposed MLS approach (GiTTCH) provides the most accurate coverage for both interrater and intrarater reliability.

17.
In the present article, we study the estimation of the entropy, that is, a function ln σ of the scale parameter σ of an exponential distribution, based on a doubly censored sample when the location parameter is restricted to the positive real line. The estimation problem is studied under a general class of bowl-shaped, non-monotone, location-invariant loss functions. It is established that the best affine equivariant estimator (BAEE) is inadmissible by deriving an improved estimator. This estimator is non-smooth; we further obtain a smooth improved estimator. A class of estimators is considered and sufficient conditions are derived under which these estimators improve upon the BAEE. In particular, using these results we obtain improved estimators for the squared error and linex loss functions. Finally, we compare the risk performance of the proposed estimators numerically. A data analysis is performed for illustrative purposes.

18.
This paper is concerned with a Bayesian construction of prediction limits for the Weibull distribution as an example of an extreme value distribution. Considering Weibull and Uniform distributions for the parameters, the predictive functions, which may lead to an approximate evaluation of the prediction limits, are determined using simulation methods.
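A simulation sketch in the same spirit, assuming independent Uniform priors on the Weibull shape k and scale lam: approximate the posterior on a grid, sample parameter pairs from it, then draw one future observation per pair to get the posterior predictive, whose quantile gives a prediction limit. Grid ranges, prior supports, and sample sizes are illustrative assumptions, not the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(7)
data = rng.weibull(1.5, size=30) * 2.0           # "observed" data: k = 1.5, lam = 2

# grid posterior under Uniform priors on (k, lam)
ks = np.linspace(0.5, 4.0, 80)
lams = np.linspace(0.5, 5.0, 80)
K, L = np.meshgrid(ks, lams, indexing="ij")
loglik = np.zeros_like(K)
for x in data:                                   # Weibull log-density, summed over data
    loglik += np.log(K / L) + (K - 1) * np.log(x / L) - (x / L) ** K
post = np.exp(loglik - loglik.max())
post /= post.sum()

# sample parameters from the posterior, then one future draw per sample
idx = rng.choice(post.size, size=20000, p=post.ravel())
k_s, lam_s = K.ravel()[idx], L.ravel()[idx]
y_future = rng.weibull(k_s) * lam_s
upper_95 = np.quantile(y_future, 0.95)
print(f"95% upper prediction limit: {upper_95:.3f}")
```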

19.
The study of differences among groups is an interesting statistical topic in many applied fields. It is very common in this context to have data that are subject to mechanisms of loss of information, such as censoring and truncation. In the setting of a two-sample problem with data subject to left truncation and right censoring, we develop an empirical likelihood method to do inference for the relative distribution. We obtain a nonparametric generalization of Wilks' theorem and construct nonparametric pointwise confidence intervals for the relative distribution. Finally, we analyse the coverage probability and length of these confidence intervals through a simulation study and illustrate their use with a real data set on gastric cancer. The Canadian Journal of Statistics 38: 453–473; 2010 © 2010 Statistical Society of Canada

20.
The mean vector associated with several independent variates from the exponential subclass of Hudson (1978) is estimated under weighted squared error loss. In particular, the formal Bayes and “Stein-like” estimators of the mean vector are given. Conditions are also given under which these estimators dominate any of the “natural estimators”. Our conditions for dominance are motivated by a result of Stein (1981), who treated the Np(θ, I) case with p ≥ 3. Stein showed that formal Bayes estimators dominate the usual estimator if the marginal density of the data is superharmonic. Our present exponential class generalization entails an elliptic differential inequality in some natural variables. Actually, we assume that each component of the data vector has a probability density function which satisfies a certain differential equation. While the densities of Hudson (1978) are particular solutions of this equation, other solutions are not of the exponential class if certain parameters are unknown. Our approach allows for the possibility of extending the parametric Stein-theory to useful nonexponential cases, but the problem of nuisance parameters is not treated here.
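The Np(θ, I), p ≥ 3 special case of Stein (1981) cited above is easy to see by simulation: the James-Stein shrinkage estimator dominates the natural estimator X under squared error loss. The dimension, mean vector, and replication count below are illustrative.

```python
import numpy as np

# Monte Carlo risks of X versus the James-Stein estimator for N_p(theta, I).
rng = np.random.default_rng(8)
p, reps = 10, 20000
theta = np.full(p, 1.0)
x = rng.normal(theta, 1.0, size=(reps, p))
norm2 = (x ** 2).sum(axis=1)
js = (1.0 - (p - 2) / norm2)[:, None] * x        # shrink X toward the origin
risk_mle = ((x - theta) ** 2).sum(axis=1).mean() # should be close to p
risk_js = ((js - theta) ** 2).sum(axis=1).mean()
print(f"risk of X: {risk_mle:.3f}, risk of James-Stein: {risk_js:.3f}")
```

The dominance is uniform in theta for p ≥ 3; the abstract's exponential-class results generalize exactly this phenomenon.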


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司)  京ICP备09084417号