Similar Literature
20 similar articles found (search time: 15 ms)
1.
In this note, it is proved that, in computing percentage points of a distribution, Halley's method yields convergent solutions under most conditions, and the efficiency of the method is demonstrated with some examples.
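Concretely, to solve F(x) = p one sets g(x) = F(x) - p and applies Halley's update x ← x - 2gg'/(2g'^2 - gg''), where g' is the density f and g'' = f'. A minimal sketch, assuming the standard normal as the target distribution (the note's result is more general); the function names are ours:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def norm_pdf(x):
    """Standard normal density."""
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def halley_quantile(p, x0=0.0, tol=1e-12, max_iter=50):
    """Solve F(x) = p by Halley's method; for the normal pdf, f' = -x f."""
    x = x0
    for _ in range(max_iter):
        g = norm_cdf(x) - p      # g(x)  = F(x) - p
        f = norm_pdf(x)          # g'(x) = f(x)
        fp = -x * f              # g''(x) = f'(x)
        step = 2.0 * g * f / (2.0 * f * f - g * fp)
        x -= step
        if abs(step) < tol:
            break
    return x
```

Because the correction uses second-order information, convergence near the root is cubic, so a handful of iterations usually suffices for double precision.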

2.
Use of Newton's method for computing the noncentrality parameter from a specified power in sample size problems for chi-squared tests requires evaluating both the noncentral chi-squared distribution function and its derivative with respect to the noncentrality parameter. A close relationship between the computing formulas for the two is revealed, by which their evaluations can be performed jointly. This property greatly reduces the amount of computation involved. The corresponding algorithm is provided in step-by-step form.
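The kind of relationship alluded to is the known identity dF(x; k, λ)/dλ = -½[F(x; k, λ) - F(x; k+2, λ)], which lets a Newton iteration obtain the derivative from the same CDF machinery. The following is our own sketch built on that identity (with a power series for the regularized incomplete gamma function), not the paper's step-by-step algorithm:

```python
import math

def reg_lower_gamma(a, x, tol=1e-14):
    """Regularized lower incomplete gamma P(a, x) via its power series."""
    if x <= 0.0:
        return 0.0
    term = 1.0 / a
    total = term
    n = 0
    while term > tol * total:
        n += 1
        term *= x / (a + n)
        total += term
    return total * math.exp(-x + a * math.log(x) - math.lgamma(a))

def chi2_ppf(q, k):
    """Central chi-squared quantile by bisection on the CDF."""
    lo, hi = 0.0, 1.0
    while reg_lower_gamma(k / 2.0, hi / 2.0) < q:
        hi *= 2.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if reg_lower_gamma(k / 2.0, mid / 2.0) < q:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def ncx2_cdf(x, k, lam, terms=500):
    """Noncentral chi-squared CDF as a Poisson(lam/2) mixture of central CDFs."""
    w = math.exp(-lam / 2.0)     # Poisson weight at j = 0
    s = 0.0
    for j in range(terms):
        s += w * reg_lower_gamma(k / 2.0 + j, x / 2.0)
        w *= (lam / 2.0) / (j + 1.0)
    return s

def noncentrality_for_power(power, k, alpha=0.05, lam=5.0, tol=1e-10):
    """Newton's method for lam; the derivative reuses the same CDF series:
    dF/dlam = -0.5 * (F(c; k, lam) - F(c; k + 2, lam))."""
    c = chi2_ppf(1.0 - alpha, k)
    for _ in range(100):
        f_k = ncx2_cdf(c, k, lam)
        f_k2 = ncx2_cdf(c, k + 2, lam)
        g = (1.0 - f_k) - power        # achieved power minus target
        dg = 0.5 * (f_k - f_k2)        # derivative of achieved power
        step = g / dg
        lam = max(lam - step, 1e-8)    # keep the iterate positive
        if abs(step) < tol:
            break
    return lam
```

For a 1-df test at α = 0.05 and power 0.80 this recovers the familiar λ ≈ 7.85, i.e. (z_{0.975} + z_{0.80})², at the cost of two CDF evaluations per Newton step.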

3.
This article gives an efficient computer algorithm for a certain nonparametric regression method based on Kendall's rank correlation statistic. The method applies to experimental designs for which the set of covariates exhibits certain orthogonality properties and the dependent variable is continuous. Testing, point and interval estimation, and ties are discussed.

4.
Two approximation methods are used to obtain the Bayes estimate of the renewal function of an inverse Gaussian renewal process. Both approximations use a gamma-type conditional prior for the location parameter, a non-informative marginal prior for the shape parameter, and a squared error loss function. Simulations compare the accuracy of the estimators and indicate that the Tierney and Kadane (T-K)-based estimator outperforms the maximum likelihood (ML)- and Lindley (L)-based estimators. Computations for the T-K-based Bayes estimate employ the generalized Newton's method as well as a recent modified Newton's method with cubic convergence to maximize modified likelihood functions. The program is available from the author.

5.
Receiver Operating Characteristic curves and the Area Under the Curve (AUC) are widely used to evaluate the predictive accuracy of diagnostic tests. Parametric methods of estimating AUCs are well established, while nonparametric methods, such as Wilcoxon's method, have received comparatively little study. This study considered three standard error techniques, namely the Hanley and McNeil, Hanley and Tilaki, and DeLong methods. Several parameters were considered, with the predictor measured on a binary scale. Normality and the type I error rate were violated for Hanley and McNeil's method, while DeLong's method performed better asymptotically. Hanley and Tilaki's jackknife method and DeLong's method performed equally well.
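For reference, the nonparametric (Mann-Whitney/Wilcoxon) AUC and DeLong's components-based standard error can be computed directly. This is a generic sketch of those standard formulas, not the simulation code of the study; the function name is ours:

```python
import math

def auc_delong(pos, neg):
    """Nonparametric (Mann-Whitney) AUC with DeLong's standard error.

    pos, neg: diagnostic scores for diseased and non-diseased subjects."""
    m, n = len(pos), len(neg)

    def psi(x, y):
        # Mann-Whitney kernel: 1 for a correct ordering, 0.5 for a tie
        return 1.0 if x > y else (0.5 if x == y else 0.0)

    v10 = [sum(psi(x, y) for y in neg) / n for x in pos]  # positive components
    v01 = [sum(psi(x, y) for x in pos) / m for y in neg]  # negative components
    auc = sum(v10) / m
    s10 = sum((v - auc) ** 2 for v in v10) / (m - 1)
    s01 = sum((v - auc) ** 2 for v in v01) / (n - 1)
    se = math.sqrt(s10 / m + s01 / n)
    return auc, se
```

The per-subject components v10 and v01 are what make DeLong's variance estimate consistent without any distributional assumption on the scores.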

6.
In this paper a new method called the EMS algorithm is used to solve Wicksell's corpuscle problem, that is, the determination of the distribution of sphere radii in a medium given the radii of their profiles in a random slice. The EMS algorithm combines the EM algorithm, a procedure for obtaining maximum likelihood estimates of parameters from incomplete data, with simple smoothing. The method is tested on simulated data from three different sphere radii densities, namely a bimodal mixture of normals, a Weibull and a normal. The effect of varying the level of smoothing, the number of classes in which the data are binned and the number of classes for which the estimated density is evaluated is investigated. Comparisons are made between these results and those obtained by others in this field.

7.
In this paper we consider a simple linear regression model under heteroscedasticity and nonnormality. A statistical test for the regression coefficient is derived by assuming normality for the random disturbances and applying Welch's method. Monte Carlo studies are carried out to assess the robustness of this test. By combining Tiku's robust procedure with the new test, a robust but more powerful test is developed.

8.
We respond to criticism leveled at bootstrap confidence intervals for the correlation coefficient by recent authors by arguing that in the correlation coefficient case, non-standard methods should be employed. We propose two such methods. The first is a bootstrap coverage correction algorithm using iterated bootstrap techniques (Hall, 1986; Beran, 1987a; Hall and Martin, 1988) applied to ordinary percentile-method intervals (Efron, 1979), giving intervals with high coverage accuracy and stable lengths and endpoints. The simulation study carried out for this method gives results for sample sizes 8, 10, and 12 in three parent populations. The second technique involves the construction of percentile-t bootstrap confidence intervals for a transformed correlation coefficient, followed by an inversion of the transformation, to obtain "transformed percentile-t" intervals for the correlation coefficient. In particular, Fisher's z-transformation is used, and nonparametric delta method and jackknife variance estimates are used to Studentize the transformed correlation coefficient, with the jackknife-Studentized transformed percentile-t interval yielding the better coverage accuracy in general. Percentile-t intervals constructed without first using the transformation perform very poorly, having large expected lengths and erratically fluctuating endpoints. The simulation study illustrating this technique gives results for sample sizes 10, 15 and 20 in four parent populations. Our techniques provide confidence intervals for the correlation coefficient which have good coverage accuracy (unlike ordinary percentile intervals) and stable lengths and endpoints (unlike ordinary percentile-t intervals).
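The second technique can be sketched roughly as follows, using Pearson's correlation, Fisher's z, and a jackknife variance estimate as described; the resampling details (number of resamples, quantile convention, degeneracy guards) are our own choices, not the article's:

```python
import math
import random

def corr(x, y):
    """Sample Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def fisher_z(r):
    """Fisher's z-transformation, clamped away from r = +/-1."""
    r = max(-0.999999, min(0.999999, r))
    return 0.5 * math.log((1.0 + r) / (1.0 - r))

def jackknife_se_z(x, y):
    """Jackknife standard error of the z-transformed correlation."""
    n = len(x)
    zs = [fisher_z(corr(x[:i] + x[i + 1:], y[:i] + y[i + 1:]))
          for i in range(n)]
    zbar = sum(zs) / n
    return math.sqrt((n - 1) / n * sum((z - zbar) ** 2 for z in zs))

def transformed_percentile_t_ci(x, y, b=1000, alpha=0.05, seed=0):
    """Percentile-t interval on the z scale, inverted back to the r scale."""
    rng = random.Random(seed)
    n = len(x)
    z_hat = fisher_z(corr(x, y))
    se_hat = jackknife_se_z(x, y)
    t_stats = []
    while len(t_stats) < b:
        idx = [rng.randrange(n) for _ in range(n)]
        xb, yb = [x[i] for i in idx], [y[i] for i in idx]
        if len(set(xb)) < 3:                 # skip degenerate resamples
            continue
        se_b = jackknife_se_z(xb, yb)
        if se_b == 0.0:
            continue
        t_stats.append((fisher_z(corr(xb, yb)) - z_hat) / se_b)
    t_stats.sort()
    t_lo = t_stats[int(math.ceil((1.0 - alpha / 2.0) * b)) - 1]  # upper quantile
    t_hi = t_stats[int(math.floor((alpha / 2.0) * b))]           # lower quantile
    # invert the z-transformation (tanh) to return to the correlation scale
    return math.tanh(z_hat - t_lo * se_hat), math.tanh(z_hat - t_hi * se_hat)
```

Studentizing on the z scale is what stabilizes the endpoints: the transformation makes the statistic's variance nearly constant in r, so extreme bootstrap t-values no longer blow up the interval length.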

9.
The authors describe Bayesian estimation for the parameters of the bivariate gamma distribution due to Kibble (1941). The density of this distribution can be written as a mixture, which allows for a simple data augmentation scheme. The authors propose a Markov chain Monte Carlo algorithm to facilitate estimation. They show that the resulting chain is geometrically ergodic, and thus a regenerative sampling procedure is applicable, which allows for estimation of the standard errors of the ergodic means. They develop Bayesian hypothesis testing procedures to test both the dependence hypothesis of the two variables and the hypothesis of equal means. They also propose a reversible jump Markov chain Monte Carlo algorithm to carry out the model selection problem. Finally, they use sets of real and simulated data to illustrate their methodology.

10.
Several methods exist for testing interaction in unreplicated two-way layouts. Some are based on specifying a functional form for the interaction term and perform well provided that the functional form is appropriate. Other methods do not require such a functional form to be specified, but only test for the presence of non-additivity and do not provide a suitable estimate of error variance for a non-additive model. This paper presents a method for testing for interaction in unreplicated two-way tables that is based on testing all pairwise interaction contrasts. This method (i) is easy to implement, (ii) does not assume a functional form for the interaction term, (iii) can find a sub-table of the data that may be free from interaction, on which to base the estimate of the unknown error variance, and (iv) can be used for incomplete two-way layouts. The proposed method is illustrated using examples and its power is investigated via simulation studies. Simulation results show that the proposed method is competitive with existing methods for testing for interaction in unreplicated two-way layouts.

11.
The balanced iterative reducing and clustering using hierarchies (BIRCH) algorithm handles massive datasets by reading the data file only once, clustering the data as it is read, and retaining only a few clustering features to summarize the data read so far. Using BIRCH makes it possible to analyse datasets that are too large to fit in the computer's main memory. We propose estimates of Spearman's ρ and Kendall's τ that are calculated from a BIRCH output and assess their performance through Monte Carlo studies. The numerical results show that the BIRCH-based estimates can achieve the same efficiency as the usual estimates of ρ and τ while using only a fraction of the memory otherwise required.

12.
In this article, we investigate bootstrap-calibrated generalized confidence limits for the process capability index Cpk in the one-way random effect model. We also derive Bissell's approximation formula for the lower confidence limit using Satterthwaite's method and calculate its coverage probabilities and expected values. We then compare it with the standard bootstrap (SB) method and the generalized confidence interval method. The simulation results indicate that the confidence limit obtained offers satisfactory coverage probabilities. The proposed method is illustrated with the help of simulation studies and data sets.

13.
The multiple inference character of several tests in the same application is usually taken into consideration by requiring that the tests have a multiple level of significance. Likewise, a prediction problem in an application with several possible predictor variables requires that the multiple inference character of the problem be considered. This is not being done in the methods commonly used to choose predictor variables. Here, we discuss both the test and prediction methods in two-level factorial designs and suggest a principle for choosing variables which is based on multiple inference thinking. By an example we demonstrate that the proposed principle leads to the use of fewer prediction variables than does the Akaike method.

14.
In discriminant analysis, the dimension of the hyperplane which the population mean vectors span is called the dimensionality. The procedures commonly used to estimate this dimension involve testing a sequence of dimensionality hypotheses as well as model fitting approaches based on (consistent) Akaike's method, (modified) Mallows' method and Schwarz's method. The marginal log-likelihood (MLL) method is developed and the asymptotic distribution of the dimensionality estimated by this method for normal populations is derived. Furthermore, a modified marginal log-likelihood (MMLL) method is also considered. The MLL method is not consistent for large samples, and two modified criteria are proposed which attain asymptotic consistency. Some comments are made with regard to the robustness of this method to departures from normality. The operating characteristics of the various methods proposed are examined and compared.

15.
The moment-generating function method, proposed by Tierney et al. [1989a. Fully exponential Laplace approximations to expectations and variances of nonpositive functions. J. Amer. Statist. Assoc. 84, 710–716], is an asymptotic technique for approximating the posterior mean of a general function by approximating the moment-generating function (MGF) and then differentiating it. In this article, we give approximations to posterior means and variances by combining the MGF method and Laplace approximations with asymptotic modes. We prove that the asymptotic errors of the approximate means and variances are of order n^{-2} and n^{-3}, respectively. Our approximation is closely related to a standard-form approximation and is given without evaluating the exact posterior mode or third derivatives of the log-likelihood function. The MGF method also improves the numerical instability of the fully exponential Laplace approximation for a predictive mean in logistic regression.

16.
If the experimental design (or lack of design) results in a model which is not of full rank, the problem of variable selection becomes rather complex. This is due to the fact that, in less than full rank models, not every linear combination of regression parameters is estimable. In this paper, we present a procedure for testing all "testable" subsets of a complete set of regression parameters, using a technique based on Scheffé's method (1959). A class of "adequate" subsets of regression parameters is obtained in a manner similar to that of Aitkin (1974). The proposed procedure is illustrated with an example.

17.
In this article, we use Stein's method and w-functions to give uniform and non-uniform bounds in the geometric approximation of a non-negative integer-valued random variable. We give some applications of these approximation results concerning the beta-geometric, Pólya, and Poisson distributions.

18.
The weighted kappa coefficient of a binary diagnostic test is a measure of the beyond-chance agreement between the diagnostic test and the gold standard, and is a measure that allows us to assess and compare the performance of binary diagnostic tests. In the presence of partial disease verification, the comparison of the weighted kappa coefficients of two or more binary diagnostic tests cannot be carried out ignoring the individuals with an unknown disease status, since the estimators obtained would be affected by verification bias. In this article, we propose a global hypothesis test based on the chi-square distribution to simultaneously compare the weighted kappa coefficients when in the presence of partial disease verification the missing data mechanism is ignorable. Simulation experiments have been carried out to study the type I error and the power of the global hypothesis test. The results have been applied to the diagnosis of coronary disease.

19.
We consider the problem of making statistical inference on unknown parameters of a lognormal distribution under the assumption that samples are progressively censored. The maximum likelihood estimates (MLEs) are obtained by using the expectation-maximization algorithm. The observed and expected Fisher information matrices are provided as well. Approximate MLEs of unknown parameters are also obtained. Bayes and generalized estimates are derived under squared error loss function. We compute these estimates using Lindley's method as well as importance sampling method. Highest posterior density interval and asymptotic interval estimates are constructed for unknown parameters. A simulation study is conducted to compare proposed estimates. Further, a data set is analysed for illustrative purposes. Finally, optimal progressive censoring plans are discussed under different optimality criteria and results are presented.

20.
Nonparametric methods, Theil's method and Hussain's method, have been applied to simple linear regression problems for estimating the slope of the regression line. We extend these methods and propose a robust estimator of the coefficient of a first-order autoregressive process under various distribution shapes. A simulation study comparing Theil's estimator, Hussain's estimator, the least squares estimator, and the proposed estimator is also presented.
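Theil's estimator, and one natural way of carrying it over to an AR(1) coefficient via lag-1 pairs, can be sketched as follows; the lag-1 pairing scheme here is an illustrative assumption, not necessarily the authors' exact proposal:

```python
import statistics

def theil_slope(x, y):
    """Theil's estimator: median of all pairwise slopes (y_j - y_i)/(x_j - x_i)."""
    slopes = [(y[j] - y[i]) / (x[j] - x[i])
              for i in range(len(x))
              for j in range(i + 1, len(x))
              if x[j] != x[i]]
    return statistics.median(slopes)

def theil_ar1(series):
    """Illustrative extension: apply Theil's estimator to the lag-1 pairs
    (x_{t-1}, x_t) to estimate a first-order autoregressive coefficient."""
    lagged = series[:-1]
    current = series[1:]
    return theil_slope(lagged, current)
```

Because the estimate is a median of pairwise slopes, a single wild observation perturbs only a minority of the pairs, which is the source of the robustness to distribution shape that the simulation study examines.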
