Similar Articles
 8 similar articles found (search time: 15 ms)
1.
The test of the hypothesis of equal means of two normal populations without assumption on the variances is usually referred to as the Behrens–Fisher Problem. Exact similar tests are known not to exist. However, excellent approximately similar “solutions” are readily available. Of these available tests and corresponding critical regions, those due to Welch, Aspin, and Trickett in the 1940s and 1950s come closest to achieving similarity. This article examines numerically the Welch–Aspin asymptotic series and the related Trickett–Welch integral equation formulations of this problem. Through examples, we illustrate that well-behaved tests can deviate from similarity by an almost incredibly small amount. Despite this, with much more extensive computation than was feasible a half-century ago, we can see irregularities which could be an empirical reflection of the known nonexistence of exact solutions.
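The Welch solution referenced above can be sketched in a few lines. This is a minimal illustration of the first-order Welch statistic with Satterthwaite degrees of freedom (not the higher-order Welch–Aspin series the article studies); the data are made up for illustration:

```python
import math
from statistics import mean, variance

def welch_t(x, y):
    """Welch's approximate t statistic and Satterthwaite degrees of freedom
    for testing equality of means without assuming equal variances."""
    n1, n2 = len(x), len(y)
    v1, v2 = variance(x) / n1, variance(y) / n2  # squared standard errors
    t = (mean(x) - mean(y)) / math.sqrt(v1 + v2)
    # Welch-Satterthwaite approximation to the null degrees of freedom
    df = (v1 + v2) ** 2 / (v1 ** 2 / (n1 - 1) + v2 ** 2 / (n2 - 1))
    return t, df

t, df = welch_t([5.1, 4.9, 5.4, 5.0, 5.2], [4.1, 4.6, 3.9, 4.4])
```

The resulting `t` is referred to a t distribution on the (generally fractional) `df`; the higher-order series corrections refine the critical value beyond this first-order approximation.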

2.
In constructing exact tests from discrete data, one must deal with the possible dependence of the P‐value on nuisance parameter(s) ψ as well as the discreteness of the sample space. A classical but heavy‐handed approach is to maximize over ψ. We prove what has previously been understood informally, namely that maximization produces the unique and smallest possible P‐value subject to the ordering induced by the underlying test statistic and test validity. On the other hand, allowing for the worst case will be more attractive when the P‐value is less dependent on ψ. We investigate the extent to which estimating ψ under the null reduces this dependence. An approach somewhere between full maximization and estimation is partial maximization, with appropriate penalty, as introduced by Berger & Boos (1994, P values maximized over a confidence set for the nuisance parameter. J. Amer. Statist. Assoc. 89, 1012–1016). It is argued that estimation followed by maximization is an attractive, but computationally more demanding, alternative to partial maximization. We illustrate the ideas on a range of low‐dimensional but important examples for which the alternative methods can be investigated completely numerically.
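The full maximization over ψ can be illustrated with a hypothetical one-sided comparison of two binomial proportions sharing a null success probability ψ. The grid maximization below is a rough sketch of the heavy-handed worst-case approach, not the partial-maximization procedure of Berger & Boos; all function names and the grid resolution are illustrative:

```python
import math

def binom_pmf(k, n, p):
    """Binomial probability mass function."""
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k)

def pvalue_at(psi, x1, n1, x2, n2):
    """Exact one-sided P-value at nuisance value psi: the null probability of a
    difference in sample proportions at least as large as the one observed."""
    obs = x1 / n1 - x2 / n2
    total = 0.0
    for k1 in range(n1 + 1):
        for k2 in range(n2 + 1):
            if k1 / n1 - k2 / n2 >= obs - 1e-12:
                total += binom_pmf(k1, n1, psi) * binom_pmf(k2, n2, psi)
    return total

def maximized_pvalue(x1, n1, x2, n2, grid=200):
    """Worst-case approach: maximize the exact P-value over a grid of psi."""
    return max(pvalue_at(i / grid, x1, n1, x2, n2) for i in range(grid + 1))
```

For example, `maximized_pvalue(7, 10, 2, 10)` is by construction at least as large as the P-value evaluated at the pooled estimate ψ̂ = 9/20, which is the sense in which estimation can be less conservative but needs the safeguards discussed above.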

3.
We propose novel parametric concentric multi‐unimodal small‐subsphere families of densities for (p − 1)-dimensional spherical data with p − 1 ≥ 2. Their parameters describe a common axis for K small hypersubspheres, an array of K directional modes, one mode for each subsphere, and K pairs of concentration parameters, each pair governing horizontal (within the subsphere) and vertical (orthogonal to the subsphere) concentrations. We introduce two kinds of distributions. In its one‐subsphere version, the first kind coincides with a special case of the Fisher–Bingham distribution, and the second kind is a novel adaptation that models independent horizontal and vertical variations. In its multisubsphere version, the second kind allows for a correlation of horizontal variation over different subspheres. In medical imaging, the situation p − 1 = 2 occurs precisely in modeling the variation of a skeletally represented organ shape due to rotation, twisting, and bending. For both kinds, we provide new computationally feasible algorithms for simulation and estimation and propose several tests. To the best of the authors' knowledge, our proposed models are the first to treat the variation of directional data along several concentric small hypersubspheres, concentrated near modes on each subsphere, let alone horizontal dependence. Using several simulations, we show that our methods are more powerful than a recent nonparametric method and ad hoc methods. Using data from medical imaging, we demonstrate the advantage of our method and infer on the dominating axis of rotation of the human knee joint at different walking phases.

4.
5.
We consider fitting the so‐called Emax model to continuous response data from clinical trials designed to investigate the dose–response relationship for an experimental compound. When there is insufficient information in the data to estimate all of the parameters because of the high dose asymptote being ill defined, maximum likelihood estimation fails to converge. We explore the use of either bootstrap resampling or the profile likelihood to make inferences about effects and doses required to give a particular effect, using limits on the parameter values to obtain the value of the maximum likelihood when the high dose asymptote is ill defined. The results obtained show these approaches to be comparable with or better than some others that have been used when maximum likelihood estimation fails to converge and that the profile likelihood method outperforms the method of bootstrap resampling used. Copyright © 2014 John Wiley & Sons, Ltd.
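The profile-likelihood idea can be sketched as follows: under Gaussian errors, fixing ED50 makes the Emax model E0 + Emax·dose/(ED50 + dose) linear in (E0, Emax), so those parameters can be profiled out by ordinary least squares over a grid of ED50 values. The function names, grid, and data below are illustrative, not the authors' implementation:

```python
def emax_mean(dose, e0, emax, ed50):
    """Emax dose-response model: e0 + emax * dose / (ed50 + dose)."""
    return e0 + emax * dose / (ed50 + dose)

def profile_fit(doses, y, ed50_grid):
    """For each fixed ED50 the model is linear in (E0, Emax), so profile those
    out by least squares and keep the ED50 value minimizing the residual SS."""
    best = None
    n = len(y)
    for ed50 in ed50_grid:
        x = [d / (ed50 + d) for d in doses]
        xbar, ybar = sum(x) / n, sum(y) / n
        sxx = sum((xi - xbar) ** 2 for xi in x)
        sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
        emax = sxy / sxx
        e0 = ybar - emax * xbar
        rss = sum((yi - emax_mean(d, e0, emax, ed50)) ** 2
                  for d, yi in zip(doses, y))
        if best is None or rss < best[0]:
            best = (rss, e0, emax, ed50)
    return best
```

Bounding the grid plays the role of the limits on the parameter values mentioned above: when the high dose asymptote is ill defined, the profiled residual sum of squares is nearly flat in ED50 beyond the observed dose range, which is exactly the situation where interval-type profile inferences remain usable while point estimation does not.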

6.
In this paper, we propose and study a new global test, namely the GPF test, for the one‐way ANOVA problem for functional data, obtained by globalizing the usual pointwise F-test. The asymptotic random expressions of the test statistic are derived, and its asymptotic power is investigated. The GPF test is shown to be root-n consistent. It is much less computationally intensive than a parametric bootstrap test proposed in the literature for the one‐way ANOVA for functional data. Via some simulation studies, it is found that in terms of size control and power, the GPF test is comparable with two existing tests adopted for the one‐way ANOVA problem for functional data. A real data example illustrates the GPF test.
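A minimal sketch of globalizing the pointwise F-test: compute the usual one-way ANOVA F statistic at each design point and average over the grid, a Riemann approximation of the integrated F. This toy version assumes complete curves observed on a common grid and is only an illustration of the construction, not the paper's full procedure (which also supplies the null distribution):

```python
def pointwise_F(groups_at_t):
    """One-way ANOVA F statistic for scalar observations at one design point."""
    k = len(groups_at_t)
    n = sum(len(g) for g in groups_at_t)
    grand = sum(sum(g) for g in groups_at_t) / n
    ssb = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups_at_t)
    ssw = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups_at_t)
    return (ssb / (k - 1)) / (ssw / (n - k))

def gpf_statistic(groups):
    """GPF-style statistic: average the pointwise F statistic over the grid of
    design points (a Riemann approximation of the integrated F statistic)."""
    T = len(groups[0][0])  # number of grid points per curve
    return sum(pointwise_F([[curve[t] for curve in g] for g in groups])
               for t in range(T)) / T
```

Here `groups` is a list of samples, each sample a list of curves discretized on the same grid; calibrating the statistic against its asymptotic null distribution is what makes this cheaper than the parametric bootstrap alternative mentioned above.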

7.
This article investigates the Farlie–Gumbel–Morgenstern class of models for exchangeable continuous data. We show how the model specification can account for both individual and cluster level covariates, we derive insights from comparisons with the multivariate normal distribution, and we discuss maximum likelihood inference when a sample of independent clusters of varying sizes is available. We propose a method for maximum likelihood estimation which is an alternative to direct numerical maximization of the likelihood that sometimes exhibits non-convergence problems. We describe an algorithm for generating samples from the exchangeable multivariate Farlie–Gumbel–Morgenstern distribution with any marginals, using the structural properties of the distribution. Finally, we present the results of a simulation study designed to assess the properties of the maximum likelihood estimators, and we illustrate the use of the FGM distributions with the analysis of a small data set from a developmental toxicity study.
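In the bivariate case, sampling from the FGM copula is straightforward by inverting the conditional distribution of V given U, since that conditional CDF is quadratic in v. The sketch below (uniform marginals, dependence parameter θ ∈ [−1, 1]) is a simplified illustration only, not the exchangeable multivariate algorithm developed in the article:

```python
import math
import random

def fgm_pair(theta, rng):
    """Draw one (U, V) pair from the bivariate FGM copula
    C(u, v) = uv[1 + theta(1 - u)(1 - v)], theta in [-1, 1],
    by inverting the conditional CDF of V given U = u:
        P(V <= v | U = u) = v + theta(1 - 2u) v(1 - v)."""
    u, p = rng.random(), rng.random()
    a = theta * (1 - 2 * u)
    if abs(a) < 1e-12:
        return u, p  # conditional law is uniform when a == 0
    # solve a*v**2 - (1 + a)*v + p = 0, taking the root in [0, 1]
    v = ((1 + a) - math.sqrt((1 + a) ** 2 - 4 * a * p)) / (2 * a)
    return u, v
```

Arbitrary marginals are obtained by applying inverse marginal CDFs to `u` and `v`; note the FGM family only supports weak dependence (Pearson correlation of the uniform pair is θ/3), which is one reason comparisons with the multivariate normal are instructive.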

8.
In recent years, immunological science has evolved, and cancer vaccines are available for treating existing cancers. Because cancer vaccines require time to elicit an immune response, a delayed treatment effect is expected. Accordingly, the use of weighted log‐rank tests with the Fleming–Harrington class of weights is proposed for evaluation of survival endpoints. We present a method for calculating the sample size under the assumption of a piecewise exponential distribution for the cancer vaccine group and an exponential distribution for the placebo group as the survival model. The impact of delayed effect timing on both the choice of the Fleming–Harrington weights and the increment in the required number of events is discussed. Copyright © 2014 John Wiley & Sons, Ltd.
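The Fleming–Harrington class weights each event time by S(t−)^ρ (1 − S(t−))^γ, where S is the pooled Kaplan–Meier estimate, so ρ = 0 with γ > 0 emphasizes late differences, which is appropriate under a delayed treatment effect. Below is a minimal sketch of the weighted log-rank statistic itself (not the sample-size calculation the abstract describes); the tie-handling and function name are illustrative:

```python
import math

def fh_weighted_logrank(times, events, group, rho, gamma):
    """Fleming-Harrington weighted log-rank Z statistic: at each event time the
    observed-minus-expected deaths in group 1 are weighted by
    S(t-)**rho * (1 - S(t-))**gamma, with S the pooled Kaplan-Meier."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk, at_risk1 = len(times), sum(group)
    s = 1.0  # pooled Kaplan-Meier estimate just before the current time
    num = den = 0.0
    i = 0
    while i < len(order):
        t = times[order[i]]
        d = d1 = removed = removed1 = 0
        j = i
        while j < len(order) and times[order[j]] == t:  # gather ties at t
            idx = order[j]
            removed += 1
            removed1 += group[idx]
            d += events[idx]
            d1 += events[idx] * group[idx]
            j += 1
        if d > 0:
            w = s ** rho * (1 - s) ** gamma
            e1 = d * at_risk1 / at_risk  # expected deaths in group 1
            v = (d * (at_risk1 / at_risk) * (1 - at_risk1 / at_risk)
                 * (at_risk - d) / max(at_risk - 1, 1))  # hypergeometric var
            num += w * (d1 - e1)
            den += w * w * v
            s *= 1 - d / at_risk
        at_risk -= removed
        at_risk1 -= removed1
        i = j
    return num / math.sqrt(den)
```

Sample-size formulas for such tests reduce to finding the number of events at which this Z statistic attains the target power under the assumed piecewise exponential alternative, which is why the timing of the delayed effect drives both the weight choice and the event count.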


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号