Similar documents
20 similar documents found (search time: 23 ms)
1.
We propose a procedure to identify the lowest dose whose effect exceeds that of a threshold dose, under the assumption that the dose mean response is monotone in a dose-response test. The procedure uses statistics based on contrasts among sample means and applies a group sequential scheme to identify the dose efficiently. If the dose can be identified at an early step of the sequential test, the procedure terminates with few observations, which makes it attractive from an economical point of view. In a simulation study, we compare procedures based on three different contrasts.

2.
For testing the non-inferiority (or equivalence) of an experimental treatment to a standard treatment, the odds ratio (OR) of patient response rates has been recommended to measure the relative treatment efficacy. On the basis of an exact test procedure proposed elsewhere for a simple crossover design, we develop an exact sample-size calculation procedure with respect to the OR of patient response rates for a desired power of detecting non-inferiority at a given nominal type I error. We note that the sample size calculated for a desired power based on an asymptotic test procedure can be much smaller than that based on the exact test procedure in a given situation. We further discuss the advantages and disadvantages of sample-size calculation using the exact and the asymptotic test procedures. We employ an example studying two inhalation devices for asthmatics to illustrate the use of the sample-size calculation procedure developed here.

3.
Assuming that the frequency of occurrence follows the Poisson distribution, we develop sample size calculation procedures for testing equality based on an exact test procedure and an asymptotic test procedure under an AB/BA crossover design. We employ Monte Carlo simulation to demonstrate the use of these sample size formulae and evaluate the accuracy of sample size calculation formula derived from the asymptotic test procedure with respect to power in a variety of situations. We note that when both the relative treatment effect of interest and the underlying intraclass correlation between frequencies within patients are large, the sample size calculation based on the asymptotic test procedure can lose accuracy. In this case, the sample size calculation procedure based on the exact test is recommended. On the other hand, if the relative treatment effect of interest is small, the minimum required number of patients per group will be large, and the asymptotic test procedure will be valid for use. In this case, we may consider use of the sample size calculation formula derived from the asymptotic test procedure to reduce the number of patients needed for the exact test procedure. We include an example regarding a double-blind randomized crossover trial comparing salmeterol with a placebo in exacerbations of asthma to illustrate the practical use of these sample size formulae. Copyright © 2013 John Wiley & Sons, Ltd.

4.
Abstract.  A new multiple testing procedure, the generalized augmentation procedure (GAUGE), is introduced. The procedure is shown to control the false discovery exceedance and to be competitive in terms of power. It is also shown how to apply the idea of GAUGE to achieve control of other error measures. Extensions to dependence are discussed, together with a modification valid under arbitrary dependence. We present an application to an original study on prostate cancer and on a benchmark data set on colon cancer.

5.
A Bayesian discovery procedure   (total citations: 1; self-citations: 0; citations by others: 1)
Summary.  We discuss a Bayesian discovery procedure for multiple-comparison problems. We show that, under a coherent decision theoretic framework, a loss function combining true positive and false positive counts leads to a decision rule that is based on a threshold of the posterior probability of the alternative. Under a semiparametric model for the data, we show that the Bayes rule can be approximated by the optimal discovery procedure, which was recently introduced by Storey. Improving the approximation leads us to a Bayesian discovery procedure, which exploits the multiple shrinkage in clusters that are implied by the assumed non-parametric model. We compare the Bayesian discovery procedure and the optimal discovery procedure estimates in a simple simulation study and in an assessment of differential gene expression based on microarray data from tumour samples. We extend the setting of the optimal discovery procedure by discussing modifications of the loss function that lead to different single-thresholding statistics. Finally, we provide an application of the previous arguments to dependent (spatial) data.
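One concrete instance of the threshold-based Bayes rules discussed in this abstract is rejecting the hypotheses with the largest posterior probability of being non-null, with the cutoff chosen so that the posterior expected false discovery rate stays below a target level. The sketch below assumes those posterior probabilities are already available; the function name and the expected-FDR criterion are illustrative choices, not the paper's exact loss function.

```python
import numpy as np

def bayes_fdr_threshold(post_alt, alpha=0.05):
    """Reject the top-k hypotheses (largest posterior probability of being
    non-null), with k chosen so the posterior expected FDR is <= alpha."""
    post_alt = np.asarray(post_alt, float)
    order = np.argsort(-post_alt)                      # most likely alternatives first
    p_sorted = post_alt[order]
    # posterior expected FDR if the top (k+1) hypotheses are rejected
    efdr = np.cumsum(1.0 - p_sorted) / np.arange(1, p_sorted.size + 1)
    ok = np.nonzero(efdr <= alpha)[0]
    reject = np.zeros(post_alt.size, dtype=bool)
    if ok.size:
        reject[order[: ok[-1] + 1]] = True
    return reject
```

For example, with posterior probabilities (0.99, 0.98, 0.6, 0.2) and alpha = 0.05, only the first two hypotheses are rejected: adding the third would push the expected FDR above the target.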

6.
This paper deals with an asymptotic distribution-free subset selection procedure for a two-way layout problem. The treatment effect with the largest unknown value is of interest; the block effect is a nuisance parameter in this problem. The proposed procedure is based on the Hodges-Lehmann estimators of location parameters. The asymptotic relative efficiency of the proposed procedure relative to the normal means procedure is evaluated, and the proposed procedure is shown to have high efficiency.

7.
An algorithm is presented for computing an exact nonparametric interval estimate of the slope parameter in a simple linear regression model. The confidence interval is obtained by inverting the hypothesis test for slope that uses Spearman's rho. This method is compared to an exact procedure based on Kendall's tau. The Spearman rho procedure will generally give exact levels of confidence closer to the desired levels, especially in small samples. Monte Carlo results comparing these two methods with the parametric procedure are given.
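The test-inversion idea can be illustrated with a rough grid search: a candidate slope b belongs to the interval when Spearman's test of association between x and the residuals y - b*x is not rejected. This is only a sketch of the principle; the paper's algorithm computes the exact interval, whereas here the function name, the Theil-Sen centring of the grid and its heuristic half-width are assumptions made for the illustration.

```python
import numpy as np
from scipy.stats import spearmanr

def spearman_slope_ci(x, y, alpha=0.05, n_grid=801):
    """Approximate CI: all grid slopes b for which Spearman's test of
    association between x and the residuals y - b*x is not rejected."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    pair = np.array([(y[j] - y[i]) / (x[j] - x[i])
                     for i in range(n) for j in range(i + 1, n)
                     if x[j] != x[i]])
    b0 = np.median(pair)                        # Theil-Sen slope: centre of the grid
    hw = np.percentile(np.abs(pair - b0), 90)   # heuristic half-width for the grid
    grid = np.linspace(b0 - hw, b0 + hw, n_grid)
    keep = [b for b in grid if spearmanr(x, y - b * x).pvalue > alpha]
    return (min(keep), max(keep)) if keep else None
```

The grid version trades exactness for simplicity: the true endpoints lie between grid points, which is why the exact algorithm enumerates critical slopes instead.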

8.
In this article, we propose a unified sequentially rejective test procedure for simultaneously testing the equality of several independent binomial proportions to a specified standard. The proposed procedure is general enough to include some well-known multiple testing procedures, such as the ordinary Bonferroni, Hochberg and Rom procedures. It involves multiple tests of significance based on simple binomial tests (exact or approximate) which can easily be found in many elementary statistics textbooks. Unlike the traditional chi-square test of the overall hypothesis, the procedure can identify the subset of binomial proportions that differ from the prespecified standard while controlling the familywise type I error rate. Moreover, the power computation of the procedure is provided, and the procedure is illustrated by two real examples from an ecological study and a carcinogenicity study.
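One special case named in the abstract, Hochberg's step-up rule applied to exact binomial p-values, can be sketched in a few lines. This is a hedged illustration of that single flavour of the unified procedure, not the authors' general formulation; the function name is made up for the example.

```python
from scipy.stats import binomtest

def hochberg_binomial(successes, trials, p0, alpha=0.05):
    """Exact binomial tests of H0_i: p_i = p0, combined by Hochberg's step-up rule."""
    pvals = [binomtest(x, n, p0).pvalue for x, n in zip(successes, trials)]
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i], reverse=True)
    reject = [False] * m
    for rank, i in enumerate(order):           # largest p-value first
        if pvals[i] <= alpha / (rank + 1):     # Hochberg step-up threshold
            for j in order[rank:]:             # reject this and all smaller p-values
                reject[j] = True
            break
    return pvals, reject
```

For instance, with 18, 10 and 9 successes out of 20 trials each and a standard of p0 = 0.5, only the first proportion is declared different from the standard at the 5% familywise level.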

9.
The purpose of toxicological studies is a safety assessment of compounds (e.g. pesticides, pharmaceuticals, industrial chemicals and food additives) at various dose levels. Because a mistaken declaration that a really non-equivalent dose is equivalent could have dangerous consequences, it is important to adopt reliable statistical methods that can properly control the family-wise error rate. We propose a new stepwise confidence interval procedure for toxicological evaluation based on an asymmetric loss function. The new procedure is shown to be reliable in the sense that the corresponding family-wise error rate is well controlled at or below the pre-specified nominal level. Our simulation results show that the new procedure is to be preferred over the classical confidence interval procedure and the stepwise procedure based on Welch's approximation in terms of practical equivalence/safety. The implementation and significance of the new procedure are illustrated with two real data sets: one from a reproductive toxicological study on Nitrofurazone in Swiss CD-1 mice, and the other from a toxicological study on Aconiazide.

10.
The ROC curve is a graphical representation of the relationship between the sensitivity and specificity of a diagnostic test. It is a popular tool for evaluating and comparing diagnostic tests in the medical sciences. In the literature, the ROC curve is often estimated empirically based on an empirical distribution function estimator and an empirical quantile function estimator. In this paper an alternative nonparametric procedure to estimate the ROC curve is suggested which is based on local smoothing techniques. Several numerical examples are presented to evaluate the performance of this procedure.
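The empirical estimator that the paper takes as its baseline evaluates ROC(t) = 1 - F_D(F_H^{-1}(1 - t)), plugging in the empirical distribution function of the diseased scores and the empirical quantile function of the healthy scores. A minimal sketch of that baseline (the smoothed alternative proposed in the paper would kernel-smooth both estimators; the function name here is illustrative):

```python
import numpy as np

def empirical_roc(diseased, healthy, n_points=101):
    """Empirical ROC: true positive rate at each false positive rate t."""
    diseased = np.asarray(diseased, float)
    t = np.linspace(0.0, 1.0, n_points)          # false positive rate grid
    thresholds = np.quantile(healthy, 1.0 - t)   # empirical quantiles of healthy scores
    tpr = np.array([(diseased >= c).mean() for c in thresholds])
    return t, tpr
```

Integrating the resulting curve with the trapezoidal rule gives the usual empirical AUC.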

11.
Abstract: The authors address the problem of estimating an inter-event distribution on the basis of count data. They derive a nonparametric maximum likelihood estimate of the inter-event distribution utilizing the EM algorithm both in the case of an ordinary renewal process and in the case of an equilibrium renewal process. In the latter case, the iterative estimation procedure follows the basic scheme proposed by Vardi for estimating an inter-event distribution on the basis of time-interval data; it combines the outputs of the E-step corresponding to the inter-event distribution and to the length-biased distribution. The authors also investigate a penalized likelihood approach to provide the proposed estimation procedure with regularization capabilities. They evaluate the practical estimation procedure using simulated count data and apply it to real count data representing the elongation of coffee-tree leafy axes.

12.
The analysis of data from experimental designs is often hampered by the lack of more than one procedure available for the analysis, especially when that procedure is based on assumptions which do not apply in the situation at hand. In this paper two classes of alternative procedures are discussed and compared. One is the aligned ranks procedure, which first standardises the data by subtracting an appropriate estimate of location, then replaces the data with ranks, and finally uses an appropriate test statistic which has asymptotically a chi-square distribution. The second is the rank transform, which first replaces all of the data with their ranks and then employs the usual parametric methods, computed on the ranks instead of the data. Monte Carlo simulations for a test of interaction in a two-way layout with replication enable the robustness and power of these two methods to be compared with the usual analysis of variance.
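The rank transform idea is simple enough to sketch: rank the pooled data, then run the ordinary parametric test on the ranks. The sketch below illustrates it for a one-way layout (the paper studies the harder two-way interaction test); the function name is made up for the example.

```python
import numpy as np
from scipy.stats import f_oneway, rankdata

def rank_transform_anova(*groups):
    """Rank the pooled data, then run the usual parametric F test on the ranks."""
    pooled = np.concatenate([np.asarray(g, float) for g in groups])
    ranks = rankdata(pooled)                       # replace data by their pooled ranks
    cuts = np.cumsum([len(g) for g in groups])[:-1]
    return f_oneway(*np.split(ranks, cuts))        # parametric method applied to ranks
```

Because the test statistic is computed from ranks, the result is invariant under any monotone transformation of the original observations, which is the source of the method's robustness.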

13.
Tmax and Cmax are important pharmacokinetic parameters in drug development. A nonparametric procedure is often needed to estimate them when model independence is required. This paper proposes a simulation-based optimal design procedure for finding optimal sampling times for nonparametric estimates of Tmax and Cmax for each subject, assuming that the drug concentration follows a non-linear mixed model. The main difficulty in using standard optimal design procedures is that the properties of the nonparametric estimates are very complicated. The proposed procedure uses sample reuse simulation to calculate the design criterion, a multidimensional integral, so that effective optimization procedures such as Newton-type methods can be used directly to find optimal designs. The procedure is used to construct optimal designs for an open one-compartment model. An approximation based on the Taylor expansion is also derived; its results are consistent with those based on the sample reuse simulation.

14.
We consider a semi-parametric approach to perform the joint segmentation of multiple series sharing a common functional part. We propose an iterative procedure based on dynamic programming for the segmentation part and Lasso estimators for the functional part. Our Lasso procedure, based on the dictionary approach, allows us to estimate both smooth functions and functions with local irregularity, which permits more flexibility than previously proposed methods. This yields a better estimation of the functional part and improvements in the segmentation. The performance of our method is assessed using simulated data and real data from agriculture and geodetic studies. Our estimation procedure proves to be a reliable tool for detecting changes and for obtaining an interpretable estimation of the functional part of the model in terms of known functions.

15.
This paper proposes a method for estimating the parameters in a generalized linear model with missing covariates. The missing covariates are assumed to come from a continuous distribution, and are assumed to be missing at random. In particular, Gaussian quadrature methods are used on the E-step of the EM algorithm, leading to an approximate EM algorithm. The parameters are then estimated using the weighted EM procedure given in Ibrahim (1990). This approximate EM procedure leads to approximate maximum likelihood estimates, whose standard errors and asymptotic properties are given. The proposed procedure is illustrated on a data set.

16.
Summary.  A typical microarray experiment attempts to ascertain which genes display differential expression in different samples. We model the data by using a two-component mixture model and develop an empirical Bayesian thresholding procedure, which was originally introduced for thresholding wavelet coefficients, as an alternative to the existing methods for determining differential expression across thousands of genes. The method is built on sound theoretical properties and has easy computer implementation in the R statistical package. Furthermore, we consider improvements to the standard empirical Bayesian procedure when replication is present, to increase the robustness and reliability of the method. We provide an introduction to microarrays for those who are unfamiliar with the field, and the proposed procedure is demonstrated with applications to two-channel complementary DNA microarray experiments.

17.
In this paper, we focus on the variable selection for the semiparametric regression model with longitudinal data when some covariates are measured with errors. A new bias-corrected variable selection procedure is proposed based on the combination of the quadratic inference functions and shrinkage estimations. With appropriate selection of the tuning parameters, we establish the consistency and asymptotic normality of the resulting estimators. Extensive Monte Carlo simulation studies are conducted to examine the finite sample performance of the proposed variable selection procedure. We further illustrate the proposed procedure with an application.

18.
The central limit theorem says that, provided an estimator fulfills certain weak conditions, then for reasonably large sample sizes the sampling distribution of the estimator converges to normality. We propose a procedure to find out what a "reasonably large sample size" is. The procedure is based on the properties of Gini's mean difference decomposition. We show the results of implementing the procedure on simulated datasets and on data from the German Socio-Economic Panel.
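The question the paper asks can also be attacked by brute force, which makes the problem concrete: simulate the sampling distribution of the estimator at increasing n and report the first n at which a normality test stops rejecting. This sketch is a Monte Carlo alternative, not the Gini-based decomposition the paper proposes; the function name, candidate sizes and thresholds are illustrative choices.

```python
import numpy as np
from scipy.stats import normaltest

def first_normal_n(draw_pop, candidates=(5, 10, 20, 50, 100, 200),
                   reps=1000, alpha=0.01, seed=0):
    """Smallest candidate n at which a normality test no longer rejects
    for the simulated sampling distribution of the sample mean."""
    rng = np.random.default_rng(seed)
    for n in candidates:
        means = [draw_pop(rng, n).mean() for _ in range(reps)]
        if normaltest(means).pvalue > alpha:     # D'Agostino-Pearson test
            return n
    return None
```

For a normal population the sample mean is exactly normal, so the check should pass at the smallest candidate size almost every time; skewed populations need larger n.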

19.
The analysis of covariance procedure is considered when the observations in each cell are equicorrelated. A correction procedure is given. A computationally easier conservative test statistic is also given; it allows one to more readily determine the consequences of ignoring correlations, even slight correlations, in the analysis of covariance procedure. The difference between the corrected test and the conservative test is shown to converge in probability to zero. The conservative test is easy to implement in statistical computer packages. It is shown that, for the general correlation pattern, any test involving the regression coefficients of the covariables is an exact test. An example illustrates the procedure.

20.
With linear dispersion effects, the standard factorial designs are not optimal for estimation of a mean model. A sequential two-stage experimental design procedure has been proposed that first estimates the variance structure, and then uses the variance estimates and the variance optimality criterion to develop a second-stage design that efficiently estimates the mean model. This procedure has been compared to an equal replicate design analyzed by ordinary least squares, and found to be superior in many situations.

However, with small first-stage sample sizes the variance estimates are not reliable, and an alternative procedure could be more beneficial. For this reason a Bayesian modification to the two-stage procedure is proposed, which combines the first-stage variance estimates with prior variance information to produce a more efficient procedure. This Bayesian procedure is compared to the non-Bayesian two-stage procedure and to the two one-stage alternative procedures listed above. Finally, a recommendation is made as to which procedure is preferred in certain situations.
