Similar Documents
20 similar documents found (search time: 625 ms)
1.
Many empirical studies are planned with the prior knowledge that some of the data may be missing. This knowledge is seldom explicitly incorporated into the experimental design process for lack of a suitable methodology. This paper proposes an index related to the expected determinant of the information matrix as a criterion for planning block designs. Because the expected determinantal criterion is analytically intractable, a closed-form expression is presented only for a simple 2×2 layout; a first-order Taylor series approximation is suggested for larger layouts. The ranges over which this approximation is adequate are shown via Monte Carlo simulations. The robustness of information in the block design relative to the completely randomized design with missing data is discussed.
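The expected-determinant idea can be sketched with a small Monte Carlo experiment. Everything below is an illustrative assumption (a single 2×2 layout with effects coding and independent missingness with probability `p_miss`), not the paper's exact formulation:

```python
import numpy as np

rng = np.random.default_rng(0)

# 2x2 layout: 2 blocks x 2 treatments, one observation per cell.
# Columns: intercept, block contrast, treatment contrast (effects coding).
X_full = np.array([
    [1.0,  1.0,  1.0],   # block 1, treatment 1
    [1.0,  1.0, -1.0],   # block 1, treatment 2
    [1.0, -1.0,  1.0],   # block 2, treatment 1
    [1.0, -1.0, -1.0],   # block 2, treatment 2
])

def expected_det(p_miss, n_sim=20000):
    """Monte Carlo estimate of E[det(X'X)] when each cell is
    independently missing with probability p_miss."""
    dets = np.empty(n_sim)
    for i in range(n_sim):
        keep = rng.random(4) > p_miss    # which observations survive
        X = X_full[keep]
        dets[i] = np.linalg.det(X.T @ X) if X.shape[0] else 0.0
    return dets.mean()

det_complete = np.linalg.det(X_full.T @ X_full)  # no missing data
ed = expected_det(0.1)                           # 10% missingness
```

In a real application this index would be computed for each candidate block design and the layout with the largest expected determinant preferred.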

2.
A new class of Bayesian estimators for a proportion in multistage binomial designs is considered. Priors belong to the beta-J distribution family, which is derived from the Fisher information associated with the design. Transposing the beta parameters of the Haldane and uniform priors in fixed binomial experiments into the beta-J distribution yields bias-corrected versions of these priors for multistage designs. We show that the estimator of the posterior mean based on the corrected Haldane prior and the estimator of the posterior mode based on the corrected uniform prior have good frequentist properties. An easy-to-use approximation of the estimator of the posterior mode is provided. The new Bayesian estimators are compared to Whitehead's estimator and the uniformly minimum variance estimator across several multistage designs. Finally, the bias of the estimator of the posterior mode is derived for a particular case.

3.
We address the problem of computing integrated mean-squared error (IMSE) optimal designs for interpolation of random fields with known mean and covariance. We assume that the mean squared error is integrated through a discrete measure and restrict the design space to its support. We show that the IMSE and its approximation by spectral truncation can be easily evaluated, which makes their global minimization affordable. Numerical experiments are carried out that illustrate the effectiveness of the approach.
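A minimal sketch of the IMSE criterion, assuming an exponential covariance on [0, 1] and a uniform discrete measure on a 41-point grid (both illustrative choices), with a greedy search standing in for global minimization:

```python
import numpy as np

# Zero-mean random field with known exponential covariance (assumed kernel).
def k(s, t):
    return np.exp(-5.0 * np.abs(s[:, None] - t[None, :]))

grid = np.linspace(0.0, 1.0, 41)          # support of the discrete measure

def imse(design):
    X = np.asarray(design)
    K = k(X, X) + 1e-10 * np.eye(len(X))  # jitter for numerical stability
    kxT = k(grid, X)                      # cross-covariances, shape (41, m)
    # Kriging MSE at t: k(t,t) - k(t,X) K^{-1} k(X,t), averaged over grid.
    mse = 1.0 - np.sum(kxT * np.linalg.solve(K, kxT.T).T, axis=1)
    return mse.mean()

# Greedy minimization over the grid (design space restricted to the support).
design, trace = [], []
for _ in range(3):
    best = min((x for x in grid if x not in design),
               key=lambda x: imse(design + [x]))
    design.append(best)
    trace.append(imse(design))
```

Each added design point lowers the IMSE, so `trace` is strictly decreasing; the paper's spectral-truncation approximation is not reproduced here.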

4.
The consistency and asymptotic normality of a linear least squares estimate of the form (X'X)^-X'Y, where (X'X)^- denotes a generalized inverse, when the mean is not Xβ is investigated in this paper. The least squares estimate is a consistent estimate of the best linear approximation of the true mean function for the chosen design. The asymptotic normality of the least squares estimate depends on the design, and the asymptotic mean may not be the best linear approximation of the true mean function. Choices of designs which allow large-sample inferences to be made about the best linear approximation of the true mean function are discussed.
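The distinction between the true mean and its best linear approximation can be illustrated numerically. The sinusoidal mean, the uniform design, and the noise level below are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# True mean is nonlinear; we fit a straight line anyway.
def m(x):
    return np.sin(2.0 * x)

x = np.linspace(0.0, 1.0, 2000)           # fixed design
X = np.column_stack([np.ones_like(x), x])

# Best linear approximation of m under this design: project m onto span(X).
beta_star, *_ = np.linalg.lstsq(X, m(x), rcond=None)

# OLS from noisy data converges to beta_star, not to any "true" line.
y = m(x) + rng.normal(scale=0.1, size=x.size)
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
```

With a different design measure, `beta_star` itself would change, which is the abstract's point about inference depending on the design.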

5.
Supersaturated designs are factorial designs in which the number of potential effects is greater than the run size. They are commonly used in screening experiments, with the aim of identifying the dominant active factors at low cost. However, an important but poorly developed research area is the analysis of such designs with a non-normal response. In this article, we develop a variable selection strategy through a modification of the PageRank algorithm, which is commonly used in the Google search engine for ranking webpages. The proposed method incorporates an appropriate information-theoretic measure into the algorithm, so that it can be used efficiently for factor screening. A noteworthy advantage of this procedure is that it allows supersaturated designs to be used for analyzing discrete data; a generalized linear model is therefore assumed. As shown in a thorough simulation study, in which Type I and Type II error rates are computed for a wide range of underlying models and designs, the presented approach is quite effective.

6.
Box-Behnken designs are popular with experimenters who wish to estimate a second-order model, due to their having three levels, their simplicity and their high efficiency for the second-order model. However, there are situations in which the model is inadequate due to lack of fit caused by higher-order terms. These designs have little ability to estimate third-order terms. Using combinations of factorial points, axial points, and complementary design points, we augment these designs and develop catalogues of third-order designs for 3–12 factors. These augmented designs can be used to estimate the parameters of a third-order response surface model. Since the aim is to make the most of a situation in which the experiment was designed for an inadequate model, the designs are clearly suboptimal and not rotatable for the third-order model, but can still provide useful information.

7.
The authors show how an adjusted pseudo‐empirical likelihood ratio statistic that is asymptotically distributed as a chi‐square random variable can be used to construct confidence intervals for a finite population mean or a finite population distribution function from complex survey samples. They consider both non‐stratified and stratified sampling designs, with or without auxiliary information. They examine the behaviour of estimates of the mean and the distribution function at specific points using simulations calling on the Rao‐Sampford method of unequal probability sampling without replacement. They conclude that the pseudo‐empirical likelihood ratio confidence intervals are superior to those based on the normal approximation, whether in terms of coverage probability, tail error rates or average length of the intervals.

8.
Supersaturated designs are a large class of factorial designs which can be used for screening out the important factors from a large set of potentially active variables. The huge advantage of these designs is that they reduce the experimental cost drastically, but their critical disadvantage is the confounding involved in the statistical analysis. In this article, we propose a method for analyzing data using several types of supersaturated designs. Modifications of widely used information criteria are given and applied to the variable selection procedure for the identification of the active factors. The effectiveness of the proposed method is depicted via simulated experiments and comparisons.

9.
We formulate in a reasonable sense a class of optimality functionals for comparing feasible statistical designs available in a given setup. It is desired that the optimality functionals reflect symmetric measures of the lack of information contained in the designs being compared. In view of this, Kiefer's (1975) universal optimality criterion is seen to rest on stringent conditions, some of which can be relaxed while preserving optimality (in an extended sense) of the so-called balanced designs.

10.
When spatial variation is present in experiments, it is clearly sensible to use designs with favorable properties under both generalized and ordinary least squares. This will make the statistical analysis more robust to misspecification of the spatial model than would be the case if designs were based solely on generalized least squares. In this article, treatment information is introduced as a way of studying the ordinary least squares properties of designs. The treatment information is separated into orthogonal frequency or polynomial components which are assumed to be independent under the spatial model. The well-known trend-resistant designs are those with no treatment information at the very low order frequency or polynomial components which tend to have the higher variances under the spatial model. Ideally, designs would be chosen with all the treatment information distributed at the higher-order components. However, the results in this article show that there are limits on how much trend resistance can be achieved as there are many constraints on the treatment information. In addition, appropriately chosen Williams squares designs are shown to have favorable properties under both ordinary and generalized least squares. At all times, the ordinary least squares properties of the designs are balanced against the generalized least squares objectives of optimizing neighbor balance.

11.
Incorporating historical data has great potential to improve the efficiency of phase I clinical trials and to accelerate drug development. For model-based designs, such as the continual reassessment method (CRM), this can be conveniently carried out by specifying a "skeleton," that is, the prior estimate of the dose limiting toxicity (DLT) probability at each dose. In contrast, little work has been done to incorporate historical data into model-assisted designs, such as the Bayesian optimal interval (BOIN), Keyboard, and modified toxicity probability interval (mTPI) designs. This has led to the misconception that model-assisted designs cannot incorporate prior information. In this paper, we propose a unified framework that allows for incorporating historical data into model-assisted designs. The proposed approach uses the well-established "skeleton" approach combined with the concept of prior effective sample size, so it is easy to understand and use. More importantly, our approach maintains the hallmark of model-assisted designs, simplicity: the dose escalation/de-escalation rule can be tabulated prior to the trial conduct. Extensive simulation studies show that the proposed method can effectively incorporate prior information to improve the operating characteristics of model-assisted designs, similarly to model-based designs.

12.
A factorial design can be uniquely determined by an indicator function which is constructed by means of orthogonal contrasts. Since orthogonal contrasts are not unique, invariant measures are preferred. However, some particular orthogonal contrasts may express more information about designs than others and are worth attention. In this paper, a kind of indicator function based on orthogonal complex contrasts is introduced to represent general factorial designs, and its significance for projection designs is presented. Based on this function, a generalized resolution and a new aberration criterion are developed to rank combinatorially non-isomorphic designs with prime levels. Some results and comparisons are provided by means of examples.

13.
The design of large‐scale field trials where the residuals are correlated has been of recent interest, in large part because of advances in statistical and computational methods of analysis. The construction of designs for correlated data has typically used A‐optimality and is computationally intensive. This involves calculating the inverse of the information matrix for treatments under the supervision of an optimization strategy that explores the design space. We propose an approximation to A‐optimality, using nearest‐neighbour balance, that is less computationally demanding and can achieve at least 95% efficiency relative to A‐optimality in many practical situations.

14.
The problem of designing an experiment to estimate the point at which a quadratic regression attains its maximum, or minimum, is studied. The efficiency of a design depends on the values of the unknown parameters, and sequential design is therefore more efficient than non-sequential design. We use a Bayesian criterion: a weighted trace of the inverse of the information matrix, with weights depending on a prior distribution. If the design proceeds sequentially, the weights can be updated. Both sequential and non-sequential Bayesian designs are compared to non-Bayesian sequential designs, both theoretically and by simulation.
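A sketch of such a criterion for the quadratic case: the target x* = -b1/(2*b2) gives, via the delta method, a weighted trace tr(W M(ξ)^{-1}) with W averaged over prior draws. The prior, candidate grid, and greedy one-point additions below are illustrative assumptions, and the posterior weight-updating of a fully sequential scheme is omitted:

```python
import numpy as np

rng = np.random.default_rng(2)

# Quadratic regression y = b0 + b1*x + b2*x^2 + eps; the target is the
# stationary point x* = -b1 / (2*b2).  Assumed prior: independent normals.
prior = rng.normal([0.0, 1.0, -1.0], 0.1, size=(500, 3))   # draws of beta

def grad_xstar(b):
    _, b1, b2 = b                        # gradient of x* w.r.t. (b0, b1, b2)
    return np.array([0.0, -1.0 / (2.0 * b2), b1 / (2.0 * b2 ** 2)])

# Weight matrix W = E_prior[ c(beta) c(beta)' ].
W = np.mean([np.outer(grad_xstar(b), grad_xstar(b)) for b in prior], axis=0)

def info(design):
    F = np.column_stack([np.ones(len(design)), design,
                         np.asarray(design) ** 2])
    return F.T @ F

def criterion(design):
    # Weighted trace of the inverse information: tr(W M^{-1}).
    return np.trace(W @ np.linalg.inv(info(design)))

grid = np.linspace(-1.0, 1.0, 21)
design = [-1.0, 0.0, 1.0]                # minimal starting design
for _ in range(5):                       # sequential one-point additions
    design.append(min(grid, key=lambda x: criterion(design + [x])))
```

Each added observation shrinks the weighted trace, mirroring how sequentially collected information sharpens the estimate of the stationary point.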

15.
Tmax and Cmax are important pharmacokinetic parameters in drug development. A nonparametric procedure is often needed to estimate them when model independence is required. This paper proposes a simulation-based optimal design procedure for finding optimal sampling times for nonparametric estimates of Tmax and Cmax for each subject, assuming that the drug concentration follows a nonlinear mixed model. The main difficulty in using standard optimal design procedures is that the properties of the nonparametric estimates are very complicated. The proposed procedure uses sample-reuse simulation to calculate the design criterion, an integral of multiple dimensions, so that effective optimization procedures such as Newton-type methods can be applied directly to find optimal designs. The procedure is used to construct optimal designs for an open one-compartment model. An approximation based on the Taylor expansion is also derived; its results are consistent with those based on the sample-reuse simulation.
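A small sketch of the nonparametric Tmax/Cmax estimates under an open one-compartment model with first-order absorption; the dose, volume, and rate constants are illustrative, and the sample-reuse simulation over random subject parameters is omitted:

```python
import numpy as np

# Open one-compartment model with first-order absorption (assumed values).
D, V, ka, ke = 100.0, 10.0, 1.5, 0.2

def conc(t):
    return D * ka / (V * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

t_max_true = np.log(ka / ke) / (ka - ke)      # analytic Tmax
c_max_true = conc(t_max_true)

# Nonparametric estimates: read Tmax/Cmax off the sampled concentrations.
def nonparametric(times):
    c = conc(np.asarray(times))
    i = int(np.argmax(c))
    return times[i], c[i]

sparse = [0.5, 2.0, 6.0, 12.0]                # a coarse sampling design
dense = list(np.linspace(0.25, 12.0, 48))     # a fine sampling design
t_sparse, c_sparse = nonparametric(sparse)
t_dense, c_dense = nonparametric(dense)
```

The denser design recovers Tmax more accurately, which is exactly what the design criterion (an expected loss over simulated subjects) would quantify.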

16.
A partially balanced nested row-column design, referred to as PBNRC, is defined as an arrangement of v treatments in b p × q blocks for which, with the convention that p ≤ q, the information matrix for the estimation of treatment parameters is equal to that of the column component design, which is itself a partially balanced incomplete block design. In this paper, previously known optimal incomplete block designs, row-column designs, and nested row-column designs are utilized to develop methods of constructing optimal PBNRC designs. In particular, it is shown that an optimal group divisible PBNRC design for v = mn ≥ kn treatments in p × q blocks can be constructed whenever a balanced incomplete block design for m treatments in blocks of size k each and a group divisible PBNRC design for kn treatments in p × q blocks exist. A simple sufficient condition is given under which a group divisible PBNRC design is Ψf-better for all f > 0 than the corresponding balanced nested row-column designs having binary blocks. It is also shown that the construction techniques developed for group divisible designs can be generalized to obtain PBNRC designs based on rectangular association schemes.

17.
Depending on the topic under investigation, various methods can be used to obtain an optimal design; the Bayesian method is one of them. In this paper, considering the model and the features of the information matrix, the Bayesian optimality criterion is used to obtain optimal designs. Given the variation range of the model parameters, prior distributions such as the uniform, normal, and exponential are used, and the results are analysed.

18.
This paper discusses two problems that can occur when using central composite designs (CCDs); they are not generally covered in the literature but can lead to wrong decisions, and therefore incorrect models, if they are ignored. Most industry-based experimental designs are sequential. This usually involves running as few initial tests as possible while getting as much information as is needed to provide a reasonable approximation to reality (the screening stage). The CCD strategy generally requires running a full or fractional factorial design (the cube or hypercube) with one or more additional centre points. The cube is augmented, if deemed necessary, by additional experiments known as star points. The major problems highlighted here concern the decision of whether to run the star points. If the difference between the average response at the centre of the design and the average of the cube results is significant, there is probably a need for one or more quadratic terms in the predictive model. If not, then a simpler model that includes only main effects and interactions is usually considered sufficient. This test for 'curvature' in a main effect will often fail if the design space contains or surrounds a saddle point, which may disguise the need for a quadratic term. This paper describes the occurrence of a real saddle point in an industrial project and how it was overcome. The second problem occurs because the cube and star-point portions of a CCD are sometimes run as orthogonal blocks; indeed, theory suggests this is the correct procedure. However, in the industrial context, where minimizing the total number of tests is at a premium, this can lead to designs with star points a long way from the cube. In such a situation, were the curvature test found non-significant, we could end up with a model that predicted well within the cube portion of the design space but that would be unreliable in the rest of the region of investigation. The paper discusses just such a design, one that disguised the real need for a quadratic term.
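The saddle-point failure mode is easy to reproduce. A noise-free sketch with an assumed response surface y = 5 + x1² − x2² (pure quadratic coefficients of +1 and −1):

```python
import numpy as np

# Illustrative response with a saddle at the origin: the pure-quadratic
# coefficients cancel in the centre-versus-cube curvature contrast.
def y(x1, x2):
    return 5.0 + x1 ** 2 - x2 ** 2          # noise-free for clarity

cube = [(-1, -1), (-1, 1), (1, -1), (1, 1)]  # 2^2 factorial portion
centre = [(0, 0)] * 4                        # replicated centre points

y_cube = np.array([y(a, b) for a, b in cube])
y_centre = np.array([y(a, b) for a, b in centre])

# Curvature check: average cube response minus average centre response.
curvature = y_cube.mean() - y_centre.mean()
```

Here `curvature` is exactly zero even though both pure quadratic terms are present, so a centre-versus-cube test alone would wrongly suggest a first-order model.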


20.
Mehrotra (1997) presented an 'improved' Brown and Forsythe (1974) statistic designed to provide a valid test of mean equality in independent-groups designs when variances are heterogeneous. In particular, the usual Brown and Forsythe procedure was modified by using a Satterthwaite approximation for the numerator degrees of freedom instead of the usual value of the number of groups minus one. Mehrotra then demonstrated, through Monte Carlo methods, that the 'improved' method resulted in a robust test of significance in cases where the usual Brown and Forsythe method did not; accordingly, this 'improved' procedure was recommended. We show that under conditions likely to be encountered in applied settings, that is, conditions involving heterogeneous variances as well as nonnormal data, the 'improved' Brown and Forsythe procedure results in depressed or inflated rates of Type I error in unbalanced designs. Previous findings indicate, however, that one can obtain a robust test by adopting a heteroscedastic statistic with robust estimators, rather than the usual least squares estimators, and further improvement can be expected when critical significance values are obtained through bootstrapping methods.
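For reference, a sketch of the usual Brown and Forsythe statistic with its Satterthwaite denominator degrees of freedom; Mehrotra's numerator-df correction is not reproduced here, and the data are made up:

```python
import numpy as np

def brown_forsythe(groups):
    """Brown-Forsythe F* for equality of means under unequal variances,
    with the standard Satterthwaite denominator df."""
    n = np.array([len(g) for g in groups], dtype=float)
    m = np.array([np.mean(g) for g in groups])
    s2 = np.array([np.var(g, ddof=1) for g in groups])
    N = n.sum()
    grand = (n * m).sum() / N
    c = 1.0 - n / N
    num = (n * (m - grand) ** 2).sum()        # between-groups sum of squares
    den = (c * s2).sum()                      # heteroscedastic denominator
    f_stat = num / den
    df2 = den ** 2 / ((c * s2) ** 2 / (n - 1)).sum()  # Satterthwaite df
    return f_stat, df2

g1 = [4.1, 5.2, 6.3, 5.0, 4.4]
g2 = [5.1, 5.3, 4.9, 5.2]
g3 = [7.8, 9.1, 8.4, 8.8, 9.5, 7.9]
f_stat, df2 = brown_forsythe([g1, g2, g3])
```

A p-value would then be obtained from the F distribution with (k − 1, `df2`) degrees of freedom in the classical version; Mehrotra's modification replaces k − 1 with a second Satterthwaite approximation.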


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号