Similar Literature
20 similar documents found.
1.
Computer experiments using deterministic simulators are sometimes used to replace or supplement physical system experiments. This paper compares designs for an initial computer simulator experiment based on empirical prediction accuracy; it recommends designs for producing accurate predictions. The basis for the majority of the designs compared is the integrated mean squared prediction error (IMSPE) that is computed assuming a Gaussian process model with a Gaussian correlation function. Designs that minimize the IMSPE with respect to a fixed set of correlation parameters as well as designs that minimize a weighted IMSPE over the correlation parameters are studied. These IMSPE-based designs are compared with three widely-used space-filling designs. The designs are used to predict test surfaces representing a range of stationary and non-stationary functions. For the test conditions examined in this paper, the designs constructed under IMSPE-based criteria are shown to outperform space-filling Latin hypercube designs and maximum projection designs when predicting smooth functions of stationary appearance, while space-filling and maximum projection designs are superior for test functions that exhibit strong non-stationarity.
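
For orientation, a standard form of the criterion named in this abstract (our notation; the paper may scale or weight it differently): for an n-point design X in [0,1]^d under a Gaussian process with the Gaussian correlation function, the IMSPE integrates the normalized mean squared prediction error of the best linear unbiased predictor over the input region:

    \mathrm{IMSPE}(X; \theta) = \int_{[0,1]^d} \frac{\mathrm{MSPE}(x; X, \theta)}{\sigma^2}\, dx,
    \qquad
    R(x, x'; \theta) = \exp\!\Big(-\textstyle\sum_{j=1}^{d} \theta_j (x_j - x'_j)^2\Big).

Minimizing IMSPE at a fixed theta gives the locally optimal designs of the abstract; the weighted variant averages this criterion over a distribution on theta.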

2.
We investigate a space-filling criterion based on L2-type discrepancies, namely the uniform projection criterion, aiming at improving designs' two-dimensional projection uniformity. Under a general reproducing kernel, we establish a formula for the uniform projection criterion function, which builds a connection between the rows and columns of the design. For the commonly used discrepancies, we further use this formula to represent two-dimensional projection uniformity in terms of the Lp-distances of U-type designs. These results generalize existing work and reveal new links between the two seemingly unrelated criteria of projection uniformity and maximin Lp-distance for U-type designs. We also apply the obtained results to study several families of space-filling designs with appealing projection uniformity. Because of their good projected space-filling properties, these designs are well suited to computer experiments, especially when not all input factors are active.
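
One concrete and widely used instance of the uniform projection criterion averages the centered L2-discrepancy (CD2) over all two-dimensional projections of the design. A minimal sketch (function names ours; the design is assumed to be an n x m array of integer levels 0, ..., n-1):

    import itertools
    import numpy as np

    def centered_l2_discrepancy(x):
        """Squared centered L2-discrepancy (Hickernell) of points x in [0,1]^s."""
        n, s = x.shape
        a = np.abs(x - 0.5)
        term1 = (13.0 / 12.0) ** s
        term2 = (2.0 / n) * np.sum(np.prod(1.0 + 0.5 * a - 0.5 * a**2, axis=1))
        diff = np.abs(x[:, None, :] - x[None, :, :])     # pairwise |x_ik - x_jk|
        ai, aj = a[:, None, :], a[None, :, :]
        term3 = np.sum(np.prod(1.0 + 0.5 * ai + 0.5 * aj - 0.5 * diff, axis=2)) / n**2
        return term1 - term2 + term3

    def uniform_projection_criterion(design):
        """Average squared CD2 over all two-dimensional projections."""
        n, m = design.shape
        x = (design + 0.5) / n          # map levels 0..n-1 to centered points in (0,1)
        pairs = itertools.combinations(range(m), 2)
        return np.mean([centered_l2_discrepancy(x[:, list(p)]) for p in pairs])

Smaller values indicate better two-dimensional projection uniformity; the paper's formula relates this quantity to the Lp-distances between design rows without enumerating projections.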

3.
Statistics, 2012, 46(6): 1357-1385
The early stages of many real-life experiments involve a large number of factors, among which only a few are active. Unfortunately, the optimal full-dimensional designs for those early stages may have poor low-dimensional projections, and experimenters do not know which factors will turn out to be important before conducting the experiment. Designs with good projections are therefore desirable for factor screening. This raises several questions: do optimal full-dimensional designs have good projections onto low dimensions? How can experimenters measure the goodness of a full-dimensional design through all of its projections? And are there links between the optimality of a full-dimensional design and the optimality of its projections? Through theoretical justifications, this paper answers these questions by investigating the construction of optimal (average) projection designs for screening either nominal or quantitative factors. The main results show that: under the aberration and orthogonality criteria, a full-dimensional design is optimal if and only if it is an optimal projection design; a full-dimensional design is optimal under aberration and orthogonality if and only if it is a uniform projection design; a uniform full-dimensional design is not guaranteed to be an optimal projection design under any criterion; a projection design that is optimal under any one of the aberration, orthogonality, and uniformity criteria is optimal under all of them; and saturated orthogonal designs have the same average projection performance.

4.
A procedure for using a random sample to estimate the entropy of the sampled distribution is developed. It is based on one or two power transformations to a symmetric distribution which has the maximum entropy in a certain class. The procedure's potential for removing strong negative bias in previously proposed entropy estimators is demonstrated by the results of a Monte Carlo study.
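
For context, one classical sample-based entropy estimator of the kind whose negative bias such procedures target is Vasicek's spacing estimator (our own illustration, not the paper's estimator; assumes a sample without ties):

    import numpy as np

    def vasicek_entropy(sample, m=None):
        """Vasicek's sample-spacing estimator of differential entropy."""
        x = np.sort(np.asarray(sample, dtype=float))
        n = len(x)
        if m is None:
            m = max(1, int(round(np.sqrt(n))))        # a common window choice
        upper = x[np.minimum(np.arange(n) + m, n - 1)]  # X_(i+m), truncated at X_(n)
        lower = x[np.maximum(np.arange(n) - m, 0)]      # X_(i-m), truncated at X_(1)
        return np.mean(np.log(n * (upper - lower) / (2 * m)))

For a standard normal sample the true value is 0.5 * log(2 * pi * e), about 1.42; spacing estimators of this type are biased downward, which is the bias the paper's transformation-based procedure is designed to remove.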

5.
Space-filling designs are commonly used for selecting the input values of time-consuming computer codes. The computer-experiment context imposes two constraints on the design. First, the design points should be evenly spread throughout the experimental region; a space-filling criterion (for instance, the maximin distance) is used to build optimal designs. Second, the design should avoid replication when the points are projected onto a subset of input variables (non-collapsing); the Latin hypercube structure is often enforced to ensure good projective properties. In this paper, a space-filling criterion based on the Kullback–Leibler information is used to build a new class of Latin hypercube designs. The new designs are compared with several traditional optimal Latin hypercube designs and appear to perform well.
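
A minimal sketch of the optimal-Latin-hypercube idea, using the maximin distance mentioned above as the criterion (a simple exchange heuristic of our own, not the KL-based construction of the paper):

    import numpy as np

    def min_interpoint_distance(x):
        d = np.sqrt(((x[:, None, :] - x[None, :, :]) ** 2).sum(-1))
        return d[np.triu_indices(len(x), k=1)].min()

    def maximin_lhd(n, k, iters=5000, seed=0):
        """Greedy within-column swap search for a maximin Latin hypercube design."""
        rng = np.random.default_rng(seed)
        levels = np.array([rng.permutation(n) for _ in range(k)]).T   # n x k LHD
        x = (levels + 0.5) / n
        best = min_interpoint_distance(x)
        for _ in range(iters):
            col = rng.integers(k)
            i, j = rng.choice(n, size=2, replace=False)
            x[[i, j], col] = x[[j, i], col]           # swap keeps the LHD structure
            score = min_interpoint_distance(x)
            if score >= best:
                best = score
            else:
                x[[i, j], col] = x[[j, i], col]       # revert a worsening swap
        return x, best

Because swaps happen within a column, every iterate remains a Latin hypercube, so the non-collapsing constraint is satisfied by construction; the paper's designs replace the maximin score with a Kullback–Leibler information criterion.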

6.
This paper demonstrates how to plan a contingent valuation experiment to assess the value of ecologically produced clothes. First, an appropriate statistical model (the trinomial spike model), describing the probability that a randomly selected individual will accept any positive bid and, if so, will accept the bid A, is defined. Secondly, an optimization criterion that is a function of the variances of the parameter estimators is chosen. However, the variances of the parameter estimators in this model depend on the true parameter values. Pilot-study data are therefore used to obtain estimates of the parameter values, and a locally optimal design is found. Because this design is optimal only if the estimated parameter values are correct, a design that minimizes the maximum of the criterion function over a plausible parameter region (i.e. a minimax design) is then found.

7.
Suppose independent random samples are available from two normal populations with a common mean and unequal variances. Estimation of a quantile of the first population is considered with respect to the quadratic loss. Some new estimators for the quantile are proposed using some previously known estimators of a common mean. Inadmissibility results are proved for estimators which are equivariant under affine and location groups of transformations. Risk values of various estimators of a quantile are compared numerically using a detailed simulation study.
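
To fix ideas (notation ours; standard background rather than the paper's development): the p-th quantile of the first population N(mu, sigma_1^2) is theta_p = mu + z_p sigma_1, where z_p is the standard normal quantile, so any estimator of the common mean yields a plug-in quantile estimator. One classical common-mean estimator is the Graybill–Deal estimator:

    \hat{\mu}_{GD} = \frac{n_1 \bar{X}_1 / S_1^2 + n_2 \bar{X}_2 / S_2^2}{n_1 / S_1^2 + n_2 / S_2^2},
    \qquad
    \hat{\theta}_p = \hat{\mu}_{GD} + z_p S_1 .

The estimators studied in the paper are built from previously known common-mean estimators of this kind.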

8.
Maximin distance designs are useful for conducting expensive computer experiments. In this article, we compare some global optimization algorithms for constructing such designs. We also introduce several related space-filling designs, including nested maximin distance designs, sliced maximin distance designs, and general maximin distance designs with better projection properties. These designs possess more flexible structures than their analogs in the literature. Examples of these designs constructed by the algorithms are presented.

9.
Single-value design optimality criteria are often considered when selecting a response surface design. An alternative to a single-value criterion is to evaluate prediction variance properties throughout the experimental region and to display the results graphically in a variance dispersion graph (VDG) (Giovannitti-Jensen and Myers, 1989). Three properties of interest are the spherical average, maximum, and minimum prediction variances. Currently, a computer-intensive optimization algorithm is used to evaluate these prediction variance properties. It will be shown that the average, maximum, and minimum spherical prediction variances for central composite designs and Box-Behnken designs can be derived analytically. These three prediction variances can be expressed as functions of the radius and the design parameters. These functions provide exact spherical prediction variance values, eliminating the need for extensive computation with algorithms that do not guarantee convergence. This research is concerned with the theoretical development of these analytical forms. Results are presented for hyperspherical and hypercuboidal regions.
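
A brute-force numerical check of the quantities involved (our own sketch; the paper derives them in closed form): the scaled prediction variance at x is v(x) = N f(x)' (X'X)^{-1} f(x), and a VDG tracks its average, maximum, and minimum over spheres of increasing radius r.

    import numpy as np

    def second_order_model(x):
        """Model vector f(x) for a full second-order model in two factors."""
        x1, x2 = x
        return np.array([1, x1, x2, x1 * x2, x1**2, x2**2])

    # A k = 2 rotatable central composite design: 4 factorial, 4 axial, 4 center runs.
    alpha = np.sqrt(2)
    design = np.vstack([
        [[a, b] for a in (-1, 1) for b in (-1, 1)],
        [[alpha, 0], [-alpha, 0], [0, alpha], [0, -alpha]],
        np.zeros((4, 2)),
    ])
    X = np.array([second_order_model(x) for x in design])
    N = len(design)
    XtX_inv = np.linalg.inv(X.T @ X)

    def spherical_spv(r, n_dirs=2000, seed=0):
        """Monte Carlo average/max/min scaled prediction variance on the sphere |x| = r."""
        rng = np.random.default_rng(seed)
        u = rng.normal(size=(n_dirs, 2))
        u = r * u / np.linalg.norm(u, axis=1, keepdims=True)
        v = np.array([N * second_order_model(x) @ XtX_inv @ second_order_model(x) for x in u])
        return v.mean(), v.max(), v.min()

The analytic expressions in the paper return these three values exactly as functions of r and the design parameters, without sampling.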

10.
Quantile regression has gained increasing popularity because it provides richer information than regular mean regression, and variable selection plays an important role in quantile regression model building, improving prediction accuracy by choosing an appropriate subset of predictors. Unlike traditional quantile regression, we treat the quantile as an unknown parameter and estimate it jointly with the regression coefficients. In particular, we adopt the Bayesian adaptive Lasso for maximum entropy quantile regression. A flat prior is chosen for the quantile parameter due to the lack of information about it. The proposed method not only addresses the question of which quantile is the most probable among all candidates, but also reflects the inner structure of the data through the estimated quantile. We develop an efficient Gibbs sampler and show through simulation studies and a real data analysis that the proposed method performs better than the Bayesian adaptive Lasso and Bayesian Lasso.
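
For reference, the standard machinery behind Bayesian quantile regression (background, not specific to this paper): the check loss and its likelihood counterpart, the asymmetric Laplace working density, under which maximizing the likelihood in beta at a fixed quantile tau is equivalent to minimizing the check loss:

    \rho_\tau(u) = u\big(\tau - I(u < 0)\big),
    \qquad
    f(y \mid x, \beta, \tau) \propto \tau(1 - \tau)\,\exp\!\big(-\rho_\tau(y - x^{\top}\beta)\big).

Treating tau itself as unknown with a flat prior on (0, 1), as the abstract describes, adds one further block to the Gibbs sampler.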

11.
In this article, we propose a resampling method based on perturbing the estimating functions to compute the asymptotic variances of quantile regression estimators under the missing-at-random condition. We prove that the conditional distributions of the resampled estimators are asymptotically equivalent to the distributions of the quantile regression estimators. Our method can handle complex situations in which the response and some covariates are missing. Numerical results based on simulated and real data are provided under several designs.
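
A minimal sketch of the general perturbation-resampling idea, without the missing-data machinery of the paper (our own illustration, with made-up function names): each replicate multiplies the summands of the quantile objective by i.i.d. mean-one positive weights and re-solves, and the spread of the replicated estimates approximates the sampling variability.

    import numpy as np
    from scipy.optimize import minimize

    def check_loss(u, tau):
        return u * (tau - (u < 0))

    def perturbed_qr(x, y, tau=0.5, n_rep=200, seed=0):
        """Perturbation resampling for a simple quantile regression (intercept + slope)."""
        rng = np.random.default_rng(seed)
        X = np.column_stack([np.ones_like(x), x])

        def fit(weights):
            obj = lambda b: np.sum(weights * check_loss(y - X @ b, tau))
            return minimize(obj, x0=np.zeros(2), method="Nelder-Mead").x

        beta_hat = fit(np.ones(len(y)))                       # unweighted point estimate
        reps = np.array([fit(rng.exponential(1.0, size=len(y))) for _ in range(n_rep)])
        return beta_hat, reps.std(axis=0)                     # estimate and resampling SEs

The key property the paper proves, in this simplified setting, is that the conditional spread of the weighted re-fits matches the asymptotic variance of the original estimator.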

12.
Tukey's gh distribution is widely used in situations where skewness and elongation are important features of the data. Because the distribution is defined through a quantile transformation of the normal, the likelihood function cannot be written in closed form and exact maximum likelihood estimation is infeasible. In this paper we exploit a novel approach, based on a frequentist reinterpretation of Approximate Bayesian Computation, for approximating the maximum likelihood estimates of the gh distribution. This method is appealing because it only requires the ability to sample from the distribution. We discuss the choice of the input parameters by means of simulation experiments and provide evidence of superior performance, in terms of root mean squared error, with respect to the standard quantile estimator. Finally, we give an application to operational risk measurement.
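
The sampling step the method relies on is simple because the distribution is defined by a quantile transformation of a standard normal Z (parameter names ours; g = 0 is handled as the limiting case):

    import numpy as np

    def sample_gh(n, a=0.0, b=1.0, g=0.1, h=0.1, seed=0):
        """Draw from Tukey's gh distribution via its defining transformation."""
        z = np.random.default_rng(seed).normal(size=n)
        if g == 0:
            core = z                          # limit of (exp(g z) - 1) / g as g -> 0
        else:
            core = (np.exp(g * z) - 1.0) / g  # g controls skewness
        return a + b * core * np.exp(h * z**2 / 2.0)   # h controls elongation

An ABC-style fit then compares summaries (for example, quantiles) of simulated and observed samples across candidate parameter values; since only sampling is required, no closed-form likelihood is needed.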

13.
The paper investigates random processes of geometrical objects in Euclidean spaces. General properties of the measure of total projections are derived by means of the Palm distribution. Explicit formulas for variances of the projection measure are obtained for Poisson point processes of compact sets.

Intensity estimators of fibre (surface) processes are then studied by means of projection measures. Classification of direct and indirect probes is introduced. The indirect sampling design of vertical sections and projections is generalized and its statistical properties derived.

14.
Monotonic transformations of explanatory continuous variables are often used to improve the fit of the logistic regression model to the data. However, the impact of such transformations has not been studied analytically. In this paper, we study invariance properties of the logistic regression model under monotonic transformations. We prove that the maximum likelihood estimates, information value, mutual information, Kolmogorov–Smirnov (KS) statistics, and lift table are all invariant under certain monotonic transformations.
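
A quick numerical illustration of the rank-based part of this claim (our own example, not from the paper): the two-sample KS statistic comparing scores of events versus non-events depends only on ranks, so any strictly increasing transformation of the score leaves it unchanged.

    import numpy as np
    from scipy.stats import ks_2samp

    rng = np.random.default_rng(0)
    score = rng.normal(size=1000)
    y = rng.binomial(1, 1 / (1 + np.exp(-score)))    # outcomes driven by the score

    ks_raw = ks_2samp(score[y == 1], score[y == 0]).statistic
    ks_mono = ks_2samp(np.exp(score[y == 1]), np.exp(score[y == 0])).statistic
    print(ks_raw, ks_mono)    # identical, since exp() is strictly increasing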

15.
We formulate, in a reasonable sense, a class of optimality functionals for comparing the feasible statistical designs available in a given setup. It is desired that the optimality functionals reflect symmetric measures of the lack of information contained in the designs being compared. In view of this, Kiefer's (1975) universal optimality criterion is seen to rest on stringent conditions, some of which can be relaxed while preserving optimality (in an extended sense) of the so-called balanced designs.

16.
Experiments that study complex real-world systems in business, engineering, and science can be conducted at different levels of accuracy or sophistication. Nested space-filling designs are suitable for such multi-fidelity experiments. In this paper, we propose a systematic method for constructing nested space-filling designs for experiments with two levels of accuracy. The method, which makes use of nested difference matrices, is easy to carry out; many nested space-filling designs for two-accuracy experiments can thus be constructed, and the resulting designs achieve stratification in low dimensions. In addition, the proposed method can also be used to obtain sliced space-filling designs for computer experiments with both qualitative and quantitative factors.

17.
Problems involving high-dimensional data, such as pattern recognition, image analysis, and gene clustering, often require a preliminary step of dimension reduction before or during statistical analysis. If one restricts attention to linear techniques for dimension reduction, the remaining issue is the choice of the projection. This choice can be dictated by the desire to maximize certain statistical criteria, including variance, kurtosis, sparseness, and entropy, of the projected data. Motivation for such criteria comes from past empirical studies of the statistics of natural and urban images. We present a geometric framework for finding projections that are optimal for obtaining certain desired statistical properties. Our approach is to define an objective function on spaces of orthogonal linear projections (Stiefel and Grassmann manifolds) and to use gradient techniques to optimize that function, exploiting the geometries of these manifolds. Experimental results are presented to demonstrate these ideas for natural and facial images.
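
A minimal sketch of gradient optimization on the Stiefel manifold, using projected variance as the simplest of the criteria listed above (our own illustration; the paper's framework covers more general objectives and Grassmann geometry):

    import numpy as np

    def stiefel_gradient_ascent(data, p=2, lr=0.1, iters=200, seed=0):
        """Maximize tr(U' S U) over orthonormal d x p frames U (the Stiefel manifold)."""
        rng = np.random.default_rng(seed)
        S = np.cov(data, rowvar=False)
        U, _ = np.linalg.qr(rng.normal(size=(data.shape[1], p)))
        for _ in range(iters):
            grad = 2.0 * S @ U                    # Euclidean gradient of tr(U' S U)
            grad -= U @ (U.T @ grad)              # project onto the tangent space at U
            U, _ = np.linalg.qr(U + lr * grad)    # QR retraction back to the manifold
        return U

For this particular objective the iterates converge to the leading principal subspace, which makes it easy to check the manifold machinery before swapping in kurtosis-, sparseness-, or entropy-based objectives.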

18.
In this article, we propose a new regression method called general composite quantile regression (GCQR), which relaxes the unrealistic finite-error-variance assumption imposed by the traditional least squares (LS) method. Unlike the recently proposed composite quantile regression (CQR) method, our GCQR allows any continuous non-uniform density/weight function. As a result, determination of the number of uniform quantile positions is not required. Most importantly, the proposed GCQR criterion can be readily transformed into a linear programming problem, which substantially reduces the computing time. Our theoretical and empirical results show that the GCQR is generally more efficient than the CQR and LS if the weight function is appropriately chosen. The oracle properties of the penalized GCQR are also provided. Our simulation results are consistent with the derived theoretical findings. A real data example is analyzed to demonstrate our methodologies.
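
The linear-programming reduction mentioned here is the standard one for check-loss objectives; a sketch for a single quantile tau (variable names ours), writing the residual as a difference of nonnegative parts u - v:

    import numpy as np
    from scipy.optimize import linprog

    def quantile_regression_lp(X, y, tau):
        """Solve min_b sum_i rho_tau(y_i - x_i' b) as a linear program."""
        n, k = X.shape
        # decision variables: [beta (free), u >= 0, v >= 0]
        c = np.concatenate([np.zeros(k), tau * np.ones(n), (1 - tau) * np.ones(n)])
        A_eq = np.hstack([X, np.eye(n), -np.eye(n)])    # X b + u - v = y
        bounds = [(None, None)] * k + [(0, None)] * (2 * n)
        res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
        return res.x[:k]

The composite criterion stacks such programs over several quantile positions with weights; the GCQR of the abstract chooses those weights from a continuous non-uniform density, and the combined problem remains a linear program.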

19.
Two-level designs are useful for examining a large number of factors in an efficient manner. It is typically anticipated that only a few factors will be identified as important. The results can then be reanalyzed using a projection of the original design into the space of the factors that matter. An interesting question is how many intrinsically different types of projections are possible from a given initial design. We examine this question here for the Plackett and Burman screening series with N = 12, 20, and 24 runs and projected dimensions k ≤ 5. As a characterization criterion, we look at the number of repeat and mirror-image runs in the projections. The idea can be applied to any two-level design projected into fewer dimensions.
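
The bookkeeping behind this characterization is easy to reproduce (our own sketch; the 12-run Plackett–Burman design below is built from what we believe is the standard cyclic generator): project onto k columns, then count repeated runs and mirror-image pairs.

    import itertools
    from collections import Counter
    import numpy as np

    # 12-run Plackett-Burman design: 11 cyclic shifts of the generator plus a row of -1s.
    gen = np.array([1, 1, -1, 1, 1, 1, -1, -1, -1, 1, -1])
    D = np.array([np.roll(gen, i) for i in range(11)] + [-np.ones(11, dtype=int)])

    def repeat_and_mirror_counts(design, cols):
        """Count repeated runs and mirror-image pairs in a k-factor projection."""
        proj = [tuple(r) for r in design[:, list(cols)]]
        repeats = len(proj) - len(set(proj))
        mirrors = sum(tuple(-v for v in a) == b
                      for a, b in itertools.combinations(proj, 2))
        return repeats, mirrors

    # classify all 3-factor projections by their (repeat, mirror) signature
    sig = Counter(repeat_and_mirror_counts(D, c)
                  for c in itertools.combinations(range(11), 3))
    print(sig)

Grouping projections by this signature gives the count of intrinsically different projection types for each k.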

20.
A maximum estimability (maxest) criterion is proposed for design classification and selection. It is an extension and refinement of Webb's resolution criterion for general factorial designs. By using the estimability vector associated with the maxest criterion, projective properties of nonregular designs are studied from the estimability perspective. Comparisons with other criteria are also discussed.
