Similar documents
Found 20 similar documents (search time: 46 ms)
1.
Duncan's economic model of Shewhart's original x̄ chart established its optimal and economical application to processes with a Markovian failure characteristic. When the sample statistic shows signs of process variation, variable-sampling-interval (VSI) control charts sample more frequently and therefore perform more effectively than fixed-sampling-interval (FSI) charts. Most studies of the economic design of control charts have been devoted to the FSI scheme. Bai and Lee (1998) considered a production process with a single assignable cause and proposed an economic VSI design for a general x̄ control chart. In real production processes, however, there are multiple assignable causes. Reflecting the operating characteristics of real industry, this research therefore develops an economic model for the VSI control chart with multiple assignable causes, based on stochastic and statistical theory, and determines the optimal design parameters of the chart. A numerical example demonstrates the effectiveness of the proposed model; the results indicate that a VSI chart performs more effectively than an FSI control chart.
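A minimal simulation sketch of the VSI mechanism for an x̄ chart, under assumed (not economically optimized) warning/control limits and sampling intervals:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative VSI x-bar chart parameters (assumed, not optimized).
n = 5                         # subgroup size
k = 3.0                       # control limit in units of sigma/sqrt(n)
w = 1.0                       # warning limit separating the two intervals
h_long, h_short = 2.0, 0.25   # long/short sampling intervals (hours)

def time_to_signal(shift=0.5, mu0=0.0, sigma=1.0, max_samples=500):
    """Simulate the chart on a shifted process; return elapsed time until
    it signals.  The next interval is short when the current point falls
    in the warning zone, long otherwise."""
    t, interval = 0.0, h_long
    se = sigma / np.sqrt(n)
    for _ in range(max_samples):
        t += interval
        z = (rng.normal(mu0 + shift, se) - mu0) / se
        if abs(z) > k:
            return t                          # out-of-control signal
        interval = h_short if abs(z) > w else h_long
    return float("inf")

print("mean time to signal:", np.mean([time_to_signal() for _ in range(2000)]))
```

Shortening the interval after a warning-zone point is what gives VSI its detection advantage over FSI at a comparable average sampling cost.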

2.
Srivastava (1980) has shown that Grubbs's (1950) test for a univariate outlier is robust against the effect of equicorrelation. In this note we extend Srivastava's result by giving a more general covariance structure, which relaxes both the covariance structure and the assumption of equal variances. We also show that under the more general covariance structure, the power of Grubbs's test, as well as the significance level, is identical to the independently and identically distributed case.
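A small simulation sketch of the robustness result: under equicorrelation generated by a common random component (the value ρ = 0.5 and sample size are arbitrary choices), the Grubbs statistic is unchanged, so its null distribution matches the i.i.d. one.

```python
import numpy as np

rng = np.random.default_rng(1)

def grubbs_stat(x):
    """Grubbs's (1950) outlier statistic: max |x_i - xbar| / s."""
    return np.max(np.abs(x - x.mean())) / x.std(ddof=1)

n, reps, rho = 20, 5000, 0.5

# i.i.d. N(0,1) samples
g_iid = [grubbs_stat(rng.normal(size=n)) for _ in range(reps)]

# equicorrelated samples: a shared component induces correlation rho
g_eq = []
for _ in range(reps):
    common = rng.normal() * np.sqrt(rho)
    g_eq.append(grubbs_stat(common + rng.normal(size=n) * np.sqrt(1 - rho)))

# The statistic is invariant to the common shift and scale, so the
# simulated 95% critical values agree up to Monte Carlo error.
print(np.quantile(g_iid, 0.95), np.quantile(g_eq, 0.95))
```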

3.
We consider a general design that allows different patterns, or sets, of data items to be collected from different sample units, which we call a Split Questionnaire Design (SQD). While SQDs have historically been used to accommodate constraints on respondent burden, this paper shows they can also be an efficient design option. The efficiency of a design can be measured by the cost required to meet constraints on the accuracy of estimates. Moreover, this paper shows how an SQD provides considerable flexibility when exploring the balance between a design's efficiency and the burden it places on respondents. The targets of interest to the design are analytic parameters, such as regression coefficients. Empirical results show that SQDs are worth considering.

4.
Traditional studies of optimal designs for ANOVA parameter estimation assume that each factor's levels appear with equal probability. This premise does not hold in a variety of experimental problems, so it is of theoretical and practical interest to investigate optimal designs for parameters whose levels appear with unequal odds. In this paper, we propose a general orthogonal design via matrix image, in which the matrix images of all columns are orthogonal to each other. Our main results show that such designs are A- and E-optimal for the estimation of ANOVA parameters with unequal appearing odds. In addition, we develop two simple methods to construct the proposed designs. The optimality of the design is also validated by a simulation study.

5.
With growing interest in using non-representative samples to train prediction models for numerous outcomes, it is necessary to account for the sampling design that gives rise to the data in order to assess the generalized predictive utility of a proposed prediction rule. After learning a prediction rule from a non-uniform sample, it is of interest to estimate the rule's error rate when applied to unobserved members of the population. Efron (1986) proposed a general class of covariance-penalty-inflated prediction error estimators that assume the available training data are representative of the target population to which the prediction rule will be applied. We extend Efron's estimator to the complex-sample context by incorporating Horvitz–Thompson sampling weights and show that it is consistent for the true generalization error rate with respect to the underlying superpopulation. The resulting Horvitz–Thompson–Efron estimator is equivalent to dAIC, a recent extension of Akaike's information criterion to survey-sampling data, but is more widely applicable. The proposed methodology is assessed with simulations and is applied to models predicting renal function fitted to data from the large-scale National Health and Nutrition Examination Survey. The Canadian Journal of Statistics 48: 204–221; 2020 © 2019 Statistical Society of Canada
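A minimal sketch of the reweighting idea, assuming known inclusion probabilities and ignoring the covariance-penalty optimism correction that the paper develops (all data below are simulated):

```python
import numpy as np

rng = np.random.default_rng(2)

# Superpopulation and a non-uniform sample: inclusion probability pi_i
# grows with |x|, so the sample over-represents extreme x values.
N = 100_000
x = rng.normal(size=N)
y = 1.0 + 2.0 * x + rng.normal(size=N)
pi = np.clip(0.001 + 0.01 * np.abs(x), 0.0, 1.0)
sampled = rng.random(N) < pi
xs, ys, ws = x[sampled], y[sampled], 1.0 / pi[sampled]   # HT weights

# Weighted least squares fit on the sample
X = np.column_stack([np.ones(xs.size), xs])
beta = np.linalg.lstsq(X * ws[:, None] ** 0.5, ys * ws ** 0.5, rcond=None)[0]

# Horvitz-Thompson weighted squared-error risk vs. the true
# superpopulation risk of the same fitted rule
ht_risk = np.sum(ws * (ys - X @ beta) ** 2) / np.sum(ws)
pop_risk = np.mean((y - (beta[0] + beta[1] * x)) ** 2)
print(ht_risk, pop_risk)
```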

6.
Emmanuel Caron, Statistics, 2019, 53(4): 885–902
In this paper, we consider the usual linear regression model in the case where the error process is assumed strictly stationary. We use a result of Hannan (Central limit theorems for time series regression. Probab Theory Relat Fields. 1973;26(2):157–170), who proved a central limit theorem for the usual least squares estimator under general conditions on the design and on the error process. For any design satisfying Hannan's conditions, we define an estimator of the covariance matrix and prove its consistency under very mild conditions. As an application, we show how to modify the usual tests on the linear model in this dependent context so that the type-I error rate remains asymptotically correct, and we illustrate the performance of this procedure through several sets of simulations.
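A sketch of the practical upshot, substituting statsmodels' HAC (Newey–West) covariance for the paper's own covariance estimator; the AR(1) processes and lag choice are illustrative assumptions.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)

def ar1(n, phi, rng):
    """Generate a strictly stationary AR(1) path."""
    x = np.zeros(n)
    x[0] = rng.normal() / np.sqrt(1 - phi**2)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal()
    return x

# Regressor and error both autocorrelated: the case where naive i.i.d.
# standard errors fail and a dependence-aware covariance is needed.
n = 500
X = sm.add_constant(ar1(n, 0.8, rng))
y = X @ np.array([1.0, 0.0]) + ar1(n, 0.6, rng)   # true slope is 0

naive = sm.OLS(y, X).fit()                                   # i.i.d. SEs
hac = sm.OLS(y, X).fit(cov_type="HAC", cov_kwds={"maxlags": 10})

# The naive slope test over-rejects under dependence; the corrected
# test keeps the type-I error rate asymptotically valid.
print("naive p-value:", naive.pvalues[1])
print("HAC p-value:  ", hac.pvalues[1])
```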

7.
We call a sample design that allows different patterns, or sets, of data items to be collected from different sample units a Split Questionnaire Design (SQD). SQDs can be thought of as incorporating missing data into survey design. This paper examines the situation where data not collected by an SQD can be treated as Missing Completely At Random or Missing At Random, and where the targets are regression coefficients in a generalised linear model fitted to binary variables and are estimated by maximum likelihood. A key finding is that the relative contribution of a respondent to the accuracy of the estimated model parameters can be measured easily before collecting all of the respondent's model covariates. We show empirically and theoretically that a significant reduction in respondent burden can be achieved, with negligible impact on the accuracy of estimates, by not collecting model covariates from respondents identified as contributing little to that accuracy. We discuss the general implications for SQDs.
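A hypothetical sketch of such a screening rule: score respondents by their expected Fisher-information contribution to a logistic regression using only cheap, already-collected covariates. The pilot coefficients and the 20% cut-off are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

def info_contribution(x_row, beta):
    """Fisher information contribution p(1-p) x x' of one respondent
    to a logistic regression with coefficients beta."""
    p = 1.0 / (1.0 + np.exp(-x_row @ beta))
    return p * (1 - p) * np.outer(x_row, x_row)

beta_pilot = np.array([0.2, 1.0, -0.5])   # assumed pilot-study fit
X_cheap = np.column_stack([np.ones(1000), rng.normal(size=(1000, 2))])

# Rank respondents by how much they would add to total information,
# then administer the full questionnaire only to the top 20%.
scores = [np.trace(info_contribution(x, beta_pilot)) for x in X_cheap]
ask_full = np.argsort(scores)[-200:]
print(len(ask_full), "respondents selected for the full questionnaire")
```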

8.
Simpson's paradox is a challenging topic to teach in an introductory statistics course. To motivate students to understand this paradox both intuitively and statistically, this article introduces several new ways to teach Simpson's paradox. We design a paper toss activity between instructors and students in class to engage students in the learning process. We show that Simpson's paradox widely exists in basketball statistics, and thus instructors may consider looking for Simpson's paradox in their own school basketball teams as examples to motivate students' interest. A new probabilistic explanation of Simpson's paradox is provided, which helps foster students' statistical understanding. Supplementary materials for this article are available online.
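A toy numerical instance in the article's basketball spirit (the shooting figures are made up): player A shoots better in each half of the season, yet player B has the better overall percentage.

```python
# Simpson's paradox: A beats B in both halves, B wins overall, because
# each player's attempts are concentrated in different halves.
made = {"A": (4, 25), "B": (35, 2)}       # makes in (first, second) half
att  = {"A": (10, 100), "B": (100, 10)}   # attempts in each half

for p in ("A", "B"):
    h1 = made[p][0] / att[p][0]
    h2 = made[p][1] / att[p][1]
    overall = sum(made[p]) / sum(att[p])
    print(p, f"half1={h1:.3f}", f"half2={h2:.3f}", f"overall={overall:.3f}")
# A: half1=0.400  half2=0.250  overall=0.264
# B: half1=0.350  half2=0.200  overall=0.336
```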

9.
This article estimates the speed-of-adjustment coefficient in structural error-correction models. We use a system method for the real exchange rates of traded and nontraded goods, combining a single-equation method with Hansen and Sargent's instrumental-variables methods for linear rational-expectations models. We apply these methods to a modified version of Mussa's model. Our results show that the half-lives of purchasing power parity deviations for the rates of traded goods are less than one year and, in most cases, shorter than those for the general price level and for nontraded goods, implying a faster adjustment speed to parity.
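For intuition, if a PPP deviation follows an AR(1) process q_t = ρ q_{t-1} + ε_t, its half-life is ln(1/2)/ln(ρ). The monthly persistence values below are illustrative, not the paper's estimates.

```python
import numpy as np

# Half-life of an AR(1) deviation as a function of the persistence rho.
for rho in (0.94, 0.97, 0.99):
    half_life_months = np.log(0.5) / np.log(rho)
    print(f"rho={rho}: half-life = {half_life_months:.1f} months "
          f"({half_life_months / 12:.2f} years)")
# rho=0.94 gives about 11 months (under one year), the fast adjustment
# typical of traded goods; larger rho gives the slower adjustment
# associated with nontraded goods and general price indices.
```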

10.
Existing projection designs (e.g. maximum projection designs) attempt to achieve good space-filling properties in all projections. However, when a Gaussian process (GP) is used, model-based design criteria such as the entropy criterion are more appropriate. We employ the entropy criterion averaged over a set of projections, called the expected entropy criterion (EEC), to generate projection designs. We show that maximum EEC designs are invariant to monotonic transformations of the response, i.e. they are optimal for a wide class of stochastic process models. We also demonstrate that transforming each column of a Latin hypercube design (LHD) with a monotonic function can substantially improve the EEC. Two types of input transformations are considered: the quantile function of a symmetric Beta distribution chosen to optimize the EEC, and a nonparametric transformation corresponding to the quantile function of a symmetric density chosen to optimize the EEC. Numerical studies show that the proposed transformations of the LHD are efficient and effective for building robust maximum EEC designs. These designs give projections with markedly higher entropies and lower maximum prediction variances (MPVs), at the cost of small increases in average prediction variances (APVs), compared with state-of-the-art space-filling designs over wide ranges of covariance parameter values.
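A minimal sketch of the criterion itself, assuming a Gaussian correlation function with an arbitrary range parameter: the GP entropy of a design is, up to constants, the log-determinant of its correlation matrix, and the EEC averages this over low-dimensional projections.

```python
import numpy as np
from itertools import combinations
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(5)

def log_det_corr(D, theta=5.0):
    """log-determinant of a Gaussian correlation matrix for design D;
    theta is an assumed correlation range parameter."""
    R = np.exp(-theta * squareform(pdist(D)) ** 2)
    return np.linalg.slogdet(R + 1e-10 * np.eye(len(D)))[1]

def eec(D, proj_dim=2):
    """Average the entropy criterion over all proj_dim-column projections."""
    cols = range(D.shape[1])
    return np.mean([log_det_corr(D[:, list(c)])
                    for c in combinations(cols, proj_dim)])

D = rng.random((20, 4))   # stand-in for a Latin hypercube design
print("EEC over 2-d projections:", eec(D))
```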

11.
A double sampling plan based on truncated life tests is proposed and designed under a general life distribution. The design parameters, namely the sample sizes and acceptance numbers for the first and second samples, are determined so as to minimize the average sample number subject to satisfying the consumer's and producer's risks at their respective specified quality levels. The resulting tables can be used regardless of the underlying distribution, as long as the reliability requirements are specified at the two risks. In addition, Gamma and Weibull distributions are considered in particular, to report the design parameters according to quality levels expressed as mean ratios.
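A sketch of the operating logic of such a plan: accept or reject on the first sample when the failure count is decisive, otherwise let the second sample settle it. The plan parameters and quality levels below are illustrative, not the paper's tabulated optima.

```python
from scipy.stats import binom

# Illustrative double sampling plan for truncated life tests.
n1, c1, r1 = 20, 1, 4     # first sample: accept if d1 <= c1, reject if d1 >= r1
n2, c2 = 20, 3            # second sample: accept if d1 + d2 <= c2

def accept_prob(p):
    """Probability of accepting the lot when each item fails the
    truncated life test with probability p."""
    pa = binom.cdf(c1, n1, p)                    # accept on first sample
    for d1 in range(c1 + 1, r1):                 # inconclusive: sample again
        pa += binom.pmf(d1, n1, p) * binom.cdf(c2 - d1, n2, p)
    return pa

# High acceptance at good quality (producer's side), low at poor
# quality (consumer's side).
print("P(accept | p=0.02):", accept_prob(0.02))
print("P(accept | p=0.15):", accept_prob(0.15))
```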

12.
A stratified study is often designed to adjust for several independent trials in modern medical research. We consider the problem of non-inferiority tests and sample size determination for a non-zero risk difference in stratified matched-pair studies, and develop likelihood ratio and Wald-type weighted statistics for testing a null hypothesis of non-zero risk difference in each stratum, based on (1) the sample-based method and (2) the constrained maximum likelihood estimation (CMLE) method. Sample size formulae for the proposed statistics are derived, and several choices of weights for the Wald-type weighted statistics are considered. We evaluate the performance of the proposed tests in terms of type I error rates and empirical powers via simulation studies. Empirical results show that (1) the likelihood ratio test and the Wald-type CMLE test based on harmonic means of the stratum-specific sample sizes (SSIZE weight; Cochran's test) behave satisfactorily, in the sense that their significance levels are much closer to the prespecified nominal level; (2) the likelihood ratio test is better than the score test of Nam [2006. Non-inferiority of new procedure to standard procedure in stratified matched-pair design. Biometrical J. 48, 966–977]; (3) the sample sizes obtained using the SSIZE weight are generally smaller than with the other weighted statistics; (4) Cochran's test statistic is generally much better than the other weighted statistics under the CMLE method. A real example from a clinical laboratory study is used to illustrate the proposed methodologies.
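A sketch of the sample-based Wald statistic for a single stratum, with a made-up 0.10 margin and discordant counts; the paper's procedures combine such stratum statistics with weights and replace this variance with its CMLE counterpart.

```python
import numpy as np
from scipy.stats import norm

def wald_noninferiority(b, c, n, delta0=0.10):
    """One-stratum, sample-based Wald test of
    H0: p_new - p_std <= -delta0  vs  H1: p_new - p_std > -delta0.
    b, c = discordant pair counts (new+/std-, new-/std+), n = pairs."""
    d_hat = (b - c) / n                            # risk-difference estimate
    var = (b + c - (b - c) ** 2 / n) / n ** 2      # sample-based variance
    z = (d_hat + delta0) / np.sqrt(var)
    return z, 1 - norm.cdf(z)                      # one-sided p-value

z, p = wald_noninferiority(b=15, c=12, n=200, delta0=0.10)
print(f"z = {z:.2f}, one-sided p = {p:.4f}")
```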

13.
In this paper, Yates's missing plot technique is used to derive the formula for the value to substitute for a missing plot in a general incomplete block design, where blocks are assumed to be independent normal. The use of penalized normal equations, via BLUPs, makes this task simpler.
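For reference, the classical fixed-block special case of Yates's substitution in a randomized block design (the paper's general incomplete-block, random-block setting replaces this closed form with penalized normal equations); the totals are made-up numbers.

```python
# Yates's substitution for one missing value in a randomized block
# design with t treatments and b blocks:
#   x = (t*T + b*B - G) / ((t - 1) * (b - 1)),
# where T, B, G are the observed totals for the missing value's
# treatment, its block, and the whole experiment.
t, b = 4, 5
T, B, G = 42.0, 31.0, 220.0
x = (t * T + b * B - G) / ((t - 1) * (b - 1))
print("substituted value:", x)   # (168 + 155 - 220) / 12 = 8.583...
```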

14.
The authors introduce the notion of split generalized wordlength pattern (GWP), i.e., treatment GWP and block GWP, for a blocked nonregular factorial design. They generalize the minimum aberration criterion to suit this type of design. Connections between factorial design theory and coding theory allow them to obtain combinatorial identities that govern the relationship between the split GWP of a blocked factorial design and that of its blocked consulting design. These identities hold for regular and nonregular designs. Furthermore, the authors establish general rules for identifying generalized minimum aberration (GMA) blocked designs through their blocked consulting designs. Finally, they tabulate and compare some GMA blocked designs from Hall's orthogonal array OA(16, 2^15, 2) of type III.

15.
Friedman's test is a widely used rank-based alternative to the analysis of variance (ANOVA) F-test for identifying treatment differences in a randomized complete block design. Many texts provide incomplete or misleading information about when Friedman's test may be appropriately applied. We discuss the assumptions needed for the test and common misconceptions. We show via simulation that when the variance or skew of the treatment distributions differ, application of Friedman's test to detect differences in treatment location can result in Type I error probabilities larger than the nominal α, and even when α is unaffected, the power of the test can be less than expected.
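A small simulation sketch of the variance-heterogeneity problem (the block count, scales, and replication count are arbitrary choices): the treatments share the same location, yet Friedman's test rejects more often than the nominal 5%.

```python
import numpy as np
from scipy.stats import friedmanchisquare

rng = np.random.default_rng(6)

# Equal treatment locations but UNEQUAL variances: the third treatment
# is far more variable, which distorts the within-block rankings.
n_blocks, reps, alpha = 20, 2000, 0.05
scales = np.array([1.0, 1.0, 5.0])

rejections = 0
for _ in range(reps):
    block_effect = rng.normal(size=(n_blocks, 1))   # common to all treatments
    y = block_effect + rng.normal(size=(n_blocks, 3)) * scales
    _, p = friedmanchisquare(y[:, 0], y[:, 1], y[:, 2])
    rejections += p < alpha

# Systematically above 0.05 despite identical treatment locations.
print("empirical type I error:", rejections / reps)
```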

16.
The classical unconditional exact p-value test can be used to compare two multinomial distributions with small samples. This general hypothesis requires parameter estimation under the null, which makes the test severely conservative. A similar property has been observed for Fisher's exact test, for which Barnard and Boschloo provided distinct adjustments that produce more powerful testing approaches. In this study, we develop a novel adjustment for the conservativeness of the unconditional multinomial exact p-value test that produces a nominal type I error rate and increased power in comparison with all alternative approaches. We used a large simulation study to empirically estimate the 5th percentiles of the distributions of the p-values of the exact test over a range of scenarios, and implemented a regression model to predict the values for two-sample multinomial settings. Our results show that the new test is uniformly more powerful than Fisher's, Barnard's, and Boschloo's tests, with gains in power as large as several hundred percent in certain scenarios. Lastly, we provide a real-life data example where the unadjusted unconditional exact test wrongly fails to reject the null hypothesis and the corrected unconditional exact test rejects it appropriately.
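For context, the three classical baselines compared in the paper, run on a made-up 2×2 table; SciPy ships only these binomial (2×2) versions, not the paper's multinomial adjustment.

```python
from scipy.stats import fisher_exact, barnard_exact, boschloo_exact

table = [[7, 12],
         [1, 9]]

_, p_fisher = fisher_exact(table)
p_barnard = barnard_exact(table).pvalue
p_boschloo = boschloo_exact(table).pvalue

# The unconditional tests (Barnard, Boschloo) are typically less
# conservative than Fisher's conditional test for small samples.
print(f"Fisher:   {p_fisher:.4f}")
print(f"Barnard:  {p_barnard:.4f}")
print(f"Boschloo: {p_boschloo:.4f}")
```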

17.
This article assesses the impact that the late W. Edwards Deming had on statistical practice in general and industrial statistics in particular. It provides a direct-from-the-trenches comparison of Deming's goals with their realizations and a commentary on the relevance and challenges of Deming's concepts to today's industrial environment.

18.
In this work, we study Ds-optimal design for Kozak's tree taper model. The approximate Ds-optimal designs are found to be invariant to tree size, which provides grounds for constructing a general replication-free Ds-optimal design. Even though the designs do not depend on the parameter p of Kozak's model, they are sensitive to the values of the s×1 subset parameter vector of the model. The 12-point replication-free design suggested in this study (with 91% efficiency) is believed to reduce the cost and time of data collection and, more importantly, to estimate the subset parameters of interest precisely.

19.
Hotelling's T² control chart, a direct analogue of the univariate Shewhart chart, is perhaps the most commonly used tool in industry for simultaneous monitoring of several quality characteristics. Recent studies have shown that variable sampling size (VSS) schemes yield charts with more statistical power for detecting small to moderate shifts in the process mean vector. In this paper, we build a cost model of a VSS T² control chart for economic and economic-statistical design using the general model of Lorenzen and Vance [The economic design of control charts: A unified approach, Technometrics 28 (1986), pp. 3–11]. We optimize this model using a genetic algorithm. We also study the effects of the costs and operating parameters on the VSS T² parameters, show through an example the advantage of economic design over statistical design for VSS T² charts, and measure the economic advantage of VSS sampling versus fixed-sample-size sampling.
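A minimal sketch of the T² statistic with known in-control parameters, plus an illustrative VSS-style switching rule; the limits, subgroup sizes, and shift are assumptions, not economically optimized values.

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(7)

p = 3                                  # number of quality characteristics
mu0, Sigma0_inv = np.zeros(p), np.eye(p)
ucl = chi2.ppf(0.995, df=p)            # control limit
warn = chi2.ppf(0.80, df=p)            # warning line for VSS switching
n_small, n_large = 3, 10               # the two subgroup sizes

n = n_small
for i in range(12):
    shift = 0.7 if i >= 6 else 0.0     # mean shift enters mid-stream
    X = rng.normal(shift, 1.0, size=(n, p))
    xbar = X.mean(axis=0)
    t2 = n * (xbar - mu0) @ Sigma0_inv @ (xbar - mu0)   # Hotelling T^2
    status = "SIGNAL" if t2 > ucl else ""
    print(f"sample {i:2d}  n={n:2d}  T2={t2:6.2f} {status}")
    n = n_large if t2 > warn else n_small   # VSS rule: big T2 -> big sample
```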

20.
Khatri (1982) justified, in terms of A-efficiency, the statement "Among the variance balanced designs, one should choose the design which has the minimum number of plots". Examples of connected block designs are presented here to disprove Khatri's statement. Some general results are also provided to support our claims.
