Similar Literature (20 results)
1.
Process capability indices have been widely used in the manufacturing industry, providing numerical measures of process performance. The index Cp measures process precision (or product consistency). The index Cpm, sometimes called the Taguchi index, reflects process centring ability and process loss. Most research work related to Cp and Cpm assumes no gauge measurement errors, an assumption that does not adequately reflect real situations even with highly advanced measuring instruments. Conclusions drawn from such process capability analyses are therefore unreliable and misleading. In this paper, we conduct a sensitivity investigation of the capability indices Cp and Cpm in the presence of gauge measurement errors. Because of the randomness of variations in the data, we consider capability testing for Cp and Cpm to obtain lower confidence bounds and critical values for the true process capability when gauge measurement errors are unavoidable. The results show that the estimator computed from sample data contaminated by measurement errors severely underestimates the true capability, resulting in a reduced test power. To capture the true process capability, adjusted confidence bounds and critical values are presented for practitioners to use in their factory applications.
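For reference, the two indices discussed above are commonly defined (for specification limits USL and LSL, target T, process mean \mu and standard deviation \sigma) as follows; the third expression is a standard way of writing how a gauge error term inflates the observed variation, stated here as a general assumption rather than the exact model of the paper:

C_p = \frac{USL - LSL}{6\sigma}, \qquad C_{pm} = \frac{USL - LSL}{6\sqrt{\sigma^{2} + (\mu - T)^{2}}}, \qquad \sigma_{\mathrm{obs}}^{2} = \sigma^{2} + \sigma_{\mathrm{gauge}}^{2}.

Since the observed variance exceeds the true process variance, indices estimated from contaminated data understate the true Cp and Cpm, which is the underestimation effect described above.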

2.
Process capability indices have been widely used to evaluate process performance for the continuous improvement of quality and productivity. The distribution of the estimator of the process capability index C pmk is very complicated, and an asymptotic distribution was proposed by Chen and Hsu [The asymptotic distribution of the process capability index C pmk , Comm. Statist. Theory Methods 24(5) (1995), pp. 1279–1291]. However, we found a critical error in this asymptotic distribution when the population mean is not equal to the midpoint of the specification limits. In this paper, a corrected version of the asymptotic distribution is given. An asymptotic confidence interval for C pmk based on the corrected asymptotic distribution is proposed, and its lower bound can be used to test whether the process is capable. A simulation study shows that the coverage probability of the proposed confidence interval is satisfactory. The relation between the six sigma technique and the index C pmk is also discussed, and an asymptotic testing procedure to determine whether a process is capable based on the index C pmk is given.
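The index in question is usually defined (with m = (USL + LSL)/2 denoting the midpoint relevant to the error noted above) as

C_{pmk} = \frac{\min(USL - \mu,\; \mu - LSL)}{3\sqrt{\sigma^{2} + (\mu - T)^{2}}},

so the case \mu \ne m is precisely the one in which the corrected asymptotic distribution matters.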

3.
The use of indices as an estimation tool for process capability is long established among statistical quality professionals, and numerous capability indices have been proposed in the last few years. Cpm is one of the most widely used capability indices, and its estimation has attracted much interest. In this paper, we propose a new method for constructing an approximate confidence interval for the index Cpm. The proposed method is based on the asymptotic distribution of the estimator of Cpm obtained by the Delta Method: under some regularity conditions, the distribution of an estimator of the process capability index Cpm is asymptotically normal.
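As a sketch of the Delta Method step assumed here (a general result rather than the paper's specific derivation): if \hat\theta_n = (\bar X, S^2) satisfies \sqrt{n}(\hat\theta_n - \theta) \xrightarrow{d} N(0, \Sigma) for \theta = (\mu, \sigma^2), and C_{pm} = g(\theta) is differentiable with nonzero gradient, then

\sqrt{n}\,\big(g(\hat\theta_n) - g(\theta)\big) \xrightarrow{d} N\big(0,\; \nabla g(\theta)^{\top} \Sigma\, \nabla g(\theta)\big),

which yields an approximate interval of the form \hat C_{pm} \pm z_{\alpha/2}\, \hat\sigma_{C_{pm}} / \sqrt{n}.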

4.
The process capability index C pm, sometimes called the loss-based index, has been proposed to the manufacturing industry for measuring process reproduction capability. This index incorporates the variation of production items with respect to the target value and the specification limits preset in the factory. To estimate the loss-based index properly and accurately, approaches from both frequentist and Bayesian perspectives have been proposed for obtaining lower confidence bounds (LCBs) on the minimum process capability. The LCBs not only provide critical information regarding process performance but are also used to determine whether an improvement has been made in a capability index and, by extension, in reducing the fraction of non-conforming items. In this paper, under the assumption of normality, several existing frequentist and Bayesian approaches for constructing LCBs of C pm are presented. Depending on the statistical methods used, we classify these existing approaches into three categories and compare them in terms of the coverage rates and the mean values of the LCBs via simulations. The relative advantages and disadvantages of these approaches are summarized, with highlights of the relevant findings.
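The two senses of a lower bound compared above can be written compactly (generic definitions, not tied to any single surveyed approach). A frequentist 100(1-\alpha)% LCB L satisfies

P\big(L(\mathbf{X}) \le C_{pm}\big) \ge 1 - \alpha \quad \text{for all parameter values},

whereas a Bayesian lower credible bound L_B is the \alpha-quantile of the posterior, i.e. P(C_{pm} \ge L_B \mid \text{data}) = 1 - \alpha; coverage rates in the simulations measure how often such bounds actually fall below the true index.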

5.
The main result of this paper is that under some regularity conditions, the distribution of an estimator of the process capability index Cpmk is asymptotically normal.
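The result can be written as

\sqrt{n}\,\big(\hat C_{pmk} - C_{pmk}\big) \xrightarrow{d} N\big(0,\; \sigma^{2}_{C_{pmk}}\big),

where \sigma^{2}_{C_{pmk}} denotes the (rather involved) asymptotic variance; the notation here is generic and introduced only for orientation.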

6.
Perakis and Xekalaki [A process capability index that is based on the proportion of conformance, Journal of Statistical Computation and Simulation 72(9) (2002), pp. 707–718] introduced a process capability index that is based on the proportion of conformance of the process under study and has several appealing features. One of its advantages is that it can be used not only for continuous processes, as is the case with the majority of the indices considered in the literature, but also for discrete processes. In this article, the use of this index is investigated for discrete data under two alternative models that are frequently considered in statistical process control. In particular, distributional properties and estimation of the index are considered for Poisson processes and for processes yielding attribute data. The performance of the suggested estimators and confidence limits is assessed via simulation.
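For the Poisson case mentioned above, the proportion of conformance p with respect to an upper specification limit c is simply a Poisson tail probability (stated here for context; the exact functional form of the index built on p is not reproduced):

p = P(X \le c) = \sum_{k=0}^{c} e^{-\lambda} \frac{\lambda^{k}}{k!},

estimated by plugging in \hat\lambda = \bar X; an analogous conformance proportion is computed under the attribute-data model.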

7.
In this paper, we present an optimal design methodology for the skip-lot sampling plan (SkSP) of type SkSP-R based on the most widely used process capability index (PCI), Cpk. The SkSP-R plan is one of the SkSPs that incorporate a provision for re-inspection. In designing the optimal parameters, we consider both symmetric and asymmetric fraction non-conforming cases. Tables are constructed to determine the optimal parameters by formulating an optimization problem. The advantages of the proposed plan over the existing plan are discussed, and the application of the plan is explained with a real-life example.
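The link between Cpk and the fraction non-conforming that such capability-index-based plans exploit can be summarised by the standard normal-theory bounds (general background, not the paper's specific derivation):

C_{pk} = \min\!\left(\frac{USL - \mu}{3\sigma},\; \frac{\mu - LSL}{3\sigma}\right), \qquad \Phi(-3C_{pk}) \;\le\; p \;\le\; 2\,\Phi(-3C_{pk}),

where p is the process fraction non-conforming and \Phi is the standard normal distribution function.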

8.
Franklin and Wasserman (1991) introduced the use of bootstrap sampling procedures for deriving nonparametric confidence intervals for the process capability index Cpk, which are applicable when at least twenty data points are available. This represents a significant reduction in the usually recommended sample requirement of 100 observations (see Gunther 1989). To facilitate and encourage the use of these procedures, a FORTRAN program is provided for the computation of confidence intervals for Cpk. Three methods are provided for this calculation: the standard method, the percentile confidence interval, and the bias-corrected percentile confidence interval.
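As an illustration of the percentile method in a modern language (a minimal Python sketch rather than the paper's FORTRAN program; the function names, the 90% level and the simulated data are illustrative assumptions):

import numpy as np

def cpk(x, lsl, usl):
    # Point estimate of Cpk from a sample x (normal-theory formula).
    mu, s = x.mean(), x.std(ddof=1)
    return min((usl - mu) / (3 * s), (mu - lsl) / (3 * s))

def cpk_percentile_ci(x, lsl, usl, n_boot=2000, alpha=0.10, seed=0):
    # Percentile bootstrap: resample with replacement, recompute Cpk,
    # and take the empirical alpha/2 and 1 - alpha/2 quantiles.
    rng = np.random.default_rng(seed)
    boot = np.array([cpk(rng.choice(x, size=x.size, replace=True), lsl, usl)
                     for _ in range(n_boot)])
    return np.quantile(boot, alpha / 2), np.quantile(boot, 1 - alpha / 2)

# Illustrative usage with simulated data:
x = np.random.default_rng(1).normal(loc=10.0, scale=0.5, size=30)
print(cpk_percentile_ci(x, lsl=8.5, usl=11.5))

The bias-corrected percentile variant differs only in that the quantile levels are adjusted using the proportion of bootstrap replicates falling below the point estimate.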

9.
Several sampling distribution properties of the estimator for Cpk are presented under the assumption that the data are normal, independent and identically distributed. In particular, the expectation, variance and skewness are derived. Since the sampling distribution is only weakly skewed, we conclude that a symmetric interval estimator for Cpk may be reasonable. We develop such a symmetric interval estimator and conduct a simulation study to explore its coverage probabilities.
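A symmetric interval of the kind described above takes the form \hat C_{pk} \pm z_{\alpha/2}\,\widehat{se}(\hat C_{pk}). One large-sample approximation to the standard error that is frequently quoted in the literature (attributed to Bissell; stated here as an assumption, not as the expression derived in this paper) is

\widehat{se}(\hat C_{pk}) \approx \sqrt{\frac{1}{9n} + \frac{\hat C_{pk}^{2}}{2(n-1)}}.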

10.
In this article, we propose a new mixed chain sampling plan based on the process capability index Cpk, where the quality characteristic of interest has double specification limits and follows the normal distribution with unknown mean and variance. In the proposed mixed plan, the chain sampling inspection plan is used for the inspection of attribute quality characteristics. The advantages of this proposed mixed sampling plan are discussed. Tables are constructed to determine the optimal parameters for practical applications by formulating the problem as a nonlinear programming problem in which the objective function to be minimized is the average sample number and the constraints relate to the lot acceptance probabilities at the acceptable quality level and the limiting quality level on the operating characteristic curve. The practical application of the proposed mixed sampling plan is explained with an illustrative example, and the proposed plan is compared with other existing sampling plans.
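The design problem sketched above can be written generically as (with Pa(p) the lot-acceptance probability at fraction non-conforming p, and \alpha, \beta the producer's and consumer's risks; the notation is introduced here for illustration):

\min_{\text{plan parameters}} \; \mathrm{ASN} \quad \text{subject to} \quad P_a(p_{\mathrm{AQL}}) \ge 1 - \alpha, \qquad P_a(p_{\mathrm{LQL}}) \le \beta.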

11.
With the advent of modern technology, manufacturing processes have become very sophisticated, and a single quality characteristic can no longer reflect a product's quality. In order to establish performance measures for evaluating the capability of a multivariate manufacturing process, several new multivariate capability (NMC) indices, such as NMCp and NMCpm, have been developed over the past few years. However, sample size determination for multivariate process capability indices has not been thoroughly considered in previous studies. Generally, the larger the sample size, the more accurate the estimation will be; however, too large a sample size may result in excessive costs. Hence, the trade-off between sample size and estimation precision is a critical issue. In this paper, the lower confidence limits of the NMCp and NMCpm indices are used to determine the appropriate sample size. Moreover, a procedure for conducting the multivariate process capability study is provided. Finally, two numerical examples are given to demonstrate that proper determination of the sample size for multivariate process indices can achieve a good balance between sampling costs and estimation precision.

12.
The process capability index Cpk has been the most popular index used in the manufacturing industry for measuring the reproduction capability of processes and for enhancing product development with a very low fraction of defectives. In the manufacturing industry, the lower confidence bound (LCB) estimates the minimum process capability, providing pivotal information for quality engineers to monitor the process and assess process performance for quality assurance. The main objective of this paper is to compare and contrast LCBs on Cpk obtained by two approaches: the classical method and the Bayesian method.
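A minimal sketch of the Bayesian side under a standard noninformative prior p(mu, sigma^2) proportional to 1/sigma^2 (an illustrative assumption, not necessarily the prior used in the paper): draw from the closed-form normal-data posterior of (mu, sigma^2), transform each draw to Cpk, and take a low posterior quantile as the LCB.

import numpy as np

def bayes_lcb_cpk(x, lsl, usl, alpha=0.05, n_draws=20000, seed=0):
    # Posterior under the noninformative prior p(mu, sigma^2) proportional to 1/sigma^2:
    #   sigma^2 | x  ~  (n - 1) s^2 / chi^2_{n-1}
    #   mu | sigma^2, x  ~  N(xbar, sigma^2 / n)
    rng = np.random.default_rng(seed)
    n, xbar, s2 = x.size, x.mean(), x.var(ddof=1)
    sigma2 = (n - 1) * s2 / rng.chisquare(n - 1, size=n_draws)
    mu = rng.normal(xbar, np.sqrt(sigma2 / n))
    sigma = np.sqrt(sigma2)
    cpk_draws = np.minimum((usl - mu) / (3 * sigma), (mu - lsl) / (3 * sigma))
    # The alpha-quantile of the posterior draws is a 100(1 - alpha)% lower credible bound.
    return np.quantile(cpk_draws, alpha)

A classical LCB, by contrast, is built from the sampling distribution of the estimator of Cpk, so the two bounds generally differ for small samples.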

13.
We discuss Bayesian analyses of traditional normal-mixture models for classification and discrimination. The development involves application of an iterative resampling approach to Monte Carlo inference, commonly called Gibbs sampling, and demonstrates its routine application. We stress the benefits of exact analyses over traditional classification and discrimination techniques, including the ease with which such analyses may be performed in a quite general setting, with possibly several normal-mixture components having different covariance matrices, the computation of exact posterior classification probabilities for observed data and for future cases to be classified, and posterior distributions for these probabilities that allow for assessment of second-level uncertainties in classification.
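The central quantity in such analyses is the posterior probability that observation y_i belongs to mixture component k (a standard normal-mixture expression, given here for context):

P(z_i = k \mid y_i, \theta) = \frac{\pi_k \, N(y_i \mid \mu_k, \Sigma_k)}{\sum_{j} \pi_j \, N(y_i \mid \mu_j, \Sigma_j)},

and Gibbs sampling averages this quantity over posterior draws of (\pi, \mu, \Sigma) to obtain the exact posterior classification probabilities referred to above.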

14.
Most of the research effort concerning the development and statistical study of capability indices has been devoted to normal processes. In this paper, a statistical study of a capability index for non-normal processes proposed by Clements (1989) is developed. An approximate distribution for the natural estimator of the index is obtained from a distribution-free point of view, and a simulation study is used to compare it with its empirical distribution. An approximate conservative lower confidence limit for the index is also constructed.
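Clements' approach, as usually described (stated here as background rather than quoting this paper), replaces the 6\sigma spread by the distance between extreme percentiles of the fitted non-normal distribution:

C_p^{(\mathrm{Clements})} = \frac{USL - LSL}{\xi_{0.99865} - \xi_{0.00135}},

where \xi_q denotes the q-quantile of the process distribution, originally estimated via Pearson curves.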

15.
In this paper, order statistics from independent and non-identically distributed random variables are used to obtain an ordered ranked set sample (ORSS). Bayesian inference for the unknown parameters of the Pareto distribution under a squared-error loss function is developed. We compute the minimum posterior expected loss (the posterior risk) of the derived estimates and compare them with those based on the corresponding simple random sample (SRS) to assess the efficiency of the obtained estimates. Two-sample Bayesian prediction of future observations is introduced using SRS and ORSS for one and m cycles. A simulation study and a real data set are used to illustrate the proposed results.
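Two standard facts underlie the comparison described above: under squared-error loss the Bayes estimator is the posterior mean, and its posterior risk is the posterior variance,

\hat\theta_{SE} = E(\theta \mid \text{data}), \qquad \rho(\hat\theta_{SE}) = E\big[(\theta - \hat\theta_{SE})^{2} \mid \text{data}\big] = \operatorname{Var}(\theta \mid \text{data}),

so comparing posterior risks under ORSS and SRS amounts to comparing posterior variances.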

16.
Under a natural conjugate prior with four hyperparameters, the importance sampling (IS) technique is applied to the Bayesian analysis of the power law process (PLP). Samples of the parameters of the PLP are obtained from IS. Based on these samples, not only is the posterior analysis of the parameters and of some parameter functions in the PLP performed conveniently, but single-sample and two-sample prediction procedures are also constructed easily. Furthermore, the sensitivity of the posterior mean of these parameter functions with respect to the hyperparameters of the natural conjugate prior is studied, which can directly guide the selection of the hyperparameters. Coupling this sensitivity with the relations between the prior moments and the hyperparameters of the natural conjugate prior, it is possible to give guidance, to a certain degree, on the selection of the prior moments. After numerical experiments illustrate the rationality and feasibility of the proposed methods, an engineering example demonstrates their application.
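For orientation, the power law process has intensity function \lambda(t) = (\beta/\theta)(t/\theta)^{\beta - 1}, and the self-normalised importance-sampling estimate of a posterior expectation used in such analyses has the generic form (the parameterisation and proposal q are assumptions for illustration):

E\big[g(\beta, \theta) \mid \text{data}\big] \approx \frac{\sum_{i=1}^{N} w_i \, g(\beta_i, \theta_i)}{\sum_{i=1}^{N} w_i}, \qquad w_i = \frac{L(\beta_i, \theta_i)\,\pi(\beta_i, \theta_i)}{q(\beta_i, \theta_i)},

where (\beta_i, \theta_i) are draws from the proposal q, L is the likelihood and \pi the natural conjugate prior.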

17.
In this paper an attempt has been made to examine multivariate versions of the common process capability indices (PCIs) Cp and Cpk. Markov chain Monte Carlo (MCMC) methods are used to generate sampling distributions for the various PCIs, from which inference is performed. Some Bayesian model checking techniques are developed and implemented to examine how well our model fits the data. Finally, the methods are exemplified on a historical aircraft data set collected by the Pratt and Whitney Company.
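A typical Bayesian model-checking device of the kind alluded to is the posterior predictive p-value (stated generically; the paper's specific diagnostics may differ):

p_B = P\big(T(y^{\mathrm{rep}}, \theta) \ge T(y, \theta) \,\big|\, y\big),

computed by drawing \theta from the MCMC output, simulating replicated data y^{rep}, and comparing a discrepancy measure T on replicated versus observed data.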

18.
We discuss a method for combining different but related longitudinal studies to improve predictive precision. The motivation is to borrow strength across clinical studies in which the same measurements are collected at different frequencies. Key features of the data are heterogeneous populations and an unbalanced design across the three studies of interest. The first two studies are phase I studies with very detailed observations on a relatively small number of patients. The third study is a large phase III study with over 1500 enrolled patients, but with relatively few measurements on each patient. Patients receive different doses of several drugs in the studies, with the phase III study containing significantly less toxic treatments. Thus, the main challenges for the analysis are to accommodate heterogeneous population distributions and to formalize borrowing strength across the studies and across the various treatment levels. We describe a hierarchical extension of suitable semiparametric longitudinal data models to achieve the inferential goal. A nonparametric random-effects model accommodates the heterogeneity of the population of patients. A hierarchical extension allows borrowing strength across different studies and different levels of treatment by introducing dependence across these nonparametric random-effects distributions. Dependence is introduced by building an analysis-of-variance (ANOVA)-like structure over the random-effects distributions for different study and treatment combinations. Model structure and parameter interpretation are similar to standard ANOVA models. Instead of the unknown normal means of standard ANOVA models, however, the basic objects of inference are random distributions, namely the unknown population distributions under each study. The analysis is based on a mixture of Dirichlet processes model as the underlying semiparametric model.
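The modelling idea can be summarised schematically (a condensed sketch under assumed notation, not the paper's full specification): each study-and-treatment combination (s, t) carries its own random-effects distribution G_{st}, and dependence is introduced by giving the atoms of these distributions an ANOVA-type decomposition while the weights are shared,

\theta_i \mid G_{s_i t_i} \sim G_{s_i t_i}, \qquad G_{st}(\cdot) = \sum_{h} w_h \, \delta_{\,\mu_h + a_{s h} + b_{t h}}(\cdot),

with the weights and locations generated from a Dirichlet process style prior, so that marginally each G_{st} behaves like a (mixture of) Dirichlet process random distribution.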

19.
In this paper we consider the problems of estimation and prediction when observed data from a lognormal distribution are based on lower record values and on lower record values with inter-record times. We compute maximum likelihood estimates and asymptotic confidence intervals for the model parameters. We also obtain Bayes estimates and the highest posterior density (HPD) intervals using noninformative and informative priors under squared-error and LINEX loss functions. Furthermore, for the problem of Bayesian prediction in the one-sample and two-sample frameworks, we obtain predictive estimates and the associated predictive equal-tail and HPD intervals. Finally, for illustration purposes, a real data set is analyzed and a simulation study is conducted to compare the methods of estimation and prediction.
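For reference, the LINEX loss and the corresponding Bayes estimator used alongside squared-error loss are

L(\Delta) = e^{a\Delta} - a\Delta - 1, \quad \Delta = \hat\theta - \theta, \qquad \hat\theta_{\mathrm{LINEX}} = -\frac{1}{a}\,\log E\big(e^{-a\theta} \mid \text{data}\big), \quad a \ne 0,

so positive and negative estimation errors are penalised asymmetrically, with the sign and magnitude of a controlling the asymmetry.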

20.
We address the problem of the curtailment or continuation of an experiment or trial at some interim point where, say, N observations are in hand and at least S > N observations had originally been scheduled for a decision. A Bayesian predictive approach is used to determine the probability that, if one continued the trial with a further sample of size M where N + M ≤ S, one would come to a particular decision regarding a parameter or a future observable. This point of view can also be applied to significance tests if one is willing to admit the calculation as a subjective assessment.
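The predictive quantity at the heart of this approach can be written generically as

PP = P\big(\text{decision criterion met at completion} \mid y_{1:N}\big) = \int P\big(\text{criterion} \mid y_{1:N}, y_{\mathrm{new}}\big)\; p\big(y_{\mathrm{new}} \mid y_{1:N}\big)\, dy_{\mathrm{new}},

where y_new denotes the further M observations; curtailment is indicated when PP falls below (or exceeds) a pre-specified threshold.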
