Similar documents (20 results found)
1.
Chen (1999) proposed an economic design, using Taguchi's quality loss function, for choosing a producer's lower specification limit η for a product whose quality characteristic has an exponential distribution with mean θ and a 'larger the better' tolerance. Chen (1999) developed an approximate solution that is applicable when 0.5 ≤ m/θ ≤ 0.7 and that requires numerical minimization. We derive a simple, exact solution that is applicable for all values of m/θ and does not require numerical minimization.
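As a rough numerical illustration of this kind of economic design (not Chen's exact model), the sketch below assumes a hypothetical rejection cost Cr for items below the limit and Taguchi's larger-the-better loss k/Y² for accepted items, and minimizes the expected cost per item over the lower specification limit by numerical search.

```python
import numpy as np
from scipy import integrate, optimize

# Hypothetical cost structure (not Chen's exact model): an item with quality
# characteristic Y ~ Exponential(mean=theta) is rejected if Y < eta at a fixed
# rejection cost Cr; an accepted item incurs Taguchi's larger-the-better loss k/Y^2.
theta, Cr, k = 100.0, 5.0, 2000.0

def expected_cost(eta):
    f = lambda y: np.exp(-y / theta) / theta          # exponential density
    reject_prob = 1.0 - np.exp(-eta / theta)          # P(Y < eta)
    loss, _ = integrate.quad(lambda y: (k / y**2) * f(y), eta, np.inf)
    return Cr * reject_prob + loss

res = optimize.minimize_scalar(expected_cost, bounds=(1e-3, 5 * theta), method="bounded")
print(f"optimal lower specification limit eta ≈ {res.x:.2f}, expected cost ≈ {res.fun:.4f}")
```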

2.
In this paper, an optimization model is developed for the economic design of a rectifying inspection sampling plan in the presence of two markets. A product with a normally distributed quality characteristic with unknown mean and variance is produced in the process. The quality characteristic has a lower specification limit. The aim of this paper is to maximize the profit, which incorporates the Taguchi loss function, under the constraints of simultaneously satisfying the producer's and consumer's risks in two different markets. A giveaway cost per unit of sold excess material is considered in the proposed model. A case study is presented to illustrate the application of the proposed methodology. In addition, sensitivity analysis is performed to study the effect of the model parameters on the expected profit and the optimal solution. The optimal process adjustment problem and the acceptance sampling plan are combined in the economic optimization model. The process mean and standard deviation are assumed to be unknown, and their impact is analyzed. Finally, inspection error is considered, and its impact is investigated and analyzed.
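As a rough single-market illustration of this kind of profit model (the cost figures, the simplified profit function and the conditioning approximation are all hypothetical and are not the paper's model), the sketch below maximizes an expected profit per unit for a normally distributed characteristic with a lower specification limit, combining a quadratic (Taguchi) loss and a giveaway cost.

```python
import numpy as np
from scipy import stats, optimize

# Illustrative sketch only, with hypothetical cost figures (not the paper's model):
# a normally distributed characteristic X ~ N(mu, sigma^2) with a lower specification
# limit LSL; conforming units earn a selling price minus a quadratic (Taguchi) quality
# loss and a giveaway cost proportional to the material above target; nonconforming
# units are scrapped at a fixed loss.
LSL, sigma, target = 10.0, 0.5, 10.5
price, scrap_cost, loss_coef, giveaway_rate = 20.0, 4.0, 3.0, 1.5

def expected_profit(mu):
    p_conform = 1.0 - stats.norm.cdf(LSL, loc=mu, scale=sigma)
    # E[(X - target)^2 | conforming] approximated by the unconditional moment for brevity
    quad_loss = loss_coef * (sigma**2 + (mu - target) ** 2)
    giveaway = giveaway_rate * max(mu - target, 0.0)
    return p_conform * (price - quad_loss - giveaway) - (1 - p_conform) * scrap_cost

res = optimize.minimize_scalar(lambda m: -expected_profit(m), bounds=(LSL, LSL + 4), method="bounded")
print(f"profit-maximising process mean ≈ {res.x:.3f}")
```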

3.
Process capability indices (PCIs) are extensively used in the manufacturing industries to verify whether manufactured products meet their specifications. PCIs can be used to judge process precision, process accuracy, and process performance. Developing sampling plans based on PCIs is therefore natural, and such plans are very useful for maintaining and improving product quality in the manufacturing industries. In view of this, we propose a variables sampling system based on the process capability index Cpmk, which takes into account both process yield and process loss, for the case where the quality characteristic under study has double specification limits. The proposed sampling system will be effective in compliance testing. The advantages of this system over existing sampling plans are also discussed. In order to determine the optimal parameters, tables are constructed by formulating the problem as a nonlinear programming problem in which the average sample number is minimized subject to the producer's and consumer's risks.
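For reference, Cpmk combines the Cpk and Cpm ideas. The snippet below computes a plug-in point estimate from sample data; the lot-sentencing rules in the paper rest on the sampling distribution of such estimates, which this sketch does not address, and the sample values are hypothetical.

```python
import numpy as np

def cpmk(x, lsl, usl, target):
    """Point estimate of Cpmk = min(USL - mu, mu - LSL) / (3 * sqrt(sigma^2 + (mu - T)^2)),
    computed from sample data.  The sampling-plan tables in the paper require the full
    distributional treatment, not just this plug-in estimate."""
    mu, sigma = np.mean(x), np.std(x, ddof=1)
    tau = np.sqrt(sigma**2 + (mu - target) ** 2)   # penalises departure from the target
    return min(usl - mu, mu - lsl) / (3.0 * tau)

rng = np.random.default_rng(0)
sample = rng.normal(loc=10.05, scale=0.12, size=50)   # hypothetical measurements
print(f"estimated Cpmk = {cpmk(sample, lsl=9.7, usl=10.3, target=10.0):.3f}")
```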

4.
This paper proposes a variables quick switching system in which the quality characteristic of interest follows a normal distribution and is evaluated through a process loss function. Most of the variables sampling plans available in the literature focus only on the fraction non-conforming, and those plans do not distinguish among the products that fall within the specification limits. Products that fall within the specification limits may still not be good if their mean is too far from the target value. Developing a sampling plan that considers process loss is therefore necessary in these situations. Based on this idea, we develop a variables quick switching system based on the process loss function for application to processes requiring low process loss. Tables are also constructed for the selection of the parameters of the variables quick switching system for a given acceptable quality level and limiting quality level. The results are explained with examples.

5.
Specification limit under a quality loss function
The purpose of this paper is to present the problem of selecting a lower specification limit under Taguchi's quality loss function. Considering that the product quality characteristic obeys an exponential distribution, we propose a modification of the method of Kapur and Wang for the economic design of the specification limit.

6.
Products that do not meet the specification criteria of an intended buyer represent a challenge to the producer in maximizing profits. To understand the value of the optimal process target (OPT) set at a profit-maximizing level, Shao et al. (1999) developed a model involving multiple markets and finished products whose holding costs are independent of their quality. Cases considered previously have treated the holding cost as a fixed amount or as a normal random variable independent of the quality characteristic (QC) of the product. This study considers more general cases in which the holding cost (HC) can be a truncated normal random variable that depends on the QC of the product.

7.
Pulak and Al-Sultan presented a rectifying inspection plan applied to the determination of the optimum process mean. However, they did not point out whether the non-conforming items in the sample of an accepted lot are replaced or eliminated from the lot, and they neglected the quality loss within the specification limits. In this paper, we propose a modified Pulak and Al-Sultan model with a quadratic quality loss function. Four cases are considered in the modified model: (1) the non-conforming items in the sample of an accepted lot are neither replaced nor eliminated from the lot; (2) the non-conforming items are not replaced but are eliminated from the lot; (3) the non-conforming items are replaced by conforming ones; (4) the non-conforming items are replaced by non-inspected items. The numerical results and the sensitivity analysis of the parameters show that the solutions of the four cases differ only slightly.

8.
Hotelling's T² test is known to be optimal under multivariate normality and is reasonably validity-robust when the assumption fails. However, some recently introduced robust test procedures have superior power properties and reasonable type I error control with non-normal populations. These, including the tests due to Tiku & Singh (1982), Tiku & Balakrishnan (1988) and Mudholkar & Srivastava (1999b, c), are asymptotically valid but are useful with moderate-size samples only if the population dimension is small. A class of B-optimal modifications of the stepwise alternatives to Hotelling's T² introduced by Mudholkar & Subbaiah (1980) is simple to implement and essentially equivalent to the T² test even with small samples. In this paper we construct and study robust versions of these modified stepwise tests using trimmed means instead of sample means. We use the robust one- and two-sample trimmed-t procedures as in Mudholkar et al. (1991) and propose statistics based on combining them. The results of an extensive Monte Carlo experiment show that the robust alternatives provide excellent type I error control and a substantial gain in power.
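To make the trimming idea concrete, here is a minimal sketch of a one-sample Hotelling-type statistic that replaces coordinate-wise sample means with trimmed means; it uses the ordinary covariance matrix and omits the trimmed-t building blocks, combination rules and calibration that the cited procedures actually require, and the data are simulated for illustration.

```python
import numpy as np
from scipy import stats

def trimmed_t2(X, mu0, trim=0.1):
    """Hotelling-type one-sample statistic with coordinate-wise trimmed means.
    Only an illustrative sketch, not the statistic studied in the cited papers."""
    n, p = X.shape
    xbar_t = np.array([stats.trim_mean(X[:, j], trim) for j in range(p)])
    S = np.cov(X, rowvar=False)                      # ordinary covariance for simplicity
    d = xbar_t - mu0
    return n * d @ np.linalg.solve(S, d)

rng = np.random.default_rng(1)
X = rng.standard_t(df=3, size=(30, 4))               # heavy-tailed sample
print(f"trimmed Hotelling-type statistic = {trimmed_t2(X, mu0=np.zeros(4)):.2f}")
```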

9.
A common task in quality control is to determine a control limit for a product at the time of release that incorporates its risk of degradation over time. Such a limit for a given quality measurement will be based on empirical stability data, the intended shelf life of the product and the stability specification. The task is particularly important when the registered specifications for release and stability are equal. We discuss two relevant formulations and their implementations in both a frequentist and a Bayesian framework. The first ensures that the risk of a batch failing the specification is comparable at release and at the end of shelf life. The second is to screen out, at release time, batches that are at high risk of failing the stability specification at the end of their shelf life. Although the second formulation seems more natural from a quality assurance perspective, it usually renders a control limit that is too stringent. In this paper we provide theoretical insight into this phenomenon, and introduce a heat-map visualisation that may help practitioners to assess the feasibility of implementing a limit under the second formulation. We also suggest a solution for the infeasible case. In addition, the current industrial benchmark is reviewed and contrasted with the two formulations. Computational algorithms for both formulations are laid out in detail and illustrated on a dataset.
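A very simplified frequentist sketch of the screening idea is shown below: with an assumed linear mean degradation rate and an assumed prediction standard deviation (both hypothetical figures, not taken from the paper), one backs off from the lower specification by the expected loss over shelf life plus a normal-quantile margin.

```python
import numpy as np
from scipy import stats

# Illustrative sketch (not the paper's exact formulation): a quality attribute degrades
# roughly linearly, the registered lower specification is LSL, and we want a release
# limit such that a batch released exactly at that limit still has a high probability
# of meeting LSL at the end of shelf life.
LSL = 95.0            # lower specification limit (e.g. % label claim)
slope = -0.30         # assumed mean degradation per month (hypothetical)
shelf_life = 24.0     # months
sd_pred = 0.8         # assumed SD of the end-of-shelf-life prediction (hypothetical)
conf = 0.95

release_limit = LSL - slope * shelf_life + stats.norm.ppf(conf) * sd_pred
print(f"release limit ≈ {release_limit:.2f}")   # batches below this would be screened out
```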

10.
Taguchi (1984, 1987) derived tolerances for subsystems, subcomponents, parts and materials. However, he assumed that the relationship between a higher-rank and a lower-rank quality characteristic is deterministic. The basic structure of this tolerance design problem is very similar to that of the screening problem. Tang (1987) proposed three cost models and derived an economic design for the screening problem with a "the-bigger-the-better" quality characteristic, in which the optimal specification limit (or tolerance) for a screening variable (or a lower-rank quality characteristic) was obtained by minimizing the expected total cost function. Tang considered that a quality cost is incurred only when the quality characteristic is out of specification, while Taguchi considered that a quality cost is incurred whenever the quality characteristic deviates from its nominal value. In this paper, a probabilistic relationship, namely a bivariate normal distribution between the above two quality characteristics, as in a screening problem, is considered together with Taguchi's quadratic loss function to develop a closed-form solution of the tolerance design for a subsystem.
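The trade-off can be illustrated numerically. The Monte Carlo sketch below uses an assumed bivariate normal pair (screening variable X, higher-rank characteristic Y), an assumed scrap cost and loss coefficient, and searches for the screening tolerance that minimizes scrap cost plus expected quadratic loss on accepted items; it is not the paper's closed-form solution.

```python
import numpy as np
from scipy import optimize

# Monte Carlo sketch with hypothetical parameters: the higher-rank characteristic Y and
# the lower-rank (screening) characteristic X are bivariate normal; items with
# |X| > t are scrapped at cost Cs, accepted items incur the Taguchi loss k * Y^2
# (both characteristics are centred at their targets here).
rng = np.random.default_rng(2)
rho, k, Cs = 0.8, 5.0, 2.0
cov = [[1.0, rho], [rho, 1.0]]
X, Y = rng.multivariate_normal([0.0, 0.0], cov, size=200_000).T

def expected_cost(t):
    accepted = np.abs(X) <= t
    return Cs * np.mean(~accepted) + np.mean(k * Y[accepted] ** 2) * np.mean(accepted)

res = optimize.minimize_scalar(expected_cost, bounds=(0.1, 4.0), method="bounded")
print(f"near-optimal tolerance on the screening variable ≈ {res.x:.2f}")
```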

11.
This paper explores the problem of minimizing the average total inspection (ATI) of Read & Beattie's variable lot-size sampling plan for continuous production. A solution procedure is developed to find the optimal parameters (n, c) that meet the average outgoing quality limit (AOQL) requirement while also minimizing the ATI for this variable lot-size plan.
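For orientation, the sketch below searches for (n, c) using the standard fixed-lot rectifying-inspection formulas for Pa, AOQ and ATI with illustrative numbers; the variable lot-size plan studied in the paper requires its own, more involved expressions.

```python
from scipy import stats

# Sketch using the standard fixed-lot rectifying-inspection formulas (the paper treats
# a variable lot-size plan, which is more involved): for lot size N and incoming
# fraction nonconforming p, Pa = P(accept), AOQ = p*Pa*(N-n)/N, ATI = n + (1-Pa)*(N-n).
N, AOQL_target = 1000, 0.02

def plan_metrics(n, c, p):
    pa = stats.binom.cdf(c, n, p)
    aoq = p * pa * (N - n) / N
    ati = n + (1 - pa) * (N - n)
    return pa, aoq, ati

best = None
for n in range(20, 301, 5):
    for c in range(0, 6):
        # AOQL = worst-case AOQ over a grid of incoming quality levels p
        aoql = max(plan_metrics(n, c, p / 1000)[1] for p in range(1, 101))
        if aoql <= AOQL_target:
            ati = plan_metrics(n, c, 0.01)[2]       # ATI at an assumed process average p = 0.01
            if best is None or ati < best[0]:
                best = (ati, n, c)
print(f"(n, c) = ({best[1]}, {best[2]}) with ATI ≈ {best[0]:.1f}")
```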

12.
In this paper, Duncan's cost model combined with Taguchi's quadratic loss function is applied to develop the economic-statistical design of the sum of squares exponentially weighted moving average (SS-EWMA) chart. A genetic algorithm is applied to search for the optimal decision variables of the SS-EWMA chart such that the expected cost is minimized. Sensitivity analysis reveals that the optimal sample size and sampling interval decrease, while the optimal smoothing constant and control limit increase, as the mean and/or variance increases. Moreover, the combination of optimal parameter levels in the orthogonal-array experiment provides an important guideline for monitoring the process mean and/or variance.
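As background on the chart statistic itself (the economic-statistical design is a separate optimization), one plausible formulation applies the EWMA recursion to each sample's mean squared deviation from the target, which reacts to shifts in both mean and variance; the exact statistic, limits and smoothing value used in the paper may differ.

```python
import numpy as np

# Plausible SS-EWMA-style recursion with illustrative parameters (not necessarily the
# paper's exact statistic): smooth the mean squared deviation from the target, which
# responds to shifts in either the process mean or the process variance.
rng = np.random.default_rng(3)
mu0, sigma0, lam, n = 0.0, 1.0, 0.2, 5

z = sigma0**2                                # start the EWMA at its in-control expected value
for t in range(30):
    shift = 1.0 if t >= 15 else 0.0          # mean shift introduced at t = 15
    sample = rng.normal(mu0 + shift, sigma0, size=n)
    ss = np.mean((sample - mu0) ** 2)        # squared deviations from target
    z = lam * ss + (1 - lam) * z             # EWMA recursion
    print(f"t={t:2d}  SS-EWMA statistic = {z:.3f}")
```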

13.
We discuss the functional central limit theorem (FCLT) for the empirical process of a moving-average stationary sequence with long memory. The cases of one-sided and double-sided moving averages are discussed. In the case of a one-sided (causal) moving average, the FCLT is obtained under weak conditions of smoothness of the distribution and the existence of a (2+δ)-moment of the i.i.d. innovations, by using the martingale-difference decomposition due to Ho and Hsing (1996, Ann. Statist. 24, 992–1014). In the case of a double-sided moving average, the proof of the FCLT is based on an asymptotic expansion of the bivariate probability density.
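Schematically, and only as a hedged reminder of the objects involved (the normalisations are indicative and not the paper's precise statement), the setting is:

```latex
% Schematic set-up (normalisations indicative only; not the paper's precise statement).
% Causal long-memory moving average:
\[
  X_t \;=\; \sum_{j\ge 0} a_j\,\varepsilon_{t-j},
  \qquad a_j \sim c\,j^{-\beta},\quad \tfrac12<\beta<1,\quad \{\varepsilon_j\}\ \text{i.i.d.}
\]
% Empirical process of X_1,\dots,X_n:
\[
  S_n(x) \;=\; \sum_{t=1}^{n}\bigl(\mathbf{1}\{X_t\le x\}-F(x)\bigr).
\]
% Under long memory, after norming by a sequence d_n growing faster than \sqrt{n},
% the FCLT gives a degenerate limit of the form
\[
  \frac{S_n(\cdot)}{d_n} \;\Longrightarrow\; -f(\cdot)\,Z,
\]
% with f the marginal density and Z a single normal random variable.
```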

14.

Amin et al. (1999) developed an exponentially weighted moving average (EWMA) control chart based on the smallest and largest observations in each sample. The resulting plot of the extremes suggests that the MaxMin EWMA may also be viewed as smoothed tolerance limits. Tolerance limits are limits that include a specific proportion of the population at a given confidence level. In the context of process control, they are used to ensure that production does not fall outside specifications. Amin and Li (2000) provided the coverages of the MaxMin EWMA tolerance limits for independent data. In this article, it is shown how autocorrelation affects the confidence level of MaxMin tolerance limits for a specified level of coverage of the population, and modified smoothed tolerance limits are suggested for autocorrelated processes.
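A minimal sketch of the MaxMin smoothing idea is given below, with an illustrative smoothing constant and independent normal data; the coverage calibration and the autocorrelation adjustments discussed in the papers are not included.

```python
import numpy as np

# Sketch of the MaxMin idea (smoothing constant and start values are illustrative, not
# the papers' tabulated design values): apply the EWMA recursion separately to the
# largest and smallest observation of each subgroup, giving smoothed upper and lower extremes.
rng = np.random.default_rng(4)
lam, n = 0.25, 5
ewma_max, ewma_min = 0.0, 0.0                  # start both at the target value

for t in range(20):
    sample = rng.normal(0.0, 1.0, size=n)
    ewma_max = lam * sample.max() + (1 - lam) * ewma_max
    ewma_min = lam * sample.min() + (1 - lam) * ewma_min
    print(f"t={t:2d}  smoothed extremes: [{ewma_min:+.3f}, {ewma_max:+.3f}]")
```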

15.
Recent research has shown that control charts with adaptive features are quicker than traditional static Shewhart charts at detecting process shifts. This article presents the design and implementation of a control chart based on an Adjusted Loss Function (AL) with Variable Sample Sizes and Sampling Intervals (VSSI). This single chart (called the VSSI AL chart) is able to monitor shifts in the process mean and variance simultaneously. Our studies show that the VSSI AL chart is not only easier to design and implement than the VSSI X̄ & S (or X̄ & R) charts, but is also 10% more effective than the latter in detecting process shifts from an overall viewpoint.
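The adaptive mechanism can be sketched as follows: plot a loss-based statistic (here the sample mean squared deviation from the target, which responds to both mean and variance shifts) and switch to a larger sample size and a shorter sampling interval whenever the statistic falls in a warning region. All limits, sizes and intervals below are illustrative, not the paper's optimized values.

```python
import numpy as np

# Illustrative VSSI mechanism (limits, sizes and intervals are hypothetical):
rng = np.random.default_rng(5)
target = 0.0
n_small, n_large = 3, 8              # variable sample sizes
h_long, h_short = 2.0, 0.5           # variable sampling intervals (hours)
warn, ucl = 1.8, 3.0                 # warning and control limits for the loss statistic

n, h, t = n_small, h_long, 0.0
for step in range(15):
    sample = rng.normal(target, 1.0, size=n)
    loss_stat = np.mean((sample - target) ** 2)      # loss-based statistic
    status = "OOC" if loss_stat > ucl else ("warning" if loss_stat > warn else "ok")
    print(f"t={t:5.1f}h  n={n}  AL statistic={loss_stat:.2f}  -> {status}")
    # warning region triggers a larger sample sooner; otherwise relax
    n, h = (n_large, h_short) if warn < loss_stat <= ucl else (n_small, h_long)
    t += h
```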

16.
The article addresses a real-life problem of determining the optimum sampling interval for the control of defective items in a hot rolling mill. Since the pattern of appearance of mill defects indicates a geometric process failure mechanism, an economic model is developed in line with the method suggested by Taguchi and critically examined by Nayebpour & Woodall. An expression for the expected loss per product as a function of the sampling interval is derived, and the optimum interval is obtained by minimizing this loss function. The practical issues involved in this exercise, such as the estimation of the various cost components, are also discussed, and the effect of erroneous estimation of the cost components is studied through a sensitivity analysis.
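A back-of-the-envelope version of this kind of loss-per-product minimization is sketched below with hypothetical cost figures; it uses the simple approximation that, under a geometric failure mechanism, about (n + 1)/2 defective products escape per failure when every n-th product is inspected, which is in the spirit of Taguchi's formula but not the paper's derived expression.

```python
import numpy as np

# Hypothetical cost figures (for illustration only):
B = 40.0      # cost per inspection
C = 300.0     # cost of restoring the process after a detected failure
A = 25.0      # loss per defective product shipped
u = 1500.0    # mean number of products between process failures

def loss_per_product(n):
    # inspection cost per product + (escaped-defective loss + restoration cost) per failure
    return B / n + (A * (n + 1) / 2 + C) / u

ns = np.arange(1, 501)
best = ns[np.argmin([loss_per_product(n) for n in ns])]
print(f"optimum sampling interval ≈ every {best} products "
      f"(loss per product ≈ {loss_per_product(best):.4f})")
```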

17.
We deal with double sampling plans by variables for a one-sided specification limit when the quality characteristic is normally distributed with unknown standard deviation. An algorithm is presented that allows one to calculate the OC of the sampling plans proposed by Bowker and Goode (1952). We give several examples. Furthermore, it is shown that the algorithm carries over to calculating the OC of the double-stage t-test. The authors wish to thank Yvonne Köllner and Timor Saffary for technical support.

18.
For the implementation of an acceptance sampling plan, a problem that quality practitioners have to deal with is the determination of the critical acceptance values and inspection sample sizes that provide the desired levels of protection to both vendors and buyers. Traditionally, most acceptance sampling plans focus on the percentage of defective products rather than on the process loss, and thus do not distinguish among products that fall within the specification limits. However, the quality of products that fall within the specification limits may differ considerably. Designing an acceptance sampling plan that takes process loss into consideration is therefore necessary. In this article, a variables sampling plan based on the process loss index Le is proposed to handle processes requiring low process loss. The required sample size n and the critical acceptance value c are tabulated for various combinations of acceptance quality levels. The proposed sampling plan provides a feasible policy that can be applied to products requiring low process loss, where classical sampling plans cannot be applied.
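The sentencing rule can be sketched with a plug-in estimate of the loss index: estimate E[(X − T)²]/d² from the inspection sample and accept the lot when the estimate does not exceed the critical value c. The critical value and the data below are illustrative; the paper determines (n, c) from the sampling distribution of the estimator.

```python
import numpy as np

def le_estimate(x, target, d):
    """Plug-in estimate of the process loss index Le = E[(X - T)^2] / d^2,
    where d is the half-width of the specification interval.  The paper's plan
    compares such an estimate with a tabulated critical value c for sample size n."""
    x = np.asarray(x)
    return ((np.mean(x) - target) ** 2 + np.var(x)) / d**2

rng = np.random.default_rng(6)
sample = rng.normal(loc=10.02, scale=0.08, size=60)     # hypothetical inspection data
c_crit = 0.05                                            # illustrative critical acceptance value
le_hat = le_estimate(sample, target=10.0, d=0.5)
print(f"Le estimate = {le_hat:.4f} -> {'accept' if le_hat <= c_crit else 'reject'} the lot")
```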

19.
Dodge (1943) introduced a single-level attribute continuous sampling plan, designated CSP-1, for application to continuous production processes. Govindaraju & Kandasamy (2000) developed a new single-level continuous sampling plan, designated CSP-C, whose sampling inspection phase is characterized by a maximum allowable number of non-conforming units c and a constant sampling rate f. In this paper, a modification of the CSP-C continuous sampling plan is proposed. In the modified plan, sampling inspection is continued until the occurrence of c+1 non-conforming units, provided the first m sampled units have been found conforming during the sampling phase. Using a Markov chain model, expressions for the performance measures of the modified CSP-C plan are derived. The main advantage of the modified plan is that it is possible to lower the average outgoing quality limit.

20.
The need to respond rapidly to the requirements of customers and markets promotes the concurrent engineering (CE) technique in product and process design. Decisions about the process quality target, SPC method, sampling plan, and control chart parameters can be made at the process quality planning stage, based on historical data and a process knowledge database. Therefore, it is a reasonable trend to introduce the concepts and achievements of process quality evaluation and process capability analysis, CE, and SPC into process planning and tolerance design. A new systematic method for the concurrent design of process quality, statistical tolerance (ST), and control charts is presented, based on an NSFC research program. A set of standardized process quality indices (PQIs) for variables is introduced for measuring and evaluating process yield, process centering, and quality loss. This index system, which has relatively strong compatibility and adaptability, is based on reasoned grading using the series of preferred numbers and arithmetical progressions. The expected process quality based on this system can be assured by a standardized interface between PQIs and SPC, that is, a quality-oriented statistical tolerance zone. A quality-oriented ST and SPC approach that quantitatively specifies what the desired process is and how to assure it realizes optimal control of a process toward a predetermined quality target.
