Similar Documents (20 results)
1.
Recently, tolerance interval approaches to the calculation of a shelf life of a drug product have been proposed in the literature. These address the belief that shelf life should be related to control of a certain proportion of batches being out of specification. We question the appropriateness of the tolerance interval approach. Our concerns relate to the computational challenges and practical interpretations of the method. We provide an alternative Bayesian approach, which directly controls the desired proportion of batches falling out of specification assuming a controlled manufacturing process. The approach has an intuitive interpretation and posterior distributions are straightforward to compute. If prior information on the fixed and random parameters is available, a Bayesian approach can provide additional benefits both to the company and the consumer. It also avoids many of the computational challenges with the tolerance interval methodology.
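As a rough illustration of the Bayesian idea sketched above (not the authors' exact model), the following snippet takes simulated posterior draws for a linear degradation model with batch-to-batch variation, computes the posterior probability that the proportion of out-of-specification batches stays below a tolerated level at each time point, and reads off a shelf life. All distributions, parameter values, and cut-offs are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Illustrative posterior draws; in practice these come from an MCMC fit of a
# mixed-effects degradation model (assay ~ intercept + slope * time + batch effects).
n_draws = 4000
intercept = rng.normal(100.5, 0.3, n_draws)               # % label claim at release
slope = rng.normal(-0.20, 0.03, n_draws)                   # mean degradation rate (%/month)
sd_batch_int = np.abs(rng.normal(0.40, 0.10, n_draws))     # batch-to-batch intercept SD
sd_batch_slope = np.abs(rng.normal(0.05, 0.01, n_draws))   # batch-to-batch slope SD

lower_spec = 95.0     # lower specification limit (% label claim)
max_oos = 0.05        # tolerated proportion of batches out of specification

def oos_proportion(t):
    """Posterior draws of the proportion of future batches below spec at time t."""
    mean_t = intercept + slope * t
    sd_t = np.sqrt(sd_batch_int**2 + (t * sd_batch_slope)**2)
    return norm.cdf(lower_spec, loc=mean_t, scale=sd_t)

# Shelf life: the longest time at which, with 95% posterior probability, the
# proportion of out-of-specification batches stays at or below max_oos.
shelf_life = 0.0
for t in np.arange(0.0, 60.5, 0.5):
    if np.mean(oos_proportion(t) <= max_oos) >= 0.95:
        shelf_life = t
    else:
        break
print(f"estimated shelf life ~ {shelf_life:.1f} months")
```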

2.
In quality control, a performance variable having a two-sided specification limit is common for assessing lot quality. Sometimes it is difficult or impossible to measure the performance variable directly; for example, when testing is destructive, expensive, or when the performance variable is related to the lifetime of the product. However, it may happen that there are several concomitant variables which are easily measured and which correlate highly with the variable of interest. Thus, one may use measurements on these variables to select or screen product which will have a high conditional probability of meeting product specification. We consider this situation when all variables have a joint multivariate normal distribution and the specification limits on the performance variable are two-sided.
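A minimal numerical sketch of the screening idea, assuming a joint normal model for the performance variable Y and two concomitant variables: for each unit, compute the conditional probability that Y falls inside the two-sided specification limits given the measured concomitant values, and accept the unit only when that probability is high enough. The covariance matrix, limits, and the 0.95 acceptance cut-off are invented for illustration.

```python
import numpy as np
from scipy.stats import norm

# Assumed joint normal model for (Y, X1, X2): Y is the (destructively tested)
# performance variable, X1 and X2 are easily measured screening variables.
mu = np.array([10.0, 5.0, 2.0])                    # means of (Y, X1, X2)
Sigma = np.array([[1.00, 0.70, 0.50],
                  [0.70, 1.00, 0.30],
                  [0.50, 0.30, 1.00]])
LSL, USL = 8.0, 12.0                               # two-sided spec limits on Y

# Partition the covariance matrix: Y vs X = (X1, X2).
s_yy = Sigma[0, 0]
s_yx = Sigma[0, 1:]
S_xx = Sigma[1:, 1:]
beta = np.linalg.solve(S_xx, s_yx)                 # regression coefficients of Y on X
cond_sd = np.sqrt(s_yy - s_yx @ beta)              # conditional SD of Y given X

def p_conforming(x):
    """P(LSL <= Y <= USL | X = x) under the joint normal model."""
    cond_mean = mu[0] + beta @ (np.asarray(x) - mu[1:])
    return norm.cdf(USL, cond_mean, cond_sd) - norm.cdf(LSL, cond_mean, cond_sd)

# Screening rule: accept a unit only if the conditional probability of
# conformance exceeds an agreed level (0.95 here, purely illustrative).
for x in [(5.1, 2.0), (6.5, 3.0), (3.2, 1.0)]:
    p = p_conforming(x)
    print(x, round(p, 3), "accept" if p >= 0.95 else "screen out")
```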

3.
Specification limit under a quality loss function
The purpose of this paper is to present the problem of selecting a lower specification limit under Taguchi's quality loss function. Assuming that the product quality characteristic follows an exponential distribution, we propose a modification of the method of Kapur and Wang for the economic design of the specification limit.
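The economic trade-off behind such designs can be sketched numerically (this is an illustration, not the Kapur-Wang method or its modification): with a larger-the-better loss k/y^2, a scrap cost for items falling below the limit, and an exponential quality characteristic, the expected cost per produced item is minimized over the candidate lower specification limit. All cost figures and parameters are assumptions.

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import minimize_scalar

theta = 10.0        # mean of the exponential quality characteristic (illustrative)
k = 50.0            # coefficient of a larger-the-better loss  L(y) = k / y**2
scrap_cost = 1.2    # cost of scrapping an item that falls below the limit

def expected_cost(L):
    """Expected cost per produced item when items below L are scrapped."""
    pdf = lambda y: np.exp(-y / theta) / theta
    p_scrap = 1.0 - np.exp(-L / theta)
    loss_accepted, _ = quad(lambda y: (k / y**2) * pdf(y), L, np.inf)
    return scrap_cost * p_scrap + loss_accepted

res = minimize_scalar(expected_cost, bounds=(0.1, 3 * theta), method="bounded")
print(f"economic lower specification limit ~ {res.x:.2f}, expected cost {res.fun:.3f}")
```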

4.
A common practical situation in process capability analysis, which is not well developed theoretically, is when the quality characteristic of interest has a skewed distribution with a long tail towards relatively large values and only an upper specification limit exists. In such situations, it is not uncommon that the smallest possible value of the characteristic is 0 and this is also the best value to obtain. Hence a target value of 0 is assumed to exist. We investigate a new class of process capability indices for this situation. Two estimators of the proposed index are studied and the asymptotic distributions of these estimators are derived. Furthermore, we suggest a decision procedure useful when drawing conclusions about the capability at a given significance level, based on the estimated indices and their asymptotic distributions. A simulation study is also performed, assuming that the quality characteristic is Weibull-distributed, to investigate the true significance level when the sample size is finite.
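The following sketch shows one percentile-based upper capability measure that fits this setting (target 0, upper specification limit only, skewed data); it is a Clements-type index used purely for illustration and is not the new index class proposed by the authors. The data, limit, and distributional choice are assumptions.

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(1)

# Illustrative data: a skewed quality characteristic with best value 0 and a
# long upper tail (e.g. an impurity or flatness measurement), Weibull-distributed.
x = weibull_min.rvs(c=1.5, scale=2.0, size=200, random_state=rng)
USL = 8.0                                # upper specification limit; target is 0

# Fit a Weibull with the location fixed at the target 0.
shape, loc, scale = weibull_min.fit(x, floc=0)

# A Clements-type upper capability index: compare the distance from the median
# to USL with the length of the upper tail (median to the 99.865th percentile).
q50 = weibull_min.ppf(0.5, shape, loc, scale)
q99865 = weibull_min.ppf(0.99865, shape, loc, scale)
C_pu = (USL - q50) / (q99865 - q50)

# The quantity such an index is meant to reflect: the probability of exceeding USL.
p_above = weibull_min.sf(USL, shape, loc, scale)
print(f"estimated C_pu ~ {C_pu:.2f}, P(X > USL) ~ {p_above:.4f}")
```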

5.
Wen & Mergen (1999) proposed a method for setting the optimal process mean when a process was not capable of meeting specifications in the short term. However, they neglected to consider the quality loss for a product within specifications in the model. Chen & Chou (2002) presented a modification of Wen & Mergen's (1999) model that includes the quadratic quality loss function for a one-sided specification limit. In this paper, we propose a modified Wen & Mergen (1999) cost model that includes a linear quality loss function for determining the optimal process mean under a one-sided specification limit.

6.
This article considers designed experiments for stability, comparability, and formulation testing that are analyzed with regression models in which the degradation rate is a fixed effect. In this setting, we investigate how the number of lots, the number of time points and their locations affect the precision of the entities of interest, leverages of the time points, detection of non-linearity and interim analyses. This investigation shows that modifying time point locations suggested by ICH for stability studies can significantly improve these objectives. In addition, we show that estimates of precision can be biased when a regression model that assumes independent measurements is used in the presence of within-assay session correlation. This bias can lead to longer shelf life estimates in stability studies and loss of power in comparability studies. Mixed-effect models that take into account within-assay session correlation are shown to reduce this bias. The findings in this article are obtained from well known statistical theory but provide valuable practical advice to scientists and statisticians designing and interpreting these types of experiments.
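For the fixed-effects part of this design question, the precision of the estimated degradation rate and the leverages of the time points follow directly from the design matrix, so candidate schedules can be compared before any data are collected. A small sketch with illustrative schedules follows; the "ICH-style" schedule is only a stand-in for a typical long-term stability schedule, and the error model assumes independent measurements (the very assumption the article warns about).

```python
import numpy as np

def slope_var_and_leverage(times, sigma2=1.0):
    """Variance of the estimated degradation rate and leverages under OLS
    for the model  assay = a + b * time + error  with independent errors."""
    t = np.asarray(times, dtype=float)
    X = np.column_stack([np.ones_like(t), t])          # design matrix
    XtX_inv = np.linalg.inv(X.T @ X)
    var_slope = sigma2 * XtX_inv[1, 1]
    H = X @ XtX_inv @ X.T                               # hat matrix
    return var_slope, np.diag(H)

# Two illustrative schedules with the same number of assays:
ich_style = [0, 3, 6, 9, 12, 18, 24]        # an ICH-style long-term schedule
end_loaded = [0, 1, 2, 22, 23, 24, 24]      # more points near the ends of the study

for name, sched in [("ICH-style", ich_style), ("end-loaded", end_loaded)]:
    v, lev = slope_var_and_leverage(sched)
    print(f"{name:10s} var(slope) = {v:.5f}  max leverage = {lev.max():.2f}")
```

End-loaded designs shrink the slope variance but concentrate leverage at a few times and give little information about non-linearity, which is the trade-off the article examines.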

7.
In quality control, we may confront imprecise concepts. One case is a situation in which the upper and lower specification limits (SLs) are imprecise. If we introduce vagueness into the SLs, we obtain new and practically interesting processes, and the ordinary capability indices are not appropriate for measuring the capability of these processes. In this paper, by analogy with the traditional process capability indices (PCIs), we develop fuzzy counterparts based on a distance defined on a fuzzy limit space and introduce PCIs in which, instead of precise SLs, we have two membership functions for the upper and lower SLs. These indices are needed when the SLs are fuzzy, and they are helpful for comparing manufacturing processes with fuzzy SLs. Some interesting relations among the introduced indices are proved. Numerical examples are given to clarify the method.

8.
In this paper, we propose a quick switching sampling system for assuring the mean life of a product under a time-truncated life test, where the lifetime of the product follows the Weibull distribution and the mean life is taken as the quality characteristic of the product. The optimal parameters of the proposed system are determined using the two-points-on-the-operating-characteristic-curve approach for various combinations of consumer's risk and of the ratio of the true mean life to the specified life. Tables are constructed to determine the optimal parameters for specified acceptable quality level and limiting quality level, along with the corresponding probabilities of acceptance. The proposed system is compared with other existing sampling plans under the Weibull lifetime model. In addition, an economic design of the proposed system is also discussed.

9.
Chen (1999) proposed an economic design, using Taguchi's quality loss function, for choosing a producer's lower specification limit η for a product with a quality characteristic that has an exponential distribution with mean θ and a 'larger the better' tolerance. Chen (1999) developed an approximate solution that is applicable when 0.5 ≤ m/θ ≤ 0.7 and that requires numerical minimization. We derive a simple, exact solution that is applicable for all values of m/θ and does not require numerical minimization.

10.
This article focuses on the problem of estimating the shelf life of food products by modeling the results of sensory evaluations. In such studies, trained panelists are asked to judge food attributes by reference to a scale of numbers (scores often varying from 0 to 6). The usual statistical approach for data analysis is to fit a regression line relating the scores to the time of evaluation. The estimate of the shelf life is obtained by solving the regression equation and replacing the score by a cut-off point (which indicates product “failure”) previously chosen by the food company. The procedure used in these sensory evaluations is such that one never knows the exact “time to failure”. Consequently, data arising from these studies are either right- or left-censored. We propose a model that incorporates this censoring information and assumes a Weibull distribution for the underlying failure time. Simulation studies were carried out. The approach was applied to a real data set from sensory evaluations of a dehydrated food product.
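A minimal sketch of the censored-data likelihood being described (not the authors' exact model): because products are evaluated only at discrete sessions, each failure time is known only up to an interval (right-censored if the product was still acceptable at the last session, left-censored if it had already failed at the first), and a Weibull can be fitted by maximizing the corresponding likelihood. The data and session times below are invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

# Each row: (lower, upper) bounds on the unobserved time-to-failure (days).
# upper = inf -> still acceptable at the last session (right-censored)
# lower = 0   -> already failed at the first session   (left-censored)
intervals = np.array([
    [14.0, 21.0], [21.0, 28.0], [28.0, np.inf], [0.0, 7.0],
    [14.0, 21.0], [28.0, np.inf], [21.0, 28.0], [7.0, 14.0],
])

def neg_log_lik(params):
    """Negative log-likelihood of a Weibull(shape, scale) under interval censoring."""
    shape, scale = np.exp(params)                       # keep both parameters positive
    cdf = lambda t: weibull_min.cdf(t, shape, scale=scale)
    lo, hi = intervals[:, 0], intervals[:, 1]
    prob = np.where(np.isinf(hi), 1.0 - cdf(lo), cdf(hi) - cdf(lo))
    return -np.sum(np.log(np.clip(prob, 1e-300, None)))

fit = minimize(neg_log_lik, x0=np.log([1.5, 20.0]), method="Nelder-Mead")
shape_hat, scale_hat = np.exp(fit.x)

# Shelf life as a percentile of the failure-time distribution, e.g. the time by
# which 50% of the product would be judged failed.
print("shape", round(shape_hat, 2), "scale", round(scale_hat, 2),
      "median shelf life", round(weibull_min.ppf(0.5, shape_hat, scale=scale_hat), 1))
```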

11.
Shelf life is a specified percentile of the time-until-spoilage distribution of a food product. This paper investigates statistical properties of various estimators of shelf life and develops a genetic algorithm for finding near-optimal staggered designs for estimation of shelf life. MLEs and their associated confidence intervals for shelf life have smaller bias, better performance, and better coverage than the corresponding ad hoc regression-based estimates. However, performance of MLEs for common sample sizes must be evaluated by simulation. The genetic algorithm, coded as a SAS macro, searched the design space well and generated near-optimal designs as measured by improvement in a simulation-based performance measure.

12.
In this paper, an optimization model is developed for the economic design of a rectifying inspection sampling plan in the presence of two markets. A product with a normally distributed quality characteristic with unknown mean and variance is produced in the process. The quality characteristic has a lower specification limit. The aim of this paper is to maximize the profit, which incorporates the Taguchi loss function, under the constraint of simultaneously satisfying the producer's and consumer's risks in two different markets. A giveaway cost per unit of sold excess material is considered in the proposed model. A case study is presented to illustrate the application of the proposed methodology. In addition, sensitivity analysis is performed to study the effect of the model parameters on the expected profit and the optimal solution. The optimal process adjustment problem and the acceptance sampling plan are combined in the economic optimization model. The process mean and standard deviation are assumed to be unknown, and their impact is analyzed. Finally, inspection error is considered, and its impact is investigated.

13.
Nonlinear regime-switching behavior and structural change are often perceived as competing alternatives to linearity. In this article we study the so-called time-varying smooth transition autoregressive (TV-STAR) model, which can be used both for describing simultaneous nonlinearity and structural change and for distinguishing between these features. Two modeling strategies for empirical specification of TV-STAR models are developed. Monte Carlo simulations show that neither of the two strategies dominates the other. A specific-to-general-to-specific procedure is best suited for obtaining a first impression of the importance of nonlinearity and/or structural change for a particular time series. A specific-to-general procedure is most useful in careful specification of a model with nonlinear and/or time-varying properties. An empirical application to a large dataset of U.S. macroeconomic time series illustrates the relative merits of both modeling strategies.

14.
Nonparametric model specification for stationary time series involves selection of the smoothing parameter (bandwidth), the lag structure and the functional form (linear vs. nonlinear). In real-life problems, none of these factors is known and the choices are interdependent. In this article, we recommend accomplishing these choices in one step via the model selection approach. Two procedures are considered: one based on an information criterion and the other on least-squares cross-validation. The Monte Carlo simulation results show that both procedures have good finite-sample performance and are easy to implement compared to existing two-step probabilistic testing procedures.
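A small sketch of the one-step idea for a nonlinear autoregression: candidate lag structures and bandwidths are scored jointly by leave-one-out least-squares cross-validation of a Nadaraya-Watson (local-constant) estimator, and the combination with the smallest cross-validation error is kept. The data-generating process and the parameter grids are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative nonlinear AR(1) series: y_t = 0.8 * tanh(y_{t-1}) + noise.
n = 300
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.8 * np.tanh(y[t - 1]) + 0.3 * rng.standard_normal()

def loo_cv(y, lags, h):
    """Leave-one-out CV error of a Nadaraya-Watson fit of y_t on (y_{t-1}, ..., y_{t-lags})."""
    X = np.column_stack([y[lags - k - 1:len(y) - k - 1] for k in range(lags)])
    target = y[lags:]
    err = 0.0
    for i in range(len(target)):
        d2 = np.sum((X - X[i]) ** 2, axis=1)
        w = np.exp(-0.5 * d2 / h**2)
        w[i] = 0.0                                   # leave the i-th point out
        if w.sum() < 1e-12:
            return np.inf
        err += (target[i] - np.dot(w, target) / w.sum()) ** 2
    return err / len(target)

best = min((loo_cv(y, p, h), p, h)
           for p in (1, 2, 3)
           for h in (0.1, 0.2, 0.4, 0.8))
print(f"CV-selected lag order p = {best[1]}, bandwidth h = {best[2]}, CV error = {best[0]:.4f}")
```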

15.
A double acceptance sampling plan for the truncated life test is developed assuming that the lifetime of a product follows a generalized log-logistic distribution with known shape parameters. The zero and one failure scheme is mainly considered, where the lot is accepted if no failures are observed from the first sample and it is rejected if two or more failures occur. When there is one failure from the first sample, the second sample is drawn and tested for the same duration as the first sample. The minimum sample sizes of the first and second samples are determined to ensure that the true median life is longer than the given life at the specified consumer’s confidence level. The operating characteristics are analyzed according to various ratios of the true median life to the specified life. The minimum such ratios are also obtained so as to lower the producer’s risk at the specified level. The results are explained with examples.
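A numerical sketch of the zero-and-one-failure scheme described above, using an ordinary log-logistic lifetime with known shape as a stand-in for the generalized log-logistic of the paper: the lot is accepted on zero failures in the first sample, rejected on two or more, and a second, equal-size sample with a zero-failure requirement is drawn after exactly one failure. The truncation ratio, shape parameter, and confidence level are assumptions.

```python
beta_shape = 2.0      # assumed known shape of the log-logistic lifetime model
a = 0.5               # test truncation time as a multiple of the specified median life
conf = 0.95           # consumer's confidence level P*

def fail_prob(median_ratio):
    """P(item fails before the truncation time) for a log-logistic lifetime whose
    true median is median_ratio times the specified life (truncation time = a * specified life)."""
    return 1.0 / (1.0 + (a / median_ratio) ** (-beta_shape))

def accept_prob(n, p):
    """Acceptance probability of the zero-and-one-failure double plan with n1 = n2 = n."""
    p0 = (1 - p) ** n                       # zero failures in the first sample
    p1 = n * p * (1 - p) ** (n - 1)         # exactly one failure in the first sample
    return p0 + p1 * (1 - p) ** n           # second sample must then show zero failures

# Smallest n such that a lot whose true median only equals the specified life
# is accepted with probability at most 1 - conf.
p_worst = fail_prob(1.0)
n = 1
while accept_prob(n, p_worst) > 1.0 - conf:
    n += 1
print(f"minimum n1 = n2 = {n};  OC at median ratio 2: {accept_prob(n, fail_prob(2.0)):.3f}")
```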

16.
This paper analyzes the differences between linear regression models as viewed from the two disciplinary perspectives of econometrics and statistics, in terms of their nature, construction methods, and interpretation, and on the basis of these differences proposes a basic strategy for specifying regression models. The study shows that recognizing these differences is a fundamental and necessary step in model specification and helps achieve correct specification of linear regression models. Classic examples are used to further illustrate and analyze the differences between econometric and statistical regression models in applications, as well as the model specification problem.

17.
This paper first compares two common data-generating systems used in unit root testing and then uses Monte Carlo experiments to study the specification of the test equation for time series unit root tests. The study finds that when the DF and DF-GLS tests are applied, misspecification of the test equation directly affects the test results; in particular, when deciding whether a time series is a trend-stationary process, a random walk with a linear time trend, or a random walk with a quadratic time-trend polynomial, a misspecified test equation can easily lead to a trend-stationary process being misclassified as nonstationary.
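A hedged Monte Carlo sketch of this point using statsmodels' augmented Dickey-Fuller test: data generated from a trend-stationary process are tested once with the correctly specified test equation (constant and trend, regression='ct') and once with the trend term omitted (regression='c'); the misspecified equation rejects the unit root far less often. The sample size, trend slope, and AR coefficient are illustrative.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(3)
n_rep, n_obs = 200, 200
reject = {"c": 0, "ct": 0}     # "c": constant only (misspecified); "ct": constant + trend

for _ in range(n_rep):
    # Trend-stationary DGP: deterministic linear trend plus a stationary AR(1).
    e = np.zeros(n_obs)
    for t in range(1, n_obs):
        e[t] = 0.5 * e[t - 1] + rng.standard_normal()
    y = 0.05 * np.arange(n_obs) + e
    for spec in ("c", "ct"):
        pval = adfuller(y, regression=spec, autolag="AIC")[1]
        if pval < 0.05:
            reject[spec] += 1

for spec, count in reject.items():
    print(f"test equation '{spec}': unit root rejected in {count}/{n_rep} replications")
```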

18.
Traditionally, most acceptance sampling plans considering the fraction defective do not distinguish among the products that fall within the specification limits. However, products that fall within the specification limits may not be good if their mean is far away from the target. So, developing an acceptance sampling plan with process loss consideration is essential. In this paper, a variable repetitive group sampling plan is proposed to deal with process loss. The design parameters of the proposed plan are tabulated for various combinations of acceptance quality levels. The proposed methodology can be used to determine whether the products meet the desired levels of protection for both producers and consumers.

19.
Point processes are the stochastic models most suitable for describing physical phenomena that occur at irregularly spaced times, such as earthquakes. These processes are uniquely characterized by their conditional intensity, that is, by the probability that an event will occur in the infinitesimal interval (t, t+Δt), given the history of the process up to t. The seismic phenomenon displays different behaviours on different time and size scales; in particular, the occurrence of destructive shocks over some centuries in a seismogenic region may be explained by the elastic rebound theory. This theory has inspired the so-called stress release models: their conditional intensity translates the idea that an earthquake produces a sudden decrease in the amount of strain accumulated gradually over time along a fault, and the subsequent event occurs when the stress exceeds the strength of the medium. This study has a double objective: the formulation of these models in the Bayesian framework, and the assignment to each event of a mark, that is, its magnitude, modelled through a distribution that depends at time t on the stress level accumulated up to that instant. The resulting parameter space is constrained and dependent on the data, complicating Bayesian computation and analysis. We have resorted to Monte Carlo methods to solve these problems.
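A sketch of the conditional intensity of a stress release model in one common parametrization, lambda(t) = exp{a + b[t - c*S(t)]}, where S(t) accumulates the stress released by past events as a function of their magnitudes; this is an illustration, not the exact Bayesian formulation of the study, and the parameter values and the magnitude-to-stress conversion are assumptions.

```python
import numpy as np

# Illustrative catalogue: event times (years) and magnitudes.
times = np.array([12.0, 35.5, 61.2, 88.9])
mags = np.array([6.1, 6.8, 6.3, 7.0])

# Parameters of the conditional intensity lambda(t) = exp(a + b * (t - c * S(t))),
# where S(t) is the stress released by events before t. All values are illustrative.
a, b, c = -4.0, 0.05, 8.0
M0 = 6.0

def released_stress(t):
    """Accumulated stress release before time t; one common convention converts
    magnitude to released stress as 10 ** (0.75 * (M - M0))."""
    past = times < t
    return np.sum(10.0 ** (0.75 * (mags[past] - M0)))

def intensity(t):
    """Conditional intensity of the stress release model at time t."""
    return np.exp(a + b * (t - c * released_stress(t)))

# The intensity builds up between events and drops immediately after each one.
for t in (10.0, 13.0, 60.0, 62.0, 100.0):
    print(f"t = {t:6.1f}   lambda(t) = {intensity(t):.5f}")
```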

20.
This paper suggests a flexible parametrization of the generalized Poisson regression, which is likely to be particularly useful when the sample is truncated at zero. Suitable specification tests for this case are also studied. The use of the suggested models and tests is illustrated with an application to the number of recreational fishing trips taken by households in Alaska.
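A hedged sketch of fitting a zero-truncated generalized Poisson regression by maximum likelihood: the Consul-Jain generalized Poisson probability mass function is divided by 1 - P(0) to handle truncation at zero, the mean parameter is linked to a covariate through a log link, and the likelihood is maximized numerically. The data and the parametrization details are illustrative, not the paper's exact specification.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

rng = np.random.default_rng(4)

# Illustrative zero-truncated count data with one covariate (e.g. trips vs income).
n = 400
x = rng.uniform(0, 2, n)
X = np.column_stack([np.ones(n), x])
y = 1 + rng.poisson(np.exp(0.3 + 0.6 * x))     # crude positive counts, just for illustration

def neg_log_lik(params):
    """Zero-truncated generalized Poisson (Consul-Jain) log-likelihood with log link."""
    beta, lam = params[:-1], params[-1]
    if not (0.0 <= lam < 1.0):
        return np.inf
    theta = np.exp(X @ beta)
    ll = (np.log(theta)
          + (y - 1) * np.log(theta + lam * y)
          - theta - lam * y
          - gammaln(y + 1)
          - np.log1p(-np.exp(-theta)))         # truncation at zero: divide by 1 - P(0)
    return -np.sum(ll)

fit = minimize(neg_log_lik, x0=np.array([0.0, 0.0, 0.1]), method="Nelder-Mead")
beta_hat, lam_hat = fit.x[:-1], fit.x[-1]
print("beta:", np.round(beta_hat, 3), " dispersion lambda:", round(lam_hat, 3))
```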
