Similar Documents (20 results)
1.
The main purposes of this paper are to derive Bayesian acceptance sampling plans regarding the number of defects per unit of product, and to illustrate how to apply the methodology to the paper pulp industry. The sampling plans are obtained following an economic criterion: minimize the expected total cost of quality. It is assumed that the number of defects per unit of product follows a Poisson distribution with process average λ, whose prior information is described either by a gamma or by a non-informative distribution. The expected total cost of quality is composed of three independent components: inspection, acceptance, and rejection. Both quadratic and step-loss functions are used to quantify the cost incurred by accepting a lot containing defective units. Combining the prior information on λ with the loss functions, four different sampling plans are obtained. When the quadratic loss function is used, an analytical relation between the optimum settings of the sample size and the acceptance number is derived. The robustness analysis indicates that the sampling plans obtained are robust with respect to the prior distribution of the process average as well as to misspecification of its mean and variance.
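As a rough illustration of the economic criterion (not the paper's exact cost model), the sketch below grid-searches a single-sampling plan (n, c) under the Poisson–gamma model with a step loss; the gamma hyperparameters, the step-loss threshold lam0, and the cost constants ci, ca, cr are all invented for the example.

```python
# Minimal sketch: choose (n, c) to minimize expected total cost of quality
# under a gamma prior on the Poisson process average lambda. All constants
# below are illustrative assumptions, not the paper's values.
import numpy as np
from scipy import stats, integrate

a, b = 2.0, 4.0                 # gamma prior on lambda (shape, rate) -- assumed
lam0 = 1.0                      # step-loss threshold: accepting a lot with lambda > lam0 costs ca
ci, ca, cr = 1.0, 500.0, 60.0   # inspection / acceptance / rejection costs -- assumed

def expected_cost(n, c):
    prior = stats.gamma(a, scale=1.0 / b)
    # P(accept) under the prior predictive: total defects D | lambda ~ Poisson(n * lambda)
    p_accept = integrate.quad(
        lambda lam: stats.poisson.cdf(c, n * lam) * prior.pdf(lam), 0, np.inf)[0]
    # P(accept AND lambda > lam0): the event penalized by the step loss
    p_bad_accept = integrate.quad(
        lambda lam: stats.poisson.cdf(c, n * lam) * prior.pdf(lam), lam0, np.inf)[0]
    return ci * n + ca * p_bad_accept + cr * (1 - p_accept)

best = min(((n, c) for n in range(1, 41) for c in range(0, 15)),
           key=lambda nc: expected_cost(*nc))
print("optimal (n, c):", best, "expected cost:", round(expected_cost(*best), 3))
```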

2.
Since multi-attribute control charts have received little attention compared with multivariate variable control charts, this research develops a new methodology for employing the multivariate exponentially weighted moving average (MEWMA) chart for m-attribute binomial processes, the attributes being the numbers of nonconforming items. Moreover, since variable sample size and sampling interval (VSSI) MEWMA charts detect small process mean shifts faster than the traditional MEWMA chart, an economic design of the VSSI MEWMA chart is proposed to obtain the optimum design parameters of the chart. The sample size, the sampling interval, and the warning/action limit coefficients are obtained using a genetic algorithm such that the expected total cost per hour is minimized. Finally, a sensitivity analysis is carried out to investigate the effects of the cost and model parameters on the solution of the economic design of the VSSI MEWMA chart.
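A minimal genetic-algorithm sketch of the parameter search follows; the `hourly_cost` function is a placeholder surrogate, not the paper's expected-cost-per-hour model, and the bounds on (n, h1, h2, w, k) are assumed.

```python
# Toy GA over VSSI design parameters (n, h1, h2, w, k). The cost function is
# a stand-in surrogate; a real economic design would evaluate the expected
# cost per hour of the VSSI MEWMA chart instead.
import numpy as np

rng = np.random.default_rng(0)
LO = np.array([1.0, 0.5, 0.1, 0.5, 2.0])    # lower bounds (assumed ranges)
HI = np.array([30.0, 8.0, 2.0, 3.0, 6.0])   # upper bounds (assumed ranges)

def hourly_cost(x):
    n, h1, h2, w, k = x
    # placeholder surrogate: penalizes large samples, short intervals,
    # and an inverted warning/action ordering (w should stay below k)
    return 5 * n / h1 + 80.0 / (h2 * k) + 2 * n + 50 * max(0.0, w - k)

pop = rng.uniform(LO, HI, size=(40, 5))
for gen in range(200):
    costs = np.array([hourly_cost(x) for x in pop])
    # tournament selection: pick random pairs, keep the cheaper of each pair
    i, j = rng.integers(0, 40, 40), rng.integers(0, 40, 40)
    parents = np.where((costs[i] < costs[j])[:, None], pop[i], pop[j])
    # blend crossover + Gaussian mutation, clipped back into the box
    alpha = rng.uniform(size=(40, 1))
    children = alpha * parents + (1 - alpha) * parents[rng.permutation(40)]
    children += rng.normal(0, 0.05, children.shape) * (HI - LO)
    pop = np.clip(children, LO, HI)

best = min(pop, key=hourly_cost)
print("best (n, h1, h2, w, k):", np.round(best, 3), "cost:", round(hourly_cost(best), 3))
```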

3.
The close relationship between quality and maintenance of manufacturing systems has contributed to the development of integrated models which use the concepts of statistical process control (SPC) and maintenance. This article demonstrates the integration of the Shewhart individual-residual (ZX–Ze) joint control chart and maintenance for two-stage dependent processes by jointly optimizing their policies to minimize the expected total costs associated with quality, maintenance, and inspection. To evaluate the effectiveness of the proposed model, two stand-alone models—a maintenance model and an SPC model—are proposed. A numerical example is then given to illustrate the application of the proposed integrated model. The results show that the integrated model outperforms the two stand-alone models with regard to the expected cost per unit time. Finally, a sensitivity analysis is conducted to develop insights into the time and cost parameters that influence the integration efforts.

4.
In this article, we consider the problem of estimating the population mean of a study variable in the presence of non-response in a mail survey design. We introduce calibrated estimators of the population mean of a study variable in the presence of a known auxiliary variable. Using simulation, the proposed calibrated estimators are compared to the Hansen and Hurwitz (1946) estimator under different situations, for fixed cost as well as for fixed sample size. The results are then extended to the use of multi-auxiliary information and to stratified random sampling. We consider the problem of estimating the average total family income in the US in the presence of known auxiliary information on total income per person, age of the person, and poverty. We compute the relative efficiency of the proposed estimator over the Hansen and Hurwitz (1946) estimator using large real datasets. Results are also presented for sub-populations consisting of whites, blacks, others, and persons of two or more races, in addition to the population as a whole.
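A toy simulation under an assumed linear population model (not the article's income data) compares the Hansen and Hurwitz (1946) sub-sampling estimator with a simple regression-calibrated version that exploits the known auxiliary mean:

```python
# Toy comparison: Hansen-Hurwitz sub-sampling of nonrespondents vs. a simple
# regression-calibrated version. Population model, response rate, and the
# 1-in-k follow-up fraction are all illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
N = 10_000
x = rng.gamma(4.0, 2.0, N)                    # auxiliary variable with known mean
y = 3.0 + 1.5 * x + rng.normal(0, 2.0, N)     # study variable (assumed linear link)
Xbar = x.mean()

def one_draw(n=400, k=3):
    idx = rng.choice(N, n, replace=False)
    respond = rng.random(n) < 0.6             # 60% response rate (assumed)
    r1, r2 = idx[respond], idx[~respond]
    sub = rng.choice(r2, max(1, len(r2) // k), replace=False)  # 1-in-k follow-up
    # Hansen-Hurwitz: weight the follow-up subsample up to the nonrespondent group
    y_hh = (y[r1].sum() + len(r2) * y[sub].mean()) / n
    x_hh = (x[r1].sum() + len(r2) * x[sub].mean()) / n
    obs = np.concatenate([r1, sub])
    beta = np.polyfit(x[obs], y[obs], 1)[0]   # slope estimated from observed units
    return y_hh, y_hh + beta * (Xbar - x_hh)  # (HH, calibrated)

draws = np.array([one_draw() for _ in range(2000)])
mse = ((draws - y.mean()) ** 2).mean(axis=0)
print("MSE  Hansen-Hurwitz: %.4f   calibrated: %.4f" % tuple(mse))
```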

5.
This article considers the problem of estimating the parameters of the Weibull distribution under a progressive Type-I interval censoring scheme with beta-binomial removals. Classical as well as Bayesian procedures for the estimation of the unknown model parameters are developed. The Bayes estimators are obtained under the squared error loss function (SELF) and the general entropy loss function (GELF) using an MCMC technique. The performance of the estimators is discussed in terms of their MSEs. Further, an expression for the expected total number of failures is obtained. A real dataset of survival times for patients with plasma cell myeloma is used to illustrate the suitability of the proposed methodology.

6.
In this paper, an optimization model is developed for the economic design of a rectifying inspection sampling plan in the presence of two markets. A product with a normally distributed quality characteristic with unknown mean and variance is produced in the process, and the quality characteristic has a lower specification limit. The aim is to maximize the profit, which incorporates the Taguchi loss function, under the constraints of simultaneously satisfying the producer's and consumer's risks in the two markets. A giveaway cost per unit of sold excess material is included in the proposed model. A case study is presented to illustrate the application of the proposed methodology, and a sensitivity analysis is performed to study the effect of the model parameters on the expected profit and the optimal solution. The optimal process adjustment problem and the acceptance sampling plan are combined in one economic optimization model. The process mean and standard deviation are assumed unknown, and their impact is analyzed. Finally, inspection error is considered, and its impact is investigated.

7.
In this article, we discuss how to identify longitudinal biomarkers in survival analysis under the accelerated failure time model, and we also discuss the effectiveness of biomarkers under this model. Two methods proposed by Schemper et al. are deployed to measure the efficacy of biomarkers. We use simulations to explore how various factors influence the power of a score test to detect the association between a longitudinal biomarker and survival time: the functional form of the random effects of the longitudinal biomarker, the number of individuals, and the number of time points per individual. The simulations are also used to explore how the number of individuals and the number of time points per individual influence the effectiveness of the biomarker in predicting survival at a given endpoint under the accelerated failure time model. We illustrate our methods using the prothrombin index as a predictor of survival in liver cirrhosis patients.

8.
Longitudinal count responses are often analyzed with a Poisson mixed model. Under overdispersion, however, these responses are better described by a negative binomial mixed model, whose parameters are usually estimated by the maximum likelihood method. To investigate the stability of these maximum likelihood estimators, we propose a methodology of sensitivity analysis using local influence. As count responses are discrete, we are unable to perturb them with the standard scheme used in local influence; we therefore consider an appropriate perturbation of the means of these responses. The proposed methodology is useful in many applications, but particularly when medical data are analyzed, because the removal of influential cases can change the statistical results and hence the medical decision. We study the performance of the methodology using Monte Carlo simulation and apply it to real medical data related to epilepsy and headache. All of these numerical studies show the good performance and potential of the proposed methodology.

9.
Supersaturated designs (SSDs) constitute a large class of fractional factorial designs which can be used for screening out the important factors from a large set of potentially active ones. A major advantage of these designs is that they reduce the experimental cost dramatically, but their crucial disadvantage is the confounding involved in the statistical analysis. Identification of active effects in SSDs has been the subject of much recent study. In this article we present a two-stage procedure for analyzing two-level SSDs assuming a main-effects-only model, without any interaction terms. The method combines sure independence screening (SIS) with different penalty functions, such as the smoothly clipped absolute deviation (SCAD), the Lasso, and the minimax concave (MC+) penalty, achieving the selection and the estimation of the significant effects simultaneously. Insights on using the proposed methodology are provided through various simulation scenarios, and several comparisons with existing approaches, such as stepwise regression combined with SCAD and the Dantzig selector (DS), are presented as well. Results of the numerical study and a real data analysis reveal that the proposed procedure is an advantageous tool owing to its extremely good performance in identifying active factors.
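A compact sketch of the two-stage idea on a simulated two-level SSD follows; scikit-learn's Lasso stands in for the SCAD/MC+ penalties, and the design, active factors, and noise level are invented.

```python
# Two-stage analysis of a simulated supersaturated design: stage 1 screens by
# marginal correlation (SIS), stage 2 fits a penalized main-effects model on
# the survivors. Lasso is used here as a stand-in for SCAD/MC+.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(2)
n, p = 14, 24                                  # more factors than runs (supersaturated)
X = rng.choice([-1.0, 1.0], size=(n, p))       # random two-level design (illustrative)
beta = np.zeros(p)
beta[[1, 5, 11]] = [4.0, -3.0, 2.5]            # three active factors (assumed)
y = X @ beta + rng.normal(0, 1.0, n)

# Stage 1 (SIS): keep the d factors with the largest |correlation| with y
d = 10
score = np.abs(X.T @ (y - y.mean()))
keep = np.argsort(score)[::-1][:d]

# Stage 2: penalized regression on the screened factors only
fit = LassoCV(cv=5).fit(X[:, keep], y)
active = keep[np.abs(fit.coef_) > 1e-6]
print("declared active factors:", sorted(active.tolist()))
```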

10.
In a discrete-part manufacturing process, the noise is often described by an IMA(1,1) process, and a pure unit-delay transfer function is used as the feedback controller to adjust it. The optimal controller for this process is the well-known minimum mean square error (MMSE) controller. The starting level of the IMA(1,1) model is usually assumed to be on target; since this assumption is impractical, we adopt a starting offset. Because the starting offset is not observable, the MMSE controller does not exist. An alternative to the MMSE controller is the minimum asymptotic mean square error controller, which minimizes the long-run mean square error. Another concern in this article is the instability of the controller, which may produce high adjustment costs and/or may exceed the physical bounds of the process adjustment. These practical barriers prevent the controller from adjusting the process properly. To avoid this dilemma, a resetting design is proposed: the process is adjusted according to the controller while the adjustment remains within a reset limit, and the process is reset otherwise. The total cost of the manufacturing process comprises the off-target cost, the adjustment cost, and the reset cost. Proper values of the reset limit are selected to minimize the average cost per reset interval (ACR) for various process and cost parameters. A time-nonhomogeneous Markov chain approach is used to calculate the ACR. The effect of adopting the starting offset is also studied.
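The following hedged simulation illustrates the scheme: an IMA(1,1) disturbance with a starting offset, the integral (EWMA-type) controller that is MMSE-optimal for IMA(1,1) noise, and a simple reset rule; all parameter values and the reset semantics are illustrative, not the article's exact cost model.

```python
# Simulate an IMA(1,1) disturbance with a starting offset, steered by an
# integral controller, with a reset whenever the cumulative adjustment
# drifts past the reset limit L. Costs and the reset semantics are assumed.
import numpy as np

rng = np.random.default_rng(3)
theta, offset, L = 0.6, 2.0, 5.0        # IMA parameter, starting offset, reset limit
c_off, c_adj, c_reset = 1.0, 0.2, 10.0  # off-target / adjustment / reset costs (assumed)

T, e_prev, noise, u, total = 5000, 0.0, offset, 0.0, 0.0
for t in range(T):
    e = rng.normal()
    noise += e - theta * e_prev          # IMA(1,1): z_t = z_{t-1} + e_t - theta*e_{t-1}
    e_prev = e
    dev = noise + u                      # observed deviation from target
    total += c_off * dev**2
    if abs(u) > L:                       # reset rule: re-center the process
        noise, u = 0.0, 0.0
        total += c_reset
    else:                                # integral controller (MMSE for IMA(1,1))
        step = -(1 - theta) * dev
        u += step
        total += c_adj * abs(step)

print("average cost per period: %.3f" % (total / T))
```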

11.
A method of estimating a variety of curves by a sequence of piecewise polynomials is proposed, motivated by a Bayesian model and an appropriate summary of the resulting posterior distribution. A joint distribution is set up over both the number and the position of the knots defining the piecewise polynomials. Throughout we use reversible jump Markov chain Monte Carlo methods to compute the posteriors. The methodology has been successful in giving good estimates for 'smooth' functions (i.e. continuous and differentiable) as well as functions which are not differentiable, and perhaps not even continuous, at a finite number of points. The methodology is extended to deal with generalized additive models.

12.
This paper proposes a methodology to model the mobility of characters in Massively Multiplayer Online (MMO) games. The mobility of characters on the map of an MMO game is modeled as a jump process, with two approaches, parametric and non-parametric, for modeling the times spent in the states of the process. Furthermore, a simulator for the mobility is presented. We analyze geographic position data of characters on the map of the game World of Warcraft and compare the observed and simulated data. The proposed methodology and the simulator can be used to optimize the allocation of computing load across servers, which is extremely important for game performance, service quality, and cost.
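A bare-bones version of the parametric variant is sketched below: the zones, sojourn rates, and jump matrix are invented, whereas in the paper they would be estimated from World of Warcraft position logs.

```python
# Character mobility as a continuous-time jump process over map zones, with
# exponential (parametric) sojourn times and a row-stochastic jump matrix.
# Zones, rates, and transition probabilities are illustrative only.
import numpy as np

rng = np.random.default_rng(4)
zones = ["city", "forest", "dungeon", "battleground"]
rate = np.array([0.5, 0.2, 0.1, 0.3])          # sojourn rates per zone (1 / mean time)
P = np.array([[0.0, 0.5, 0.2, 0.3],            # jump probabilities between zones
              [0.6, 0.0, 0.3, 0.1],
              [0.5, 0.4, 0.0, 0.1],
              [0.7, 0.2, 0.1, 0.0]])

def simulate(t_end=100.0, start=0):
    t, s, path = 0.0, start, []
    while t < t_end:
        dwell = rng.exponential(1.0 / rate[s])  # exponential sojourn in current zone
        path.append((zones[s], round(t, 2), round(min(t + dwell, t_end), 2)))
        t += dwell
        s = rng.choice(4, p=P[s])               # jump to the next zone
    return path

for zone, t0, t1 in simulate(30.0):
    print(f"{zone:12s} [{t0:6.2f}, {t1:6.2f})")
```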

13.
The use of bivariate distributions plays a fundamental role in survival and reliability studies. In this paper, we consider a location-scale model for bivariate survival times, with a copula modeling the dependence of the bivariate survival data. For the proposed model, we consider inferential procedures based on maximum likelihood. Gains in efficiency from bivariate models are also examined in the censored-data setting. For different parameter settings, sample sizes, and censoring percentages, various simulation studies are performed, and the performance is compared to that of the bivariate regression model for matched paired survival data. Sensitivity analysis methods such as local and total influence are presented and derived under three perturbation schemes. The marginal martingale and marginal deviance residual measures are used to check the adequacy of the model. Furthermore, we propose a new measure, which we call the modified deviance component residual. The methodology is illustrated on a lifetime dataset for kidney patients.
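As a sketch of the dependence ingredient only, the following simulates bivariate survival times from a Clayton copula (an assumed choice; the paper does not necessarily use this family) with Weibull margins; no censoring or covariates are included.

```python
# Bivariate survival times via a Clayton copula with Weibull margins.
# The dependence parameter and marginal parameters are illustrative.
import numpy as np

rng = np.random.default_rng(5)
alpha = 2.0                                  # Clayton dependence parameter (> 0)

def clayton_pair(n):
    # conditional-inversion sampler for the Clayton copula
    u = rng.uniform(size=n)
    w = rng.uniform(size=n)
    v = ((w ** (-alpha / (1 + alpha)) - 1) * u ** (-alpha) + 1) ** (-1 / alpha)
    return u, v

u, v = clayton_pair(10_000)
t1 = 10.0 * (-np.log(1 - u)) ** (1 / 1.5)    # Weibull(shape=1.5, scale=10) margin
t2 = 8.0 * (-np.log(1 - v)) ** (1 / 2.0)     # Weibull(shape=2.0, scale=8) margin

ranks = lambda x: np.argsort(np.argsort(x))
print("Kendall's tau implied by alpha:", alpha / (alpha + 2))
print("empirical rank correlation:", round(np.corrcoef(ranks(t1), ranks(t2))[0, 1], 3))
```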

14.
In an acceptance-sampling plan where the items of an incoming batch of products are inspected one by one, the batch is rejected if the number of conforming items between successive nonconforming items falls below a lower control threshold, accepted if it rises above an upper control threshold, and inspection continues if it lies within the thresholds. The purpose of this article is to develop an optimization model to determine the optimum values of the thresholds such that constraints on the probabilities of Type I and Type II errors are satisfied. The article starts by developing a Markovian model to derive the expected total cost of the inspection problem, comprising the costs of acceptance, rejection, and inspection. The optimum values of the thresholds are then selected to minimize the expected cost. To demonstrate the application of the proposed methodology, perform sensitivity analysis, and compare the performance of the proposed procedure to that of another method, a numerical example is given at the end and the results are reported.
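The hedged Monte Carlo sketch below simulates the decision rule directly (a simulation cross-check one might run against the article's Markov-chain derivation); the thresholds L_low and L_up and the batch-size cap are invented.

```python
# Monte Carlo evaluation of the gap-threshold rule: items are inspected one
# by one, and the run of conforming items between successive nonconforming
# ones drives the accept / reject / continue decision. Thresholds are assumed.
import numpy as np

rng = np.random.default_rng(6)
L_low, L_up = 3, 25                  # reject if gap < L_low, accept if gap > L_up

def inspect_batch(p, max_items=10_000):
    """Return (accepted?, items inspected) for defect probability p."""
    gap = 0
    for n in range(1, max_items + 1):
        if gap > L_up:               # long conforming run: accept the batch
            return True, n
        if rng.random() < p:         # nonconforming item closes the current gap
            if gap < L_low:
                return False, n      # short gap: reject the batch
            gap = 0
        else:
            gap += 1
    return True, max_items           # cap reached: treat as accepted (sketch only)

for p in (0.01, 0.05, 0.10, 0.20):
    runs = [inspect_batch(p) for _ in range(4000)]
    acc = np.mean([a for a, _ in runs])
    asn = np.mean([n for _, n in runs])
    print(f"p={p:.2f}  P(accept)={acc:.3f}  avg inspected={asn:7.1f}")
```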

15.
We model the effect of a road safety measure on a set of target sites, each with its own control area, and we suppose that the accident data recorded at each site are classified into mutually exclusive types. We adopt the before–after technique and assume that, at any one target site, the total number of accidents recorded is multinomially distributed between the periods and types of accidents. In this article, we propose a minorization–maximization (MM) algorithm for obtaining the constrained maximum likelihood estimates of the parameter vector, and we compare it with a gradient projection–expectation maximization (GP-EM) algorithm based on gradient projections. The performance of the algorithms is examined through a simulation study of road safety data.

16.
This article proposes a fuzzy multiple-allocation p-hub median problem in which transportation costs are defined as fuzzy variables, and the objective is to minimize the total transportation cost at a given credibility level. For trapezoidal and normal transportation costs, the problem is equivalent to a deterministic mixed-integer linear programming problem. In an empirical analysis, panel data on the coal industry of Liaoning Province are used to determine the number and locations of coal transportation hubs at different credibility levels, and the resulting models are solved by conventional optimization methods such as branch and bound. The computations show that the model and solution method can be applied to the hub location problem for coal transportation in Liaoning Province.
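After the credibility-level reduction, the core combinatorial problem is a crisp multiple-allocation p-hub median model. The sketch below solves a tiny invented 5-node instance by brute force (standing in for branch and bound); the cost matrix, demands, and discount factor are all made up.

```python
# Tiny deterministic multiple-allocation p-hub median instance solved by
# enumeration: each origin-destination pair routes through its cheapest hub
# pair, with a discounted inter-hub leg. All data are illustrative.
import itertools
import numpy as np

c = np.array([[0, 4, 6, 8, 5],      # symmetric unit flow costs between 5 nodes
              [4, 0, 3, 7, 6],
              [6, 3, 0, 2, 4],
              [8, 7, 2, 0, 3],
              [5, 6, 4, 3, 0]], float)
w = np.ones((5, 5))                  # origin-destination demand (uniform, assumed)
alpha, p = 0.6, 2                    # inter-hub discount factor, number of hubs

def total_cost(hubs):
    cost = 0.0
    for i in range(5):
        for j in range(5):
            # multiple allocation: each O-D pair picks its cheapest hub pair
            cost += w[i, j] * min(c[i, k] + alpha * c[k, m] + c[m, j]
                                  for k in hubs for m in hubs)
    return cost

best = min(itertools.combinations(range(5), p), key=total_cost)
print("optimal hubs:", best, "total cost:", round(total_cost(best), 2))
```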

17.
Identifying cost-effective decisions that take into account both medical cost and health outcome is an important issue when resources are very limited. Analyzing medical costs is challenging owing to the skewness of cost distributions, heterogeneity across samples, and censoring. When censoring is due to administrative reasons, the total cost may be related to the survival time, since longer survival times are more likely to be censored and the corresponding total costs are censored as well. This paper models repeated medical cost data with a general linear model for longitudinal data, and a weighted estimating equation is used to obtain more accurate parameter estimates. Furthermore, the asymptotic properties of the proposed model are discussed. Simulations are used to evaluate the performance of the estimators under various scenarios. Finally, the proposed model is applied to data extracted from the National Health Insurance database for patients with colorectal cancer.

18.
Confirmatory bioassay experiments take place in late stages of the drug discovery process when a small number of compounds have to be compared with respect to their properties. As the cost of the observations may differ considerably, the design problem is well specified by the cost of compound used rather than by the number of observations. We show that cost-efficient designs can be constructed using useful properties of the minimum support designs. These designs are particularly suited for studies where the parameters of the model to be estimated are known with high accuracy prior to the experiment, although they prove to be robust against typical inaccuracies of these values. When the parameters of the model can only be specified with ranges of values or by a probability distribution, we use a Bayesian criterion of optimality to construct the required designs. Typically, the number of their support points depends on the prior knowledge for the model parameters. In all cases we recommend identifying a set of designs with good statistical properties but different potential costs to choose from.

19.
Economic design of continuous sampling plan under linear inspection cost (total citations: 2; self-citations: 0; other citations: 2)
The article explores the problem of an economically based type I continuous sampling plan (CSP-1 plan) under linear inspection cost. By assuming that the per-unit inspection cost is linearly proportional to the average number of inspections per inspection cycle, and by solving a modified version of Cassady et al.'s model, we not only attain the required level of product quality but also obtain the minimum total expected cost per unit produced.
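A small sketch of the standard CSP-1 quantities (Dodge's u, v, AFI, AOQ) combined with an illustrative linear per-unit inspection cost; the coefficients a, b and the damage cost c_d are assumptions, not the article's values.

```python
# Standard CSP-1 summary quantities plus an illustrative per-unit cost in
# which the inspection cost grows linearly with inspections per cycle.
def csp1_summary(p, i, f, a=0.05, b=1e-4, c_d=5.0):
    q = 1.0 - p
    u = (1 - q**i) / (p * q**i)      # expected items inspected per screening phase
    v = 1.0 / (f * p)                # expected items produced per sampling phase
    afi = (u + f * v) / (u + v)      # average fraction inspected
    aoq = p * (1 - f) * v / (u + v)  # average outgoing quality (defectives replaced)
    unit_inspection_cost = a + b * (u + f * v)   # linear in inspections per cycle
    cost_per_unit = unit_inspection_cost * afi + c_d * aoq
    return afi, aoq, cost_per_unit

for i in (20, 50, 100):              # clearance number i; f is the sampling fraction
    afi, aoq, cost = csp1_summary(p=0.02, i=i, f=0.1)
    print(f"i={i:3d}  AFI={afi:.3f}  AOQ={aoq:.4f}  cost/unit={cost:.4f}")
```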

20.
The problem of estimating the mean frequency of events in the presence of censoring is important in assessing the efficacy, safety, and cost of therapies. The mean frequency is typically estimated by dividing the total number of events by the total number of patients under study. This method, referred to in this paper as the 'naïve estimator', ignores the censoring. Other approaches available for this problem require assumptions that are rarely acceptable, such as independence, a constant hazard rate over time, and similar distributional assumptions. In this paper, a simple non-parametric estimator based on the sum of the products of Kaplan–Meier estimators is proposed as an estimator of the mean frequency, and its approximate variance and standard error are derived. An illustration is provided to show the derivation of the proposed estimator. Although the clinical trial setting is used in this paper, the problem has applications in other areas where survival analysis is used and recurrent events are studied.
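To make the censoring bias concrete, the hedged simulation below contrasts the naïve estimator with a standard nonparametric mean-cumulative-function (MCF) estimate that accounts for the shrinking risk set; note the MCF is a related standard estimator, not the paper's sum-of-products-of-Kaplan–Meier proposal, and all simulation settings are arbitrary.

```python
# Naive estimator (total events / total patients) vs. a mean cumulative
# function estimate for recurrent events under administrative censoring.
import numpy as np

rng = np.random.default_rng(7)
n, t_end = 500, 5.0
cens = np.minimum(rng.exponential(4.0, n), t_end)   # administrative censoring times
events = [np.cumsum(rng.exponential(1.0 / 0.8, 60)) for _ in range(n)]  # rate 0.8/yr
obs = [e[e < c] for e, c in zip(events, cens)]      # events seen before censoring

naive = sum(len(e) for e in obs) / n                # ignores follow-up length

times = np.sort(np.concatenate(obs))
at_risk = np.array([(cens >= t).sum() for t in times])
mcf = (1.0 / at_risk).sum()                         # MCF at the last observed event
print(f"event rate * horizon = {0.8 * t_end:.2f}")
print(f"naive: {naive:.2f}   MCF-based: {mcf:.2f}")
```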
