Similar Literature
20 similar documents found; search took 78 ms
1.
An Economic Design Strategy for Parameters and Tolerances Considering Noise Factors
Economic parameter and tolerance design based on concurrent engineering determines the optimal design parameters and tolerance ranges with cost as the optimization objective. While concurrent parameter and tolerance design achieves robustness through dual response surface models of the mean and variance, this paper proposes a response-surface modeling strategy that incorporates noise factors and applies this model to optimize parameters and tolerances simultaneously. The resulting economic design model takes the sum of quality loss and manufacturing cost as its objective function, with a confidence region on the process variance as the constraint. A worked example shows that, within the dual response surface approach, allowing a bias in the mean has a clear cost advantage over a rigid constraint on it: introducing bias lowers cost without reducing robustness. The noise-factor-oriented design strategy costs less than the biased dual-response design, indicating that a design which accounts for noise factors is the most effective.
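The cost tradeoff in this abstract can be sketched in a few lines. This is a hypothetical illustration, not the paper's model: it assumes a quadratic Taguchi quality loss, a manufacturing cost that falls as the tolerance widens, and a process standard deviation tied to the tolerance at three sigma; every parameter value here is invented.

```python
def expected_quality_loss(k, mu, target, sigma):
    """Expected Taguchi loss E[k*(Y - T)^2] = k*((mu - T)^2 + sigma^2)."""
    return k * ((mu - target) ** 2 + sigma ** 2)

def total_cost(tol, k=4.0, mu=10.0, target=10.0, a=1.0, b=2.0):
    # Assumption: the process is held within +/- tol at three sigma, and
    # manufacturing cost a + b/tol rises as the tolerance tightens.
    sigma = tol / 3.0
    return expected_quality_loss(k, mu, target, sigma) + a + b / tol

# Grid search for the cost-minimizing tolerance over 0.01..3.00.
tols = [0.01 * i for i in range(1, 301)]
best_tol = min(tols, key=total_cost)
```

With an on-target mean, the analytic optimum of k*tol^2/9 + b/tol is tol = (9b/(2k))^(1/3), which the grid search recovers to within its step size.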

2.
The pursuit of better performance has led to a number of business-academe collaborations. These collaborators have developed a number of sophisticated approaches that go far beyond such traditional simple methods as benchmarking against the best company, Ishikawa diagrams on feedback and control, Pareto diagrams, incentive systems based solely on output or quality, standard process control charts, and separate treatment of control charts and product inspection. The authors in this special issue report on approaches like benchmarking industrial performance through industry studies; the use of an artificial-intelligence statistical-tree growing method to analyze complex customer service data; an incentive system based on the total quality management (TQM) concepts of continuous improvement, teamwork, adaptation to change, and a focus on customer satisfaction; and integration of product inspection and process control. Because of the continuing widespread interest in TQM, there is an opportunity to take stock of how successful TQM initiatives have been and how we should consolidate and further extend the knowledge in TQM. Two of the papers report on the gap between what organizations espouse as TQM and what they actually implement and on the literature on TQM.

3.
Analysis of the Capital Asset Pricing Model under Variance Persistence
Autoregressive conditional heteroskedasticity (ARCH) models relax the homoskedasticity assumption of traditional econometric analysis and have had a profound influence on modern capital asset pricing theory. As research on time-varying variance has deepened, variance persistence has attracted growing attention. This paper first introduces conditional means and conditional variances and, building on autoregressive conditional heteroskedasticity, the concepts and properties of variance persistence. It then applies these ideas to the capital asset pricing model, discussing the effect of conditional variance persistence on the model, and further examines how the persistence of vector GARCH models affects portfolio investment in the multi-asset setting.
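Variance persistence in the GARCH family has a compact form worth sketching. In a GARCH(1,1) model, sigma2_t = omega + alpha*eps_{t-1}^2 + beta*sigma2_{t-1}, persistence is alpha + beta: forecasts of future variance decay toward the unconditional variance at that geometric rate, and when alpha + beta reaches 1 (IGARCH) shocks to variance never die out. The parameter values below are illustrative, not from the paper.

```python
def garch_persistence(alpha, beta):
    # Persistence of a GARCH(1,1): shocks to variance decay at rate alpha + beta.
    return alpha + beta

def unconditional_variance(omega, alpha, beta):
    p = garch_persistence(alpha, beta)
    if p >= 1.0:
        # IGARCH case: variance shocks are permanent, no finite long-run variance.
        return float("inf")
    return omega / (1.0 - p)

def variance_forecast(h, sigma2_now, omega, alpha, beta):
    # h-step-ahead conditional variance: geometric decay toward the long-run level.
    vbar = unconditional_variance(omega, alpha, beta)
    return vbar + garch_persistence(alpha, beta) ** h * (sigma2_now - vbar)
```

The closer alpha + beta is to 1, the longer a volatility shock affects asset-pricing quantities such as conditional betas, which is why persistence matters for the CAPM discussion above.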

4.
In this paper, we study the problem of selecting the optimum production batch size in multistage manufacturing facilities with scrap and determining the optimal amount of investment. We analyse the effect of investment for quality improvement on the reduction of the proportion of defectives, and the effect of this reduction on processing cost, setup cost, holding cost, and profit loss. The quality characteristic of the product manufactured is assumed to be normally distributed with a mean equal to the target value. The purpose of the investment is to reduce the variance of the quality characteristic and hence the proportion of defectives. The model assumes known demand, which must be satisfied completely, scrappage at each stage and profit loss due to scrap. Using this model, the optimal values of the production quantity and the proportion of defectives for minimizing the total cost are obtained. The optimal investment is then obtained using the relationship between the investment and the proportion of defectives.
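The link between investment, variance, and the proportion of defectives can be illustrated with a toy model. This is a hypothetical sketch, not the paper's formulation: it assumes the standard deviation shrinks exponentially with investment, a symmetric two-sided tolerance around the on-target mean, and invented cost figures.

```python
import math

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def proportion_defective(sigma, tol):
    # Normal characteristic centered on target: defectives fall outside +/- tol.
    return 2.0 * (1.0 - phi(tol / sigma))

def total_cost(invest, sigma0=2.0, b=0.5, tol=3.0, defect_cost=10000.0):
    # Assumption: sigma(I) = sigma0 * exp(-b*I); cost = investment + defect losses.
    sigma = sigma0 * math.exp(-b * invest)
    return invest + defect_cost * proportion_defective(sigma, tol)

# Grid search for the investment level minimizing total cost.
best_invest = min((0.1 * i for i in range(0, 101)), key=total_cost)
```

Because the defect proportion falls steeply as sigma shrinks, a strictly positive investment level minimizes total cost in this toy setting, mirroring the tradeoff the paper optimizes.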

5.
ABSTRACT

The purpose of this scoping review was to examine the literature on team resilience to gain insight into current thinking regarding its definition and conceptualisation, and to identify how researchers have operationalised and measured this concept. We conducted a systematic scoping review using the 5-phase approach proposed by Arksey and O’Malley. A total of seven databases were searched, followed by a citation search of eligible papers via Google Scholar. Of the 275 articles identified via the search process, 27 papers were deemed eligible for review. Several key findings regarding the literature on team resilience were observed: (i) definitions varied in terms of content (e.g. input or process), breadth (e.g. unidimensional versus multidimensional), and quality (e.g. essential and necessary attributes of key components); (ii) there was a predominance of single-level conceptualisations of team resilience; and (iii) there has been a reliance on cross-sectional research designs in empirical studies, which is incongruent with the dynamic nature of this concept. Key recommendations from this scoping review focus on definitional, theoretical, and methodological issues.

6.
Two recent papers on risk perception models are discussed. In these papers, quantitative analyses are presented of risk perception in relation to risk characteristics as specified in the Psychometric Model, and to cultural biases according to Cultural Theory. This comment points out that the data quality of these two studies is doubtful, with a very small convenience sample and a very low response rate. More importantly, the analyses show the same low levels of explained variance of risk perception as other researchers have found previously, but the authors still draw optimistic conclusions from their data. Such conclusions are unjustified.

7.
We analyse a three echelon supply chain model. First-order autoregressive end consumer demand is assumed. We obtain exact analytical expressions for bullwhip and net inventory variance at each echelon in the supply chain. All of the three supply chain participants employ the order-up-to policy with the minimum mean square error forecasting scheme. After demonstrating that the character of the stochastic ordering process observed at each level of the supply chain is mathematically tractable, we show that the upstream participants have complete information of the market demand process. Then we quantify the bullwhip produced by the system, together with the amplification ratios of the variance of the net inventory levels. Our analysis reveals that the level of the supply chain has no impact upon the bullwhip effect, rather bullwhip is determined by the accumulated lead-time from the customer and the local replenishment lead-time. We also find that the conditional variance of the forecast error over the lead-time is identical to the variance of the net inventory levels and that the net inventory variance is dominated by the local replenishment lead-time.
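The mechanism behind bullwhip at a single echelon can be reproduced by simulation. This sketch (illustrative, not the paper's exact analytical derivation) simulates AR(1) demand and an order-up-to policy whose level is set by the MMSE (conditional-mean) forecast of demand over the lead time; bullwhip is Var(orders) / Var(demand). All numbers are invented.

```python
import random

def simulate_bullwhip(rho=0.7, lead_time=3, n=40000, seed=1):
    rng = random.Random(seed)
    mu = 20.0
    d_prev = mu
    # MMSE forecast coefficient for total AR(1) demand over the lead time.
    coef = sum(rho ** i for i in range(1, lead_time + 1))
    s_prev = lead_time * mu               # initial order-up-to level
    demands, orders = [], []
    for _ in range(n):
        d = mu + rho * (d_prev - mu) + rng.gauss(0.0, 1.0)
        s = lead_time * mu + coef * (d - mu)   # order-up-to level (safety stock omitted)
        orders.append(d + s - s_prev)          # replenish demand plus the level change
        demands.append(d)
        d_prev, s_prev = d, s
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    return var(orders) / var(demands)
```

For positively correlated demand (rho > 0) the forecast update amplifies order variance, so the ratio exceeds 1; for i.i.d. demand (rho = 0) the order-up-to level is constant and the ratio is exactly 1, consistent with bullwhip depending on lead time rather than echelon position.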

8.
The future of the global industry lies in the continuous improvement of both products and processes, a renewed commitment to competition, and an aggressive approach to satisfying customers' needs in quality, quantity, and timing. In quality management, the degree of customer satisfaction for a given product may be measured in the form of the loss to society. This loss is formulated as a function of the deviation from the target for each of the product's quality characteristics. The greater the variability of uncontrolled factors during manufacturing or production the larger will be that loss. In this paper, we develop a form of the loss function that takes into account the variability of a production process, the decision loss, and the costs of sampling and inspection. Specifically, we consider monitoring a production process, which may undergo continuous mean shift and variance deterioration during a production run. We then examine decision rules for continuing production or stopping and adjusting the production process.
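A minimal stopping rule of the kind this abstract describes can be sketched as follows. This is a hypothetical illustration, not the paper's decision rule: it assumes a quadratic loss k*((mu_t - T)^2 + sigma_t^2), a linear mean drift, linear variance deterioration, and a known amortized adjustment cost; all values are invented.

```python
def first_adjustment_period(k, target, mu0, drift, sigma0, var_growth,
                            adjust_cost_per_period, horizon=1000):
    """Return the first period at which the per-period expected quality loss
    exceeds the amortized cost of stopping and adjusting, or None if never."""
    for t in range(horizon):
        mu_t = mu0 + drift * t                 # drifting process mean
        var_t = sigma0 ** 2 + var_growth * t   # deteriorating process variance
        loss_t = k * ((mu_t - target) ** 2 + var_t)
        if loss_t > adjust_cost_per_period:
            return t
    return None
```

Under the invented parameters below, drift and variance growth push the per-period loss past the adjustment threshold after eight periods; with no drift or deterioration, production continues indefinitely.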

9.
Existing research on process quality improvement focuses largely on the linkages between quality improvement cost and production economics such as set-up cost and defect rate reduction. This paper deals with the optimal design problem for process improvement by balancing the sunk investment cost and the revenue increments due to the process improvement. We develop an optimization model based on Taguchi cost functions. The model is validated through a real case study in the automotive industry where the 6-sigma DMAIC methodology has been applied. According to this research, management can adjust the investment in prevention and appraisal costs for quality improvement so as to enhance process capability, reduce the product defect rate and, as a result, generate a remarkable financial return.

10.
A growing recognition that quality management is an important factor in defining a firm's competitive position has led to renewed attention to this function and has resulted in implementation of elaborate systems for on-line quality control comprising product inspection and process control. Traditionally, these functions have been treated independently, with very little interaction. In this paper we examine, in detail, a scheme that integrates the two functions, and we demonstrate that such an approach can result in significant cost savings. The motivation for this work comes from our experience in a wafer fabrication facility that suggested that exchange of quality information between different stages of production could result in significant performance improvements. To illustrate this approach, we consider a specific environment characterized by a single-stage continuous production process whose status is monitored by an X̄ control chart. We assume that quality-related costs may be described as a function of the process output. This is analogous to Taguchi's quality loss function and may be interpreted as a generalization of conventional classification of process output as either acceptable or defective units. The integrative scheme essentially relies on utilization of the process status information (based on process control) in making product inspection decisions. For this system we derive a cost model and develop a solution procedure to determine optimal decision parameters. Limited computational results indicate that the scheme has significant potential for reducing quality-related costs.
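The X̄ chart monitoring step referred to above follows a standard construction. This sketch shows the textbook Shewhart limits mu +/- k*sigma/sqrt(n) for subgroup means, with illustrative values rather than the paper's case data.

```python
import math

def xbar_limits(mu, sigma, n, k=3.0):
    """Lower and upper control limits for subgroup means of size n."""
    half_width = k * sigma / math.sqrt(n)
    return mu - half_width, mu + half_width

def out_of_control(subgroup_means, mu, sigma, n):
    """Flag subgroup means falling outside the control limits."""
    lcl, ucl = xbar_limits(mu, sigma, n)
    return [m for m in subgroup_means if m < lcl or m > ucl]
```

In an integrated scheme like the paper's, a flagged subgroup would also inform the downstream inspection decision rather than triggering process investigation alone.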

11.
Due to the ever-growing amount of data, computer-aided methods and systems to detect weak signals and trends for corporate foresight are in increasing demand. To this day, many papers on this topic have been published. However, research so far has only dealt with specific aspects, but it has failed to provide a comprehensive overview of the research domain. In this paper, we conduct a systematic literature review to organize existing insights and knowledge. The 91 relevant papers, published between 1997 and 2017, are analyzed for their distribution over time and research outlets. Classifying them by their distinct properties, we study the data sources exploited and the data mining techniques applied. We also consider eight different purposes of analysis, namely weak signals and trends concerning political, economic, social and technological factors. The results of our systematic review show that the research domain has indeed been attracting growing attention over time. Furthermore, we observe a great variety of data mining and visualization techniques, and present insights on the efficacy and effectiveness of the data mining techniques applied. Our results reveal that a stronger emphasis on search strategies, data quality and automation is required to greatly reduce the human actor bias in the early stages of the corporate foresight process, thus supporting human experts more effectively in later stages such as strategic decision making and implementation. Moreover, systems for detecting weak signals and trends need to be able to learn and accumulate knowledge over time, attaining a holistic view on weak signals and trends, and incorporating multiple source types to provide a solid foundation for strategic decision making. The findings presented in this paper point to future research opportunities, and they can help practitioners decide which sources to exploit and which data mining techniques to apply when trying to detect weak signals and trends.

12.
Employee-based errors result in quality defects that can often impact customer satisfaction. This study examined the effects of a process change and feedback system intervention on error rates of 3 teams of retail furniture distribution warehouse workers. Archival records of error codes were analyzed and aggregated as the measure of quality. The intervention consisted of a process change where teams of 5 employees who had previously been assigned a specific role within the process were cross-trained to know and help with other team members' functions. Additionally, these teams were given performance feedback on an immediate, daily, and weekly basis. Team A reduced mean errors from 7.47 errors per week during baseline to 3.53 errors per week during the intervention phase. Team B experienced a reduction in mean number of weekly errors from a baseline of 11.39 errors per week to 3.82 errors per week during the intervention phase. Team C did not experience significant error rate reduction.

13.
Discount Rates in Risk Versus Money and Money Versus Money Tradeoffs
We use data from a survey of residents of five Italian cities conducted in late spring 2004 to estimate the discount rates implicit in (1) money versus future risk reductions and (2) money versus money tradeoffs. We find that the mean personal discount rate is 0.3-1.7% in (1) and 8.7% in (2). The latter is lower than the discount rates estimated in comparable situations in many recent studies, greater than market interest rates in Italy at the time, and exhibits modest variation with age and gender. The discount rate implicit in money versus risk tradeoffs is within the range of estimates from studies in the United States and Europe, and does not depend on observable individual characteristics. We use split samples to investigate whether a completely abstract risk reduction (one where the risk reduction delivery has been stripped of all specifics, so that respondents should focus on the risks without being distracted by details) results in WTP and discount figures comparable to those from an identified delivery mechanism (a medical test). We find that while WTP for an immediate risk reduction is 42-73% higher with the abstract risk reduction, the discount rate in the money versus risk tradeoffs and the variance of the error term in the WTP equation are the same across the two variants of the questionnaire.
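The notion of an "implicit discount rate" in money versus money tradeoffs can be made concrete. This sketch (illustrative, not the survey's estimation procedure) assumes exponential discounting, so the rate r solves WTP_future = WTP_now * exp(-r*T); the numbers below are invented.

```python
import math

def implicit_discount_rate(wtp_now, wtp_future, years):
    """Continuously compounded rate equating present and delayed valuations."""
    return math.log(wtp_now / wtp_future) / years
```

For example, valuing a delayed payment at 64.73 when the immediate equivalent is 100 over five years implies a rate near 8.7%, the money-versus-money figure reported in the abstract.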

14.
Run-length distributions for various statistical process-control charts and techniques for computing them recently have been reported in the literature. The real advantages of knowing the run-length distribution for a process-control chart versus knowing only the associated average-run length of the chart have not been exploited. Our purpose is to use knowledge of the run-length distribution as an aid in deciding if an out-of-control signal is a true signal or merely a false alarm. The ability to distinguish between true and false signals is important, especially in operations where it is costly to investigate the causes of out-of-control conditions. Knowledge of the run-length distribution allows us to compute likelihood ratios, which are simple to calculate and to interpret and which are used to determine the odds of obtaining an out-of-control signal at a particular run length when a shift in the process mean actually has occurred vis-a-vis no such shift. We extend our analysis in a Bayesian sense by incorporating prior information on the distribution of the shift size of the process mean, combined with the likelihood ratio obtained from the run-length distribution, to determine if a shift larger than a critical size has occurred. We give examples for the Shewhart chart, the exponentially weighted moving-average chart, and the special-cause control chart for processes with autocorrelated observations. The examples show that the current recommended usage of the average-run length alone as a guide for determining whether a signal is a false alarm or otherwise can be misleading. We also show that the performance of the traditional charts, in terms of their average-run length, can be enhanced in many instances by using the likelihood-ratio procedure.
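For the Shewhart chart case the likelihood ratio has a closed form worth sketching: the run length is geometric with per-sample signal probability p, so the ratio of observing a signal at run length t under a shifted versus in-control process is [p1*(1-p1)^(t-1)] / [p0*(1-p0)^(t-1)]. This is a minimal sketch of that idea with standard 3-sigma limits, not the paper's full Bayesian procedure.

```python
import math

def signal_prob(shift_sigmas, k=3.0):
    """P(subgroup mean falls outside +/- k limits) given a mean shift
    of `shift_sigmas` standard errors."""
    def phi(x):  # standard normal CDF
        return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return (1.0 - phi(k - shift_sigmas)) + phi(-k - shift_sigmas)

def run_length_likelihood_ratio(t, shift_sigmas, k=3.0):
    """Odds of a signal at run length t under the shifted vs in-control process."""
    p0 = signal_prob(0.0, k)
    p1 = signal_prob(shift_sigmas, k)
    return (p1 * (1 - p1) ** (t - 1)) / (p0 * (1 - p0) ** (t - 1))
```

An early signal strongly favors a real shift (ratio well above 1), while a signal arriving only after a very long run favors a false alarm (ratio below 1), which is exactly the distinction the average run length alone cannot make.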

15.
In recent years, manufacturing firms have realized that a new, higher level of global competition causes them to compete simultaneously on multiple manufacturing goals, such as quality, delivery, cost, and flexibility. In response to this realization, considerable research now focuses on the relationship of manufacturing improvement programs to manufacturing goals. However, to date, this research has not investigated the specific underlying statistical relationships between manufacturing goals and the shop floor. This study investigates manufacturing lead time linkages with manufacturing programs and manufacturing goals. The basic purpose of this study is to understand and explain how programs affect the elements of manufacturing lead time and how manufacturing lead time affects manufacturing goal capabilities. By understanding these linkages, managers can logically trace the effects of specific programs to their eventual effects on manufacturing goal capabilities. This study's most important finding is that statistical variations in the elements of lead time cause a tendency for certain manufacturing goals to be more difficult to control and achieve than others because of canonical relationships of lead time variances. To control these lead time variances, successful firms concentrate their early program targets first on achieving “fitness for use” quality, followed by delivery reliability, short delivery lead time and cost, current product flexibility, and lastly, new product flexibility. This study mathematically illustrates which improvement programs most affect manufacturing goals through their relationship to manufacturing lead time variance reduction. It suggests that firms improve goal performance by initially targeting improvement through setup time reduction programs, defect reduction programs, and preventive maintenance programs, to facilitate quality improvements. By targeting specific programs and their related lead time variances, firms improve their manufacturing facility competitiveness with minimum obstacles.

16.
Traditional process capability analysis assumes a single source of process variation. Based on variance-component estimates of the total variance and of the individual variance components in nested problems with multiple sources of variation, this paper gives a method for estimating the process capability index and derives a confidence interval for Cp. A case application shows that, because the multiple-variance-source analysis (MVA) method fully analyzes and estimates the sources of variation, it can evaluate the process scientifically and reasonably.
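The variance-component step behind such an analysis can be sketched with the textbook one-way nested (balanced) case. This is an illustrative method-of-moments sketch, not the paper's full derivation or its Cp confidence interval; the data are invented.

```python
def variance_components(groups):
    """ANOVA method-of-moments estimates for balanced one-way nested data.
    groups: list of equal-size lists (e.g. parts measured within each batch).
    Returns (between-group variance, within-group variance)."""
    a = len(groups)            # number of groups (e.g. batches)
    n = len(groups[0])         # observations per group
    grand = sum(sum(g) for g in groups) / (a * n)
    means = [sum(g) / n for g in groups]
    ms_between = n * sum((m - grand) ** 2 for m in means) / (a - 1)
    ms_within = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g) / (a * (n - 1))
    var_between = max((ms_between - ms_within) / n, 0.0)  # truncate at zero
    return var_between, ms_within

def cp_multi_source(groups, lsl, usl):
    """Cp using the total sigma pooled across variance sources."""
    vb, vw = variance_components(groups)
    sigma_total = (vb + vw) ** 0.5
    return (usl - lsl) / (6.0 * sigma_total)
```

Using the pooled sigma rather than the within-group sigma alone is what prevents the single-source assumption from overstating process capability.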

17.
This article presents a new importance analysis framework, called parametric moment ratio function, for measuring the reduction of model output uncertainty when the distribution parameters of inputs are changed, and the emphasis is put on the mean and variance ratio functions with respect to the variances of model inputs. The proposed concepts efficiently guide the analyst to achieve a targeted reduction on the model output mean and variance by operating on the variances of model inputs. The unbiased and progressive unbiased Monte Carlo estimators are also derived for the parametric mean and variance ratio functions, respectively. Only a set of samples is needed for implementing the proposed importance analysis by the proposed estimators, thus the computational cost is free of input dimensionality. An analytical test example with highly nonlinear behavior is introduced to illustrate the engineering significance of the proposed importance analysis technique and to verify the efficiency and convergence of the derived Monte Carlo estimators. Finally, the moment ratio function is applied to a planar 10‐bar structure for achieving a targeted 50% reduction of the model output variance.
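The variance ratio function itself is easy to illustrate by brute force. This sketch uses a naive two-run Monte Carlo, not the paper's single-sample-set estimator, on an invented model Y = X1 + X2^2 with Var(X1) scaled by a factor lambda; R(lambda) = Var(Y | Var(X1) scaled) / Var(Y).

```python
import random

def output_variance(scale1, n=50000, seed=7):
    """Monte Carlo Var(Y) for Y = X1 + X2**2, with Var(X1) = 4 * scale1."""
    rng = random.Random(seed)
    ys = []
    for _ in range(n):
        x1 = rng.gauss(0.0, (scale1 ** 0.5) * 2.0)
        x2 = rng.gauss(0.0, 1.0)
        ys.append(x1 + x2 ** 2)
    m = sum(ys) / n
    return sum((y - m) ** 2 for y in ys) / n

def variance_ratio(lam):
    """Parametric variance ratio function R(lambda) for the toy model."""
    return output_variance(lam) / output_variance(1.0)
```

Analytically Var(Y) = 4*lambda + 2 for this model, so halving Var(X1) gives R(0.5) = 4/6 ≈ 0.67; reading such curves off is how one picks which input variance to shrink to hit a targeted output-variance reduction.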

18.
Companies undertaking operations improvement in supply chains face many alternatives. This work seeks to assist practitioners to prioritize improvement actions by developing analytical expressions for the marginal values of three parameters – (i) lead time mean, (ii) lead time variance, and (iii) demand variance – which measure the marginal cost of an incremental change in a parameter. The relative effectiveness of reducing lead time mean versus lead time variance is captured by the ratio of the marginal value of lead time mean to that of lead time variance. We find that this ratio strongly depends on whether the lead time mean and variance are independent or correlated. We illustrate the application of the results with a numerical example from an industrial setting. The insights can help managers determine the optimal investment decision to modify demand and supply characteristics in their supply chain, e.g., by switching suppliers, changing the factory layout, or investing in information systems.
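The marginal-value idea can be sketched with the standard safety-stock model (not the paper's exact expressions): lead-time demand variance is mu_L*Var(D) + mu_D^2*Var(L), safety-stock cost is h*z*sqrt(.), and the marginal value of a parameter is the partial derivative of that cost, taken here by central differences. The parameter values are invented.

```python
def safety_stock_cost(mu_L, var_L, mu_D, var_D, h=1.0, z=1.645):
    """Holding cost of safety stock under the standard lead-time demand model."""
    lt_demand_var = mu_L * var_D + mu_D ** 2 * var_L
    return h * z * lt_demand_var ** 0.5

def marginal_value(param, base, h=1.0, z=1.645, eps=1e-5):
    """Central-difference marginal cost of `param` ('mu_L', 'var_L', ...)."""
    up, dn = dict(base), dict(base)
    up[param] += eps
    dn[param] -= eps
    return (safety_stock_cost(h=h, z=z, **up) - safety_stock_cost(h=h, z=z, **dn)) / (2 * eps)
```

With steady demand and variable lead times (mu_D^2 large relative to Var(D)), the marginal value of lead time variance dominates that of the mean, so variance reduction is the higher-priority improvement in this toy setting.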

19.
The widespread recognition of the detrimental effects of high yield variation in advanced manufacturing technology settings, both in terms of cost and management of production processes, underscores the need to develop effective strategies for reducing yield variation. In this article, we report the findings of a longitudinal field study in an electromechanical motor assembly plant where we examined how the application of process knowledge by production work teams can reduce yield variation. We propose and provide an operationalization of a strategy to identify the sequence of particular types of actions—actions to control the mean followed by actions to control the variance—that work teams should pursue over time to apply process knowledge for reducing yield variation. The results of our empirical analysis show that yield variation was significantly reduced on three of the four production lines at the manufacturing plant that served as our research site. Differences in strategies for applying process knowledge help explain the different results on each of the production lines.

20.

Copyright © Beijing Qinyun Technology Development Co., Ltd. 京ICP备09084417号