Similar Literature
20 similar documents retrieved (search time: 18 ms)
1.
The Taguchi method is efficient for optimising process performance with a single quality characteristic (QCH) of a product or process. In practice, however, customers are concerned with multiple QCHs, which are usually correlated. This research proposes and implements an approach that uses principal component analysis (PCA) and two data envelopment analysis (DEA) models, CCR and super efficiency, for optimising multiple correlated QCHs in robust design. PCA is first used to obtain uncorrelated linear combinations (principal components) equal in number to the original QCHs, thereby avoiding the loss of information that comes from discarding components. These components are then fed into the two DEA models to determine optimal factor levels. Three real case studies illustrate the approach; in all of them it proves more efficient than several techniques from the literature, including engineering judgement, PCA, PCA with grey analysis, and the utility concept. The proposed approach should therefore greatly assist process/product engineers in obtaining robust designs with multiple correlated QCHs.
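A minimal sketch of the PCA step described above (the data are made up, and this shows only the transformation, not the authors' DEA stage): all components are retained, so the scores carry the same information as the original correlated QCHs.

```python
import numpy as np

def pca_scores(X):
    """Project observations onto all principal components, keeping as
    many components as original QCHs so no information is discarded."""
    Xc = X - X.mean(axis=0)                 # center each QCH column
    cov = np.cov(Xc, rowvar=False)          # covariance between QCHs
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigh: symmetric matrix
    order = np.argsort(eigvals)[::-1]       # sort by explained variance
    return Xc @ eigvecs[:, order]           # uncorrelated component scores

# Two correlated quality characteristics measured on 5 runs (hypothetical)
X = np.array([[1.0, 2.1], [2.0, 3.9], [3.0, 6.2], [4.0, 8.1], [5.0, 9.8]])
Z = pca_scores(X)
# Off-diagonal covariance of the scores is ~0: components are uncorrelated
print(np.round(np.cov(Z, rowvar=False), 6))
```

The resulting uncorrelated scores, one column per original QCH, are what the two DEA models would then rank to select factor levels.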

2.
This paper presents a real application of a multicriteria decision aid (MCDA) approach to portfolio selection based on preference disaggregation, using ordinal regression and linear programming (the UTADIS method: UTilités Additives DIScriminantes). The additive utility functions derived through this approach can be extrapolated, so any new alternative (share) can easily be evaluated and classified into one of several user-predefined groups. The procedure is illustrated with a case study of 98 stocks from the Athens stock exchange, evaluated on 15 criteria. The results are encouraging, indicating that the proposed methodology could serve as a tool for analysing portfolio managers' preferences and choices. Furthermore, a comparison with multiple discriminant analysis (with and without a stepwise procedure) illustrates the superiority of the proposed methodology over a well-known multivariate statistical technique that has been used extensively to study financial decision-making problems.
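The additive-utility classification idea can be sketched as follows. This is a simplification with linear marginal utilities and hand-picked weights and thresholds; UTADIS actually estimates piecewise-linear marginals and group thresholds from reference assignments via linear programming.

```python
def additive_utility(scores, weights):
    """Global utility of an alternative as a weighted sum of normalized
    criterion scores (piecewise-linear marginals reduced to linear
    ones for brevity)."""
    return sum(w * s for w, s in zip(weights, scores))

def classify(utility, thresholds):
    """Assign to ordered groups: group 0 is best; thresholds are
    descending utility cut-offs between successive groups."""
    for g, t in enumerate(thresholds):
        if utility >= t:
            return g
    return len(thresholds)

weights = [0.5, 0.3, 0.2]          # hypothetical criterion weights
thresholds = [0.7, 0.4]            # cut-offs separating 3 groups
stock = [0.9, 0.6, 0.8]            # a share's normalized scores
u = additive_utility(stock, weights)
print(u, classify(u, thresholds))  # 0.79, group 0
```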

3.
4.
How to determine weights for attributes is one of the key issues in multiple attribute decision making (MADM). This paper investigates a new approach for determining attribute weights based on a data envelopment analysis model without explicit inputs (DEA-WEI) and minimax reference point optimisation. The approach first considers a set of preliminary weights and the most favourable set of weights for each alternative or decision making unit (DMU), and then aggregates these weight sets to find the best compromise weights for attributes, with the interests of all DMUs taken into account fairly and simultaneously. It is intended to support MADM problems such as performance assessment and policy analysis where (a) the preferences of decision makers (DMs) are unclear, partial, or difficult to acquire, and (b) there is a need to consider the best "will" of each DMU. Two case studies show the properties of the proposed approach and how to use it to determine attribute weights in practice: the first assesses the research strengths of the EU-28 member countries under certain measures, and the second analyses the performance of Chinese Project 985 universities, where attribute weights need to be assigned in a fair and unbiased manner.
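The "most favourable weights" step for a single DMU in a DEA model without explicit inputs can be sketched as a linear program (hypothetical data; the subsequent aggregation into compromise weights via minimax reference point optimisation is omitted):

```python
import numpy as np
from scipy.optimize import linprog

def favourite_weights(Y, r):
    """Most favourable attribute weights for DMU r in a DEA model
    without explicit inputs: maximize DMU r's weighted score subject
    to every DMU's score being at most 1."""
    n, m = Y.shape
    res = linprog(c=-Y[r],                 # linprog minimizes, so negate
                  A_ub=Y, b_ub=np.ones(n),
                  bounds=[(0, None)] * m)
    return res.x, -res.fun                 # weights and DMU r's score

# Four DMUs evaluated on two attributes (made-up numbers)
Y = np.array([[4.0, 2.0], [3.0, 5.0], [2.0, 2.0], [5.0, 1.0]])
w, score = favourite_weights(Y, 0)
print(np.round(w, 4), round(score, 4))
```

Each DMU gets the weights that flatter it most; the paper's contribution is reconciling these self-interested weight sets into one fair compromise vector.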

5.
The U.S. Environmental Protection Agency (USEPA) guidelines for cancer risk assessment recognize that some chemical carcinogens may have a site-specific mode of action (MOA) involving mutation and cell-killing-induced hyperplasia. The guidelines recommend that for such dual MOA (DMOA) carcinogens, judgment should be used to compare and assess results using separate "linear" (genotoxic) versus "nonlinear" (nongenotoxic) approaches to low-level risk extrapolation. Because the guidelines allow this only when evidence supports reliable risk extrapolation using a validated mechanistic model, they effectively prevent addressing MOA uncertainty when data do not fully validate such a model but otherwise clearly support a DMOA. An adjustment-factor approach is proposed to address this gap, analogous to reference-dose procedures used for classic toxicity endpoints. By this method, even when a "nonlinear" toxicokinetic model cannot be fully validated, the effect of DMOA uncertainty on low-dose risk can be addressed. Application of the proposed approach was illustrated for the case of risk extrapolation from bioassay data on rat nasal tumors induced by chronic lifetime exposure to naphthalene. Bioassay data, toxicokinetic data, and pharmacokinetic analyses were determined to indicate that naphthalene is almost certainly a DMOA carcinogen. Plausibility bounds on rat-tumor-type-specific DMOA-related uncertainty were obtained using a mechanistic two-stage cancer risk model adapted to reflect the empirical link between genotoxic and cytotoxic effects of the most potent identified genotoxic naphthalene metabolites, 1,2- and 1,4-naphthoquinone. Bound-specific adjustment factors were then used to reduce naphthalene risk estimated by linear extrapolation (under the default genotoxic MOA assumption), to account for the DMOA exhibited by this compound.

6.
We consider a problem of evaluating efficiency of Decision Making Units (DMUs) based on their deterministic performance on multiple consumed inputs and multiple produced outputs. We apply a ratio-based efficiency measure, and account for the Decision Maker's preference information representable with linear constraints involving input/output weights. We analyze the set of all feasible weights to answer various robustness concerns by deriving: (1) extreme efficiency scores and (2) extreme efficiency ranks for each DMU, (3) possible and necessary efficiency preference relations for pairs of DMUs, (4) efficiency distribution, (5) efficiency rank acceptability indices, and (6) pairwise efficiency outranking indices. The proposed hybrid approach combines and extends previous results from Ratio-based Efficiency Analysis and the SMAA-D method. The practical managerial implications are derived from the complementary character of accounted perspectives on DMUs' efficiencies. We present an innovative open-source software implementing an integrated framework for robustness analysis using a ratio-based efficiency model on the diviz platform. The proposed approach is applied to a real-world problem of evaluating efficiency of Polish airports. We consider four inputs related to the capacities of a terminal, runways, and an apron, and to the airport's catchment area, and two outputs concerning passenger traffic and number of aircraft movements. We present how the results can be affected by integrating the weight constraints and eliminating outlier DMUs.
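The rank-related robustness outputs (items 4-6) are in the spirit of SMAA: sample feasible weights and record the distribution of ranks. A simplified sketch with a single aggregated input and uniform weight sampling (made-up data; the paper's method handles general linear weight constraints analytically and by simulation):

```python
import numpy as np

rng = np.random.default_rng(0)

def rank_acceptability(outputs, inputs, n_samples=20000):
    """SMAA-style robustness check: sample output weights uniformly
    from the simplex, compute each DMU's output/input ratio
    efficiency, and record how often each DMU attains each rank."""
    n, m = outputs.shape
    counts = np.zeros((n, n))
    for _ in range(n_samples):
        w = rng.dirichlet(np.ones(m))           # random normalized weights
        eff = (outputs @ w) / inputs            # ratio efficiency per DMU
        ranks = np.argsort(np.argsort(-eff))    # rank of each DMU, 0 = best
        counts[np.arange(n), ranks] += 1
    return counts / n_samples                   # rank acceptability indices

outputs = np.array([[9.0, 1.0], [5.0, 5.0], [1.0, 9.0]])  # hypothetical DMUs
inputs = np.array([5.0, 4.0, 5.0])              # single aggregated input
acc = rank_acceptability(outputs, inputs)
print(np.round(acc, 3))
```

Row `i`, column `k` of the result estimates the probability that DMU `i` attains rank `k` over the feasible weight set.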

7.
Multiple hazard resilience is of significant practical value because most regions of the world are subject to multiple natural and technological hazards. An analysis and assessment approach for multiple hazard spatiotemporal resilience of interdependent infrastructure systems is developed using network theory and a numerical analysis. First, we define multiple hazard resilience and present a quantitative probabilistic metric based on the expansion of a single hazard deterministic resilience model. Second, we define a multiple hazard relationship analysis model with a focus on the impact of hazards on an infrastructure. Subsequently, a relationship matrix is constructed with temporal and spatial dimensions. Further, a general method for the evaluation of direct impacts on an individual infrastructure under multiple hazards is proposed. Third, we present an analysis of indirect multiple hazard impacts on interdependent infrastructures and a joint restoration model of an infrastructure system. Finally, a simplified two-layer interdependent infrastructure network is used as a case study for illustrating the proposed methodology. The results show that temporal and spatial relationships of multiple hazards significantly influence system resilience. Moreover, the interdependence among infrastructures further magnifies the impact on resilience value. The main contribution of the article is a new multiple hazard resilience evaluation approach that is capable of integrating the impacts of multiple hazard interactions, interdependence of network components (layers), and restoration strategy.
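The single-hazard deterministic resilience model that the probabilistic metric expands on is commonly defined as the normalized area under the system performance curve; a minimal sketch with a hypothetical performance profile:

```python
import numpy as np

def resilience(t, performance, target=1.0):
    """Deterministic resilience of one system under one hazard:
    normalized area under the performance curve over the horizon
    (trapezoidal integration)."""
    area = np.sum((performance[1:] + performance[:-1]) / 2 * np.diff(t))
    return area / (target * (t[-1] - t[0]))

# Hypothetical profile: hazard hits at t=2, performance restored by t=8
t = np.array([0.0, 2.0, 2.0, 8.0, 10.0])
p = np.array([1.0, 1.0, 0.4, 1.0, 1.0])
print(round(resilience(t, p), 3))   # 0.82
```

A value of 1.0 means no performance loss; the article's metric generalizes this to probabilistic, multi-hazard, multi-layer settings.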

8.
A Heuristic Algorithm for Multiple Sequence Alignment Based on Blocks
Blocked multiple sequence alignment refers to constructing a multiple alignment by first aligning conserved regions into what we call blocks and then aligning the regions between successive blocks to form the final alignment. Instead of starting from low-order pairwise alignments, we propose a new way to form blocks by searching for closely related regions in all input sequences, allowing internal spaces in blocks as well as some degree of mismatch. We address the problem of semi-conserved patterns (patterns that do not appear in all input sequences) by introducing two similarity thresholds that are adjusted dynamically according to the input. A method to control the number of blocks is also presented, for the situation when the input sequences have so many similar regions that forming blocks by trying every combination becomes impractical. BMA is an implementation of this approach, and our experimental results indicate that it is efficient, particularly on large numbers of long sequences with well-conserved regions.
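The block-seeding idea can be sketched by collecting regions shared by all input sequences; here exact k-mers stand in for BMA's similarity-threshold matching with internal spaces and mismatches (the sequences are made up):

```python
def find_blocks(seqs, k=4):
    """Seed candidate blocks: substrings of length k that occur in
    every input sequence. BMA additionally tolerates internal gaps
    and mismatches via dynamically adjusted thresholds (omitted)."""
    common = {seqs[0][i:i + k] for i in range(len(seqs[0]) - k + 1)}
    for s in seqs[1:]:
        kmers = {s[i:i + k] for i in range(len(s) - k + 1)}
        common &= kmers                  # keep k-mers shared by all
    return sorted(common)

seqs = ["ACGTACGGTA", "TTACGTACGG", "GACGTACGTT"]
print(find_blocks(seqs, k=5))   # ['ACGTA', 'CGTAC', 'GTACG']
```

Overlapping seeds like these would then be merged into blocks, and the inter-block regions aligned separately.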

9.
This paper addresses the critical issue of determining the appropriate structural design of a sales organization. We propose a new approach for organizing the selling function using multiobjective modeling techniques. The proposed approach considers the situational characteristics under which the sales organization operates, trade-offs among multiple sales-effectiveness criteria, and the “fit” between the situational characteristics and the structural dimensions (i.e., the structure-contingency relationships). Using data collected from sales branches of brokerage firms, we develop appropriate structural designs for a sales organization using the proposed procedure. The paper concludes by outlining the various decision-making implications of the approach.

10.
Because complex time series exhibit structural breaks and outliers, forecasting models often train poorly and may produce extreme predictions. This paper therefore proposes a neural network ensemble forecasting method based on the trimmed average. The method first generates multiple training sets from the training data, then trains multiple neural network forecasting models, and finally combines the networks' predictions using a trimmed-average strategy. Compared with the simple average, the trimmed average is less susceptible to extreme values and gives the ensemble robust forecasting performance. In the empirical study, two neural network ensemble forecasting models are constructed: a Trimmed Average based Bootstrap Neural Network Ensemble (TA-BNNE) and a Trimmed Average based Monte Carlo Neural Network Ensemble (TA-MCNNE). Applied to the NN3 competition datasets, both models show that the trimmed-average strategy achieves better forecasting accuracy than the simple average on both the regular and the complex datasets. Moreover, compared with the top ten NN3 entries, both models outperform the 6th-place entry on the full dataset and the 1st-place entry on the complex dataset, further confirming the effectiveness of the proposed method.
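The trimmed-average combination step can be sketched as follows (synthetic forecasts, not the NN3 data; one ensemble member is deliberately made extreme to show the robustness over the simple average):

```python
import numpy as np

rng = np.random.default_rng(1)

def trimmed_mean(preds, trim=0.2):
    """Combine ensemble member forecasts by discarding the lowest and
    highest `trim` fraction of members at each horizon before
    averaging, so a few extreme predictions cannot distort the
    combined forecast."""
    preds = np.sort(preds, axis=0)
    k = int(len(preds) * trim)
    return preds[k:len(preds) - k].mean(axis=0)

# Ten member forecasts for 3 horizons; member 9 produces extreme values
preds = rng.normal(10.0, 0.5, size=(10, 3))
preds[9] = [100.0, -100.0, 100.0]
print(np.round(preds.mean(axis=0), 2))    # simple average: distorted
print(np.round(trimmed_mean(preds), 2))   # trimmed average: near 10
```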

11.
Introducing strength of preference into the graph model for conflict resolution captures decision makers' emotional attitudes and enriches their preference information. When modelling a conflict with the graph model under strength of preference, ranking the states by preference is a crucial step. This study first extends the option prioritization method from simple preference to strength of preference. Taking the Lanzhou water pollution incident as an example, in which Lanzhou Veolia Water Company, the local government, and Sinopec's Lanzhou branch hold preferences of differing strength, the extended option prioritization method is used to derive each decision maker's preference ranking. The conflict triggered by the pollution incident is then modelled and analysed, the negotiation and consultation process is simulated, and equilibrium solutions for all parties are obtained. The case analysis also offers strategic lessons for achieving sustainable economic development in China.
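Option prioritization ranks states by lexicographically ordered preference statements; a minimal sketch under simple (non-strength) preference with made-up options (the paper's contribution, the extension to strength of preference, is omitted here):

```python
def prioritize(states, statements):
    """Option prioritization: rank states by a lexicographic score in
    which satisfying a higher-priority preference statement outweighs
    satisfying all lower-priority statements combined."""
    k = len(statements)
    def score(state):
        return sum(2 ** (k - i - 1)
                   for i, stmt in enumerate(statements) if stmt(state))
    return sorted(states, key=score, reverse=True)

# States as option tuples (cleanup_started, blame_accepted) - hypothetical
states = [(0, 0), (0, 1), (1, 0), (1, 1)]
statements = [lambda s: s[0] == 1,     # highest priority: cleanup starts
              lambda s: s[1] == 0]     # lower priority: avoid taking blame
print(prioritize(states, statements))  # [(1, 0), (1, 1), (0, 0), (0, 1)]
```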

12.
In production, firms do not possess complete information about the production technology, so different inputs may attain different levels of technical efficiency. Traditional "radial" measures of technical efficiency cannot identify efficiency differences between inputs. To address this, the literature has proposed measures based on profit functions and on directional distance functions; however, estimating a profit function requires price data that are hard to collect, and a directional distance function requires pre-specifying the direction in which inputs contract, which is unknown to the researcher. Based on the input distance function, this paper builds a new framework for measuring input-specific technical efficiency that requires neither price data nor a pre-specified "direction". The model is estimated with a two-step Bayesian procedure: first, the model parameters are estimated; second, given the parameter estimates, the technical efficiency of each input is estimated. Monte Carlo simulations show that, compared with estimating each input's efficiency directly, this two-step approach makes the Markov chain Monte Carlo (MCMC) procedure converge faster and estimates input-specific efficiency more accurately. The method is then applied to data from the Peking University corporate social responsibility survey to estimate the technical efficiency of capital and labour. The results show almost no efficiency loss in firms' use of capital, with little variation across firms, whereas the main loss stems from underutilization of labour, with large variation across firms. On average, labour is 77% technically efficient; that is, holding output and capital constant, labour input could be reduced by 23%. This example shows that the proposed method can identify the main sources of technical efficiency loss in production and thus help find ways to raise productive efficiency.

13.
Traditional single-stage allocation models can lead to local surpluses or shortages of emergency supplies, high costs, and a system that falls short of the global optimum. To address this, fairness is measured by incorporating the proportional shortage of victims' demand into an exponential utility function, and a dynamic multi-stage model for allocating emergency supplies over a three-tier distribution network (multiple staging areas, distribution centres, and affected sites) is built with two objectives: minimizing the delay loss caused by shortages and minimizing the total cost of allocation. A solution method combining objective transformation with linear approximation is designed, and the model's effectiveness and feasibility are verified on a numerical example. The results show that the multi-stage model balances the efficiency and fairness of allocation, minimizing both the delay loss from shortages and the total allocation cost. Quantifying fairness by the proportional shortage of demand removes the bias that differing demand volumes across sites would otherwise introduce, so that even when supplies are scarce in the early stage of a response and demand continues to exceed supply in the middle stage, every affected site still receives a certain proportion of its needs in each stage. This avoids large shortage losses, ensures fair multi-stage allocation across multiple affected sites, better matches actual disaster relief, and can support multi-stage allocation decisions in large-scale disaster response.
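The proportional-shortage notion of fairness can be sketched for a single stage: when supply falls short, every site receives the same fraction of its demand (illustrative numbers; the paper embeds this measure in an exponential utility inside a multi-stage optimization model):

```python
def allocate(supply, demands):
    """Single-stage fair allocation: when supply is short, every
    affected site receives the same fraction of its demand, so the
    proportional shortage is equalized across sites regardless of
    how much their demand volumes differ."""
    ratio = min(1.0, supply / sum(demands))
    return [round(d * ratio, 6) for d in demands]

# 120 units available, three affected sites with unequal demands
print(allocate(120, [100, 60, 40]))   # each site gets 60% of its need
```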

14.
A Scheduling Method for the Economic Lot Scheduling Problem
李天凤, 周支立, 吴丽娜. 管理学报 (Chinese Journal of Management), 2007, 4(4): 384-389, 392
For the economic lot scheduling problem, we assume that production may begin before inventory drops to zero, and we propose a new algorithm for determining the production sequence of the products. The results show that this scheduling method yields lower costs than two other commonly used methods for the economic lot scheduling problem, and the time complexity of the algorithm is given.

15.
Life cycle assessment (LCA) is a framework for comparing products according to their total estimated environmental impact, summed over all chemical emissions and activities associated with a product at all stages in its life cycle (from raw material acquisition, manufacturing, and use to final disposal). For each chemical involved, the exposure associated with the mass released into the environment, integrated over time and space, is multiplied by a toxicological measure to estimate the likelihood of effects and their potential consequences. In this article, we explore the use of quantitative methods drawn from conventional single-chemical regulatory risk assessments to create a procedure for estimating the cancer effect measure in the impact phase of LCA. The approach is based on the maximum likelihood estimate of the effect dose inducing a 10% response over background, ED10, and default linear low-dose extrapolation using the slope βED10 = 0.1/ED10. The calculated effects may correspond to residual risks below current regulatory compliance requirements that occur over multiple generations and at multiple locations; but at the very least they represent a "using up" of some portion of the human population's ability to accommodate emissions. Preliminary comparisons are performed with existing measures, such as the U.S. Environmental Protection Agency's (U.S. EPA's) slope factor measure q1*. By analyzing bioassay data for 44 chemicals drawn from the EPA's Integrated Risk Information System (IRIS) database, we explore estimating ED10 from more readily available information such as the median tumor dose rate TD50 and the median single lethal dose LD50. Based on the TD50, we then estimate the ED10 for more than 600 chemicals. Differences in potential consequences, or severity, are addressed by combining βED10 with disability-adjusted life years per affected person, DALYp. Most of the variation among chemicals for cancer effects is found to be due to differences in the slope factors (βED10), which range from 10^-4 up to 10^4 (cancer risk per mg/kg-day).
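The default linear low-dose extrapolation above is simple enough to state in code (the ED10 value is hypothetical, not from the study):

```python
def linear_risk(dose, ed10):
    """Default linear low-dose extrapolation: slope beta = 0.1 / ED10,
    so lifetime risk at a given chronic dose (mg/kg-day) is
    beta * dose."""
    beta = 0.1 / ed10
    return beta * dose

# Hypothetical chemical with ED10 = 5 mg/kg-day
print(linear_risk(0.001, 5.0))   # risk at 1 microgram/kg-day
```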

16.
The current methods for reference dose (RfD) determination can be enhanced through biologically-based dose-response analysis. The methods developed here use information on tetrachlorodibenzo-p-dioxin (TCDD) to focus on noncancer endpoints, specifically TCDD-mediated immune system alterations and enzyme induction. Dose-response analysis using the Sigmoid-Emax (EMAX) function is applied to multiple studies to determine the consistency of response. Through the use of multiple studies and statistical comparison of parameter estimates, the slope estimates across studies were shown to be very consistent, which adds confidence to the subsequent effect dose estimates. This study also compares traditional risk assessment methods, such as the NOAEL/safety-factor approach, with a modified benchmark dose approach introduced here. Confidence in the estimation of an effect dose (ED10) was improved through the use of multiple datasets, which is key to strengthening the benchmark dose estimates. In addition, the Sigmoid-Emax function, when fitted to dose-response data by nonlinear regression, provides a significantly improved fit, increasing confidence in the parameter estimates and, in turn, in the effect dose estimates.
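Fitting a Sigmoid-Emax (Hill-type) function by nonlinear regression can be sketched with synthetic data (the parameter values and observations are made up, not from the TCDD studies):

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoid_emax(dose, emax, ed50, n):
    """Sigmoid-Emax dose-response: effect rises from 0 to emax with
    steepness n around the half-maximal dose ed50."""
    return emax * dose**n / (dose**n + ed50**n)

# Synthetic enzyme-induction data around a known curve (illustrative)
dose = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])
true = sigmoid_emax(dose, emax=100.0, ed50=5.0, n=1.5)
rng = np.random.default_rng(2)
obs = true + rng.normal(0.0, 1.0, size=dose.size)

params, _ = curve_fit(sigmoid_emax, dose, obs, p0=[80.0, 3.0, 1.0])
print(np.round(params, 2))   # recovered (emax, ed50, n)
```

With the fitted curve in hand, an effect dose such as the ED10 can be read off directly instead of being tied to a tested dose group, which is the benchmark-dose advantage over the NOAEL/safety-factor approach.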

17.
A multiple objective embedded network model is proposed to model a variety of human resource planning problems including executive succession planning, compensation planning, training program design, diversity management and human systems design. The Tchebycheff Method, an interactive multiple objective programming solution procedure developed by Steuer and Choo [32], is implemented using NETSIDE, a computer routine for solving network problems with side constraints developed by Kennington and Whisman [17]. This paper demonstrates how the network structure common to many types of human resource planning problems can be exploited to improve solution efficiency, and how our approach extends the use of network models in human resource planning by including multiple objectives and extranetwork constraints. An illustrative example demonstrating the modeling and solution approach is presented, and the potential applications of these approaches in two specific areas of human resource planning are discussed.

18.
As a method of solving multiple-criteria decision making problems with a single quantitative objective and multiple qualitative objectives, the post-model analysis (PMA) approach is proposed. The essence of PMA is to support the trade-offs between a quantitative objective and multiple qualitative objectives so that the decision maker can find a perceived most preferred nondominated solution. To this end, the optimal solution of a quantitative model is found first, without regard for qualitative factors. The solution is then evaluated in terms of qualitative objectives. When the initial quantitatively optimal solution is adjusted to allow improvement of qualitative goals, opportunity costs of achieving qualitative goals are incurred. In this process, an expert system and/or graphical display can be used. PMA therefore provides a way to incorporate quantitative models into knowledge-based expert systems.

19.
Weight of Evidence: A Review of Concept and Methods
Douglas L. Weed. Risk Analysis, 2005, 25(6): 1545-1557
"Weight of evidence" (WOE) is a common term in the published scientific and policy-making literature, most often seen in the context of risk assessment (RA). Its definition, however, is unclear. A systematic review of the scientific literature was undertaken to characterize the concept. For the years 1994 through 2004, PubMed was searched for publications in which "weight of evidence" appeared in the abstract and/or title. Of the 276 papers that met these criteria, 92 were selected for review: 71 papers published in 2003 and 2004 (WOE appeared in abstract/title) and 21 from 1994 through 2002 (WOE appeared in title). WOE has three characteristic uses in this literature: (1) metaphorical, where WOE refers to a collection of studies or to an unspecified methodological approach; (2) methodological, where WOE points to established interpretative methodologies (e.g., systematic narrative review, meta-analysis, causal criteria, and/or quality criteria for toxicological studies) or where WOE means that "all" rather than some subset of the evidence is examined, or rarely, where WOE points to methods using quantitative weights for evidence; and (3) theoretical, where WOE serves as a label for a conceptual framework. Several problems are identified: the frequent lack of definition of the term "weight of evidence," multiple uses of the term and a lack of consensus about its meaning, and the many different kinds of weights, both qualitative and quantitative, which can be used in RA. A practical recommendation emerges: the WOE concept and its associated methods should be fully described when used. A research agenda should examine the advantages of quantitative versus qualitative weighting schemes, how best to improve existing methods, and how best to combine those methods (e.g., epidemiology's causal criteria with toxicology's quality criteria).  相似文献   

20.
This article provides an integrated approach for optimizing quality control in International Manufacturing Networks (IMN), which are characterized by numerous plants acting autonomously according to individual target systems. A key challenge is to ensure overall process quality despite distributed value creation processes and differing target systems in dynamic environments. The developed approach identifies potentials in the quality control strategy using a value-stream-based method to visualize quality characteristics and procedures along the production process chain. Furthermore, the approach contains a framework for identifying possible improvement measures and a simulation-based evaluation concept that assesses the effects of different measures with respect to individual target systems. The simulation combines discrete-event elements, which depict the value stream, with agent-based modelling, which realizes the different target systems through distinctive plant roles. The article concludes with a case study of a globally operating automotive supplier applying the approach. A forward research agenda is proposed to evaluate the approach in multiple cases, deriving patterns across companies or industries.


Copyright©北京勤云科技发展有限公司  京ICP备09084417号