Similar Documents
20 similar documents found (search time: 250 ms)
1.
In this note I reply to the comments by Haimes et al. on my paper on the sensitivity analysis of the inoperability input‐output model. I make the case for a moment‐independent sensitivity analysis.

2.
In risk assessment, the moment‐independent sensitivity analysis (SA) technique for reducing model uncertainty has attracted a great deal of attention from analysts and practitioners. It aims at measuring the relative importance of an individual input, or a set of inputs, in determining the uncertainty of the model output by looking at the entire distribution range of the model output. In this article, along the lines of Plischke et al., we point out that the original moment‐independent SA index (also called the delta index) can also be interpreted as a dependence measure between the model output and the input variables, and introduce another moment‐independent SA index (called the extended delta index) based on copulas. Then, nonparametric methods for estimating the delta and extended delta indices are proposed. Both methods need only a single set of samples to compute all the indices; thus, they overcome the "curse of dimensionality." Finally, an analytical test example, a risk assessment model, and the Level E model are employed to compare the delta and extended delta indices and to test the two calculation methods. Results show that the delta and the extended delta indices produce the same importance ranking in these three test examples. It is also shown that the two proposed calculation methods dramatically reduce the computational burden.
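The delta index described above admits a given-data estimator in the spirit of Plischke et al.'s partition approach: compare conditional output histograms (per input class) with the unconditional histogram. The sketch below is an illustrative approximation (equal-frequency input classes, fixed histogram bins, a toy model of our own), not the authors' exact estimator:

```python
import numpy as np

def delta_index(x, y, n_classes=20, n_bins=30):
    """Given-data sketch of the moment-independent delta index:
    0.5 * E_X[ integral |f_Y - f_{Y|X}| dy ], with densities replaced
    by histograms and the expectation by equal-frequency input classes."""
    edges = np.histogram_bin_edges(y, bins=n_bins)
    p_y = np.histogram(y, bins=edges)[0] / len(y)
    delta = 0.0
    for chunk in np.array_split(np.argsort(x), n_classes):
        p_c = np.histogram(y[chunk], bins=edges)[0] / len(chunk)
        delta += (len(chunk) / len(y)) * 0.5 * np.abs(p_c - p_y).sum()
    return delta

rng = np.random.default_rng(0)
x1, x2 = rng.normal(size=20_000), rng.normal(size=20_000)
y = x1 + 0.1 * x2            # x1 should dominate the importance ranking
d1, d2 = delta_index(x1, y), delta_index(x2, y)
```

Note that the single Monte Carlo sample is reused for every input, which is what lets such estimators sidestep the curse of dimensionality mentioned in the abstract.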

3.
Measures of sensitivity and uncertainty have become an integral part of risk analysis. Many such measures have a conditional probabilistic structure, for which a straightforward Monte Carlo estimation procedure has a double‐loop form. Recently, a more efficient single‐loop procedure has been introduced, and consistency of this procedure has been demonstrated separately for particular measures, such as those based on variance, density, and information value. In this work, we give a unified proof of single‐loop consistency that applies to any measure satisfying a common rationale. This proof is not only more general but invokes less restrictive assumptions than heretofore in the literature, allowing for the presence of correlations among model inputs and of categorical variables. We examine numerical convergence of such an estimator under a variety of sensitivity measures. We also examine its application to a published medical case study.
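The single-loop idea can be illustrated on the variance-based first-order index S_i = Var(E[Y|X_i]) / Var(Y): instead of an inner Monte Carlo loop per conditioning value, one sample is partitioned on X_i and within-class means stand in for the conditional expectations. A minimal sketch (the toy model, sample size, and class count are our own choices):

```python
import numpy as np

def first_order_single_loop(x, y, n_classes=50):
    """Single-loop (given-data) estimate of S_i = Var(E[Y|X_i]) / Var(Y):
    sort on x, split into equal-frequency classes, and use within-class
    means of y as proxies for the conditional expectation."""
    cond_means, weights = [], []
    for chunk in np.array_split(np.argsort(x), n_classes):
        cond_means.append(y[chunk].mean())
        weights.append(len(chunk))
    cond_means, weights = np.array(cond_means), np.array(weights)
    grand = np.average(cond_means, weights=weights)
    return np.average((cond_means - grand) ** 2, weights=weights) / y.var()

rng = np.random.default_rng(1)
N = 100_000
x1 = rng.uniform(-1, 1, N)
x2 = rng.uniform(-1, 1, N)
y = x1 + 0.5 * x2**2          # analytic indices: S1 = 15/16, S2 = 1/16

s1 = first_order_single_loop(x1, y)
s2 = first_order_single_loop(x2, y)
```

The same N model runs serve both inputs, versus N_outer × N_inner runs per input for the naive double loop.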

4.
Attributing foodborne illnesses to food sources is essential to conceive, prioritize, and assess the impact of public health policy measures. The Bayesian microbial subtyping attribution model by Hald et al. is one of the most advanced approaches to attributing sporadic cases; notably, it allows taking into account the level of exposure to the sources and the differences between bacterial types and between sources. This step forward requires introducing type‐ and source‐dependent parameters, and generates overparameterization, which was addressed in Hald's paper by setting some parameters to constant values. We question the impact of the choices made for the parameterization (the set of parameters fixed and the values used) on model robustness and propose an alternative parameterization for the Hald model. We illustrate this analysis with the 2005 French data set of non‐typhi Salmonella. Mullner's modified Hald model and a simple deterministic model were used to compare the results and assess the accuracy of the estimates. Setting the parameters for bacterial types specific to a unique source, instead of the most frequent one, and using data‐based values instead of arbitrary values enhanced the convergence and adequacy of the estimates and led to attribution estimates consistent with the other models' results. The type and source parameter estimates were also coherent with Mullner's model estimates. The model appeared to be highly sensitive to the parameterization. The proposed solution, based on specific types and data‐based values, improved the robustness of the estimates and enabled this highly valuable tool to be used successfully with the French data set.

5.
In a quantitative model with uncertain inputs, the uncertainty of the output can be summarized by a risk measure. We propose a sensitivity analysis method based on derivatives of the output risk measure, in the direction of model inputs. This produces a global sensitivity measure, explicitly linking sensitivity and uncertainty analyses. We focus on the case of distortion risk measures, defined as weighted averages of output percentiles, and prove a representation of the sensitivity measure that can be evaluated on a Monte Carlo sample, as a weighted average of gradients over the input space. When the analytical model is unknown or hard to work with, nonparametric techniques are used for gradient estimation. This process is demonstrated through the example of a nonlinear insurance loss model. Furthermore, the proposed framework is extended in order to measure sensitivity to constant model parameters, uncertain statistical parameters, and random factors driving dependence between model inputs.
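For the special case of Expected Shortfall, a distortion measure that puts weight 1/(1−α) on the upper tail of the output percentiles, the "weighted average of gradients over the input space" reduces to the tail-conditional mean of the output gradient. A hedged sketch with a toy loss model whose gradients are known in closed form (the model, level α, and numbers are our illustration, not the article's insurance example):

```python
import numpy as np

rng = np.random.default_rng(2)
N = 200_000
alpha = 0.95

x1 = rng.normal(size=N)
x2 = rng.normal(size=N)
y = 2.0 * x1 + np.exp(x2)      # toy loss model; gradients known analytically
grad1 = np.full(N, 2.0)        # dY/dx1
grad2 = np.exp(x2)             # dY/dx2

# Expected Shortfall at level alpha: mean loss beyond the alpha-quantile
var_level = np.quantile(y, alpha)
tail = y > var_level
es = y[tail].mean()

# Sensitivity = distortion-weighted average of gradients over the sample,
# which for ES is the tail-conditional mean of dY/dx_i
sens1 = grad1[tail].mean()
sens2 = grad2[tail].mean()
```

A small shift of x2 matters much more in the tail than on average (the tail-conditional mean of exp(x2) far exceeds its unconditional mean), which is exactly the kind of risk-measure-specific information a variance-based index would blur.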

6.
In a job shop, because of large setup times, each operation is assigned to only one machine; there is no alternative routing. In a flexible manufacturing system, each manufacturing operation can often be performed on several machines. Therefore, with automated equipment, the capacity of a machine to perform certain operations is not independent of the capacity of other machines. Often, however, operations managers can use a route‐independent answer to production planning questions. For example, how much of a certain part type can be produced, and by when, are important capacity questions in business negotiations, when the detailed routing and scheduling are not yet of interest or cannot be known. This paper provides a mathematical model for the route‐independent analysis of the capacity of flexible manufacturing systems based on a concept of operation types. An example is provided both to illustrate the use of operation types and to highlight the differences between the traditional route‐dependent and the proposed route‐independent formulations of capacity constraints. Some computational results are also given. Finally, a sensitivity analysis is developed to analyze the feasibility of production plans when production requirements and machine capacities can change.

7.
In this study, a variance‐based global sensitivity analysis method was first applied to a contamination assessment model of Listeria monocytogenes in cold‐smoked, vacuum‐packed salmon at consumption. The impact of the choice of the modeling approach (populational or cellular) for the primary and secondary models, as well as the effect of their associated input factors on the final contamination level, was investigated. Results provided a subset of important factors, including the food water activity, its storage temperature, and the duration of storage in the domestic refrigerator. A refined sensitivity analysis was then performed to rank these important factors, tested over narrower ranges of variation corresponding to their current distributions, using three techniques: ANOVA, the Spearman correlation coefficient, and partial least squares regression.

8.
Typically, the uncertainty affecting the parameters of physiologically based pharmacokinetic (PBPK) models is ignored because it is not currently practical to adjust their values using classical parameter estimation techniques. This issue of parametric variability in a physiological model of benzene pharmacokinetics is addressed in this paper. Monte Carlo simulations were used to study the effects on the model output arising from variability in its parameters. The output was classified into two categories, depending on whether the output of the model on a particular run was judged to be generally consistent with published experimental data. Statistical techniques were used to examine sensitivity and interaction in the parameter space. The model was evaluated against the data from three different experiments in order to test for the structural adequacy of the model and the consistency of the experimental results. The regions of the parameter space associated with various inhalation and gavage experiments are distinct, and the model as presently structured cannot adequately represent the outcomes of all experiments. Our results suggest that further effort is required to discern between the structural adequacy of the model and the consistency of the experimental results. The impact of our results on the risk assessment process for benzene is also examined.

9.
Optimization of Joint Procurement of Infrequently Used Spare Parts under Continuous Production
Motivated by the practical difficulty of determining shortage costs for infrequently used spare parts at continuous-production enterprises, this paper studies a joint replenishment (s, C, S) stochastic inventory problem under a service-level constraint, based on a continuous-review policy. Compared with an individual replenishment strategy, the joint policy shows clear economic benefits. A sensitivity analysis of the factors affecting the cost savings of joint spare-parts procurement is also performed, indicating directions for optimizing this class of spare-parts inventory.

10.
In this work, we introduce a generalized rationale for local sensitivity analysis (SA) methods that resolves the problems raised by input constraints. Several models in use in the risk analysis field are characterized by the presence of deterministic relationships among the input parameters. However, SA issues related to the presence of constraints have mainly been dealt with in a heuristic fashion. We start with a systematic analysis of the effects of constraints. The findings can be summarized in the following three effects. (i) Constraints make it impossible to vary one parameter while keeping all others fixed. (ii) The model output becomes insensitive to a parameter if a constraint is solved for that parameter. (iii) Sensitivity analysis results depend on which parameter is selected as dependent. The explanation of these effects is found by proposing a result that leads to a natural extension of the local SA rationale introduced in Helton (1993). We then extend the definitions of the Birnbaum, criticality, and differential importance measures to the constrained case. In addition, a procedure is introduced that allows one to obtain constrained sensitivity results at the same cost as in the absence of constraints. The application to a nonbinary event tree concludes the article, providing a numerical illustration of the above findings.
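Effects (i)–(iii) can be seen on a toy model with one linear constraint: varying one parameter forces another to move, and the constrained derivative depends on which parameter is chosen as dependent. A minimal finite-difference sketch (the model, constraint, and nominal point are our own illustration):

```python
def f(x1, x2, x3):
    return x1**2 + 2 * x2 + 3 * x3   # toy model

# constraint: x1 + x2 + x3 = 1; nominal point satisfying it
x1, x2, x3 = 0.2, 0.3, 0.5
h = 1e-6

# unconstrained partial derivative w.r.t. x1 (ignores the constraint)
d_unc = (f(x1 + h, x2, x3) - f(x1, x2, x3)) / h              # ~ 2*x1 = 0.4

# constrained derivative, solving the constraint for x3: a change in x1
# is compensated by x3 = 1 - x1 - x2 -- effects (i) and (ii)
d_con_x3 = (f(x1 + h, x2, 1 - (x1 + h) - x2) - f(x1, x2, x3)) / h  # ~ 2*x1 - 3

# choosing x2 as dependent instead gives a different answer -- effect (iii)
d_con_x2 = (f(x1 + h, 1 - (x1 + h) - x3, x3) - f(x1, x2, x3)) / h  # ~ 2*x1 - 2
```

Note how f becomes insensitive to the dependent parameter (its own partial is absorbed), and how the x1 sensitivity even changes sign relative to the unconstrained case.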

11.
The parameters in a physiologically based pharmacokinetic (PBPK) model of methylene chloride were varied systematically, and the resulting variation in a number of model outputs was determined as a function of time for mice and humans at several exposure concentrations. The importance of the various parameters in the model was highly dependent on the conditions (concentration, species) for which the simulation was performed and the model output (dose surrogate) being considered. Model structure also had a significant impact on the results. For sensitivity analysis, particular attention must be paid to conservation equations to ensure that the variational calculations do not alter mass balance, introducing extraneous effects into the model. All of the normalized sensitivity coefficients calculated in this study ranged between −1.12 and 1, and most were much less than 1 in absolute value, indicating that individual input errors are not greatly amplified in the outputs. In addition to ranking parameters in terms of their impact on model predictions, time-dependent sensitivity analysis can also be used as an aid in the design of experiments to estimate parameters by predicting the experimental conditions and sampling points which will maximize parameter identifiability.
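The notion of a time-dependent normalized sensitivity coefficient, (p/C)·∂C/∂p, can be illustrated on a one-compartment pharmacokinetic model for which the coefficients are known in closed form (this toy model is our illustration, not the methylene chloride PBPK model):

```python
import numpy as np

DEFAULTS = {"dose": 100.0, "v": 10.0, "k": 0.3}

def conc(t, dose=100.0, v=10.0, k=0.3):
    """One-compartment model: C(t) = (dose / V) * exp(-k * t)."""
    return dose / v * np.exp(-k * t)

def norm_sens(t, param, h=1e-6):
    """Normalized sensitivity (p / C) * dC/dp by central finite difference."""
    p0 = DEFAULTS[param]
    up = conc(t, **{**DEFAULTS, param: p0 * (1 + h)})
    dn = conc(t, **{**DEFAULTS, param: p0 * (1 - h)})
    dcdp = (up - dn) / (2 * h * p0)
    return dcdp * p0 / conc(t, **DEFAULTS)

t = np.linspace(0.0, 10.0, 5)
s_k = norm_sens(t, "k")   # analytically -k * t for this model
s_v = norm_sens(t, "v")   # analytically -1 at all times
```

Because |s_k| grows linearly with time while s_v stays flat, later sampling points are the informative ones for estimating k, which is the experiment-design use of time-dependent sensitivity mentioned in the abstract.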

12.
In the binary single‐constraint knapsack problem, denoted KP, we are given a knapsack of fixed capacity c and a set of n items. Each item j, j = 1,...,n, has an associated size or weight wj and a profit pj. The goal is to determine whether or not each item j should be included in the knapsack, so as to maximize the total profit without exceeding the capacity c. In this paper, we study the sensitivity of the optimum of the KP to perturbations of either the profit or the weight of an item. We give approximate and exact interval limits for both cases (profit and weight) and propose several polynomial‐time algorithms able to reach these interval limits. The performance of the proposed algorithms is evaluated on a large number of problem instances.
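One such interval limit is easy to state: for an item contained in every optimal solution, its profit may drop by the gap between the optimum and the optimum with that item forced out before the solution changes. A sketch with a dynamic-programming solver (the instance data are our own, and this is just one of the several limits the paper studies):

```python
def kp_max(capacity, weights, profits, fixed=None):
    """0/1 knapsack optimum by DP over capacities. `fixed` optionally
    forces items in (True) or out (False): {index: bool}."""
    fixed = fixed or {}
    base, free = 0, []
    for j, (w, p) in enumerate(zip(weights, profits)):
        if fixed.get(j) is True:
            base += p
            capacity -= w
        elif fixed.get(j) is not False:
            free.append((w, p))
    if capacity < 0:
        return float("-inf")
    dp = [0] * (capacity + 1)
    for w, p in free:
        for c in range(capacity, w - 1, -1):
            dp[c] = max(dp[c], dp[c - w] + p)
    return base + dp[capacity]

weights = [3, 4, 5, 2]
profits = [4, 5, 6, 3]
c = 9
z = kp_max(c, weights, profits)               # optimum uses items {0, 1, 3}

# exact lower tolerance limit for the profit of item j = 1, which is in
# the optimal solution: p_j may fall to p_j - (z - z_without_j)
j = 1
z_out = kp_max(c, weights, profits, fixed={j: False})
lower_limit = profits[j] - (z - z_out)
```

The upper-limit case for an excluded item is symmetric (force the item in instead), so each exact limit costs one extra DP solve.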

13.
Introduction of classical swine fever virus (CSFV) is a continuing threat to the pig production sector in the European Union. A scenario tree model was developed to obtain more insight into the main risk factors determining the probability of CSFV introduction (P(CSFV)). As this model contains many uncertain input parameters, sensitivity analysis was used to indicate which of these parameters influence model results most. Group screening, combined with the statistical techniques of design of experiments and meta-modeling, was applied to detect the most important uncertain input parameters among a total of 257 parameters. The response variable chosen was the annual P(CSFV) into the Netherlands. Only 128 scenario calculations were needed to specify the final meta-model. A subsequent one-at-a-time sensitivity analysis was performed with the main effects of this meta-model to explore their impact on the ranking of the risk factors contributing most to the annual P(CSFV). The results indicated that the model outcome is most sensitive to the uncertain input parameters concerning the expected number of classical swine fever epidemics in Germany, Belgium, and the United Kingdom and the probability that CSFV survives in an empty livestock truck traveling over a distance of 0-900 km.
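The screening pipeline (two-level design → linear meta-model → main-effect ranking) can be sketched on a stand-in function; the run count, parameter count, and coefficients below are illustrative, not those of the CSFV model:

```python
import numpy as np

rng = np.random.default_rng(4)
n_params, n_runs = 10, 128

# two-level design: each uncertain parameter at its low (-1) or high (+1) value
X = rng.choice([-1.0, 1.0], size=(n_runs, n_params))

def scenario_model(x):
    """Stand-in for the expensive scenario tree model: three parameters
    dominate the response, the rest contribute weakly."""
    return 3.0 * x[0] + 2.0 * x[3] + 1.5 * x[7] + 0.1 * x.sum()

y = np.apply_along_axis(scenario_model, 1, X)

# linear meta-model y ~ b0 + sum_j b_j x_j fitted by least squares;
# |b_j| ranks the main effects for the follow-up one-at-a-time analysis
A = np.hstack([np.ones((n_runs, 1)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
ranking = np.argsort(-np.abs(coef[1:]))
```

The point of the design is economy: a modest number of scenario runs (128 in the article, for 257 parameters via group screening) suffices to identify the handful of dominant factors.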

14.
Quantitative models support investigators in several risk analysis applications. The calculation of sensitivity measures is an integral part of this analysis. However, it becomes a computationally challenging task, especially when the number of model inputs is large and the model output is spread over orders of magnitude. We introduce and test a new method for the estimation of global sensitivity measures. The new method relies on the intuition of exploiting the empirical cumulative distribution function of the simulator output. This choice allows the estimators of global sensitivity measures to be based on numbers between 0 and 1, thus fighting the curse of sparsity. For density-based sensitivity measures, we devise an approach based on moving averages that bypasses kernel-density estimation. We compare the new method to approaches for calculating popular risk analysis global sensitivity measures as well as to approaches for computing dependence measures gathering increasing interest in the machine learning and statistics literature (the Hilbert–Schmidt independence criterion and distance covariance). The comparison also involves the number of operations needed to obtain the estimates, an aspect often neglected in global sensitivity studies. We let the estimators undergo several tests, first with the wing-weight test case, then with a computationally challenging code with up to k = 30,000 inputs, and finally with the traditional Level E benchmark code.
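The core intuition, working on the empirical-CDF scale so that all quantities live between 0 and 1 even when the raw output spans orders of magnitude, can be sketched as follows. The rank-based transform and binned first-order estimator here are simplified stand-ins for the paper's estimators, and the toy model is our own:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 100_000
x1 = rng.uniform(size=N)
x2 = rng.uniform(size=N)
y = np.exp(8 * x1) + x2        # output spread over several orders of magnitude

def ecdf_transform(y):
    """Map each output value to its empirical CDF value in (0, 1]."""
    ranks = np.argsort(np.argsort(y)) + 1
    return ranks / len(y)

def first_order(x, t, n_classes=50):
    """Binned given-data estimate of Var(E[t|x]) / Var(t)."""
    chunks = np.array_split(np.argsort(x), n_classes)
    means = np.array([t[c].mean() for c in chunks])
    w = np.array([len(c) for c in chunks])
    grand = np.average(means, weights=w)
    return np.average((means - grand) ** 2, weights=w) / t.var()

u = ecdf_transform(y)
# sensitivities on the raw scale vs. the bounded CDF scale
s1_raw, s1_u = first_order(x1, y), first_order(x1, u)
s2_raw, s2_u = first_order(x2, y), first_order(x2, u)
```

Both scales agree on the ranking here, but the CDF-scale statistics are built from numbers in (0, 1], which is what stabilizes estimation when the raw output is heavy-tailed.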

15.
Criteria to protect aquatic life are intended to protect diverse ecosystems, but in practice they are usually developed from compilations of single‐species toxicity tests using standard test organisms tested in laboratory environments. Species sensitivity distributions (SSDs) developed from these compilations are extrapolated to set aquatic ecosystem criteria. The protectiveness of the approach was critically reviewed with a chronic SSD for cadmium comprising 27 species within 21 genera. Within the data set, one genus had lower cadmium effects concentrations than the SSD fifth‐percentile‐based criterion, so in theory this genus, the amphipod Hyalella, could be lost, or at least allowed some level of harm, under this approach. However, population matrix modeling projected only slightly increased extinction risks for a temperate Hyalella population under scenarios similar to the SSD fifth percentile criterion. The criterion value was further compared to cadmium effects concentrations in ecosystem experiments and field studies. Generally, few adverse effects were inferred from ecosystem experiments at concentrations less than the SSD fifth percentile criterion. Exceptions were behavioral impairments in simplified food web studies. No adverse effects were apparent in field studies under conditions that seldom exceeded the criterion. At concentrations greater than the SSD fifth percentile, the magnitudes of adverse effects in the field studies were roughly proportional to the laboratory‐based fraction of species with adverse effects in the SSD. Overall, the modeling and field validation comparisons of the chronic criterion values generally supported the relevance and protectiveness of the SSD fifth percentile approach for cadmium.

16.
In risk analysis problems, the decision‐making process is supported by the utilization of quantitative models. Assessing the relevance of interactions is essential information in the interpretation of model results. With such knowledge, analysts and decision‐makers are able to understand whether risk is apportioned by individual factor contributions or by their joint action. However, models are oftentimes large, requiring a high number of input parameters, and complex, with individual model runs being time consuming. Computational complexity leads analysts to utilize one‐parameter‐at‐a‐time sensitivity methods, which prevent one from assessing interactions. In this work, we illustrate a methodology to quantify interactions in probabilistic safety assessment (PSA) models by varying one parameter at a time. The method is based on a property of the functional ANOVA decomposition of a finite change that allows one to exactly determine the relevance of factors when considered individually or together with their interactions with all other factors. A set of test cases illustrates the technique. We apply the methodology to the analysis of the core damage frequency of the large loss‐of‐coolant accident of a nuclear reactor. Numerical results reveal the nonadditive model structure, quantify the relevance of interactions, and identify the direction of change (increase or decrease in risk) implied by individual factor variations and by their cooperation.
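The finite-change idea can be shown on a toy model: one-at-a-time finite changes give the individual effects, and the residual of the total change exposes the interaction contribution. A minimal sketch (the model and the change are our own illustration, not a PSA model):

```python
def f(x):
    x1, x2, x3 = x
    return x1 * x2 + x3        # the product term creates an interaction

x_base = [1.0, 1.0, 1.0]       # base point
x_new = [2.0, 3.0, 2.0]        # changed point

total = f(x_new) - f(x_base)   # total finite change of the output

# one-at-a-time finite changes: move a single parameter from base to new
phi = []
for j in range(3):
    xj = list(x_base)
    xj[j] = x_new[j]
    phi.append(f(xj) - f(x_base))

# residual of the additive decomposition = interaction contribution
interaction = total - sum(phi)
```

Here the residual equals (Δx1)(Δx2), the exact second-order term of the finite-change ANOVA decomposition for this model, yet it was obtained from one-at-a-time runs only.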

17.
Sensitivity analysis (SA) methods are a valuable tool for identifying critical control points (CCPs), one of the important steps in the hazard analysis and critical control points (HACCP) approach used to ensure safe food. There are many SA methods used across various disciplines. Furthermore, food safety process risk models pose challenges because they often are highly nonlinear, contain thresholds, and have discrete inputs. Therefore, it is useful to compare and evaluate SA methods based upon applications to an example food safety risk model. Ten SA methods were applied to a draft Vibrio parahaemolyticus (Vp) risk assessment model developed by the Food and Drug Administration. The model was modified so that all inputs were independent. Rankings of key inputs from the different methods were compared. Inputs such as water temperature, number of oysters per meal, and the distributional assumption for the unrefrigerated time were the most important inputs, whereas time on water, fraction of pathogenic Vp, and the distributional assumption for the weight of oysters were the least important inputs. Most of the methods gave a similar ranking of key inputs even though the methods differed in terms of being graphical, mathematical, or statistical, accounting for individual effects or the joint effect of inputs, and being model dependent or model independent. A key recommendation is that methods be further compared by application to different and more complex food safety models. Model‐independent methods, such as ANOVA, the mutual information index, and scatter plots, are expected to be more robust than the others evaluated.

18.
We study the sensitivity of investment to cash flow conditional on measures of q in an adjustment costs framework with costly external finance. We present a benchmark model in which this conditional investment–cash flow sensitivity increases monotonically with the cost premium for external finance, for firms in a financially constrained regime. Using simulated data, we show that this pattern is found in linear regressions that relate investment rates to measures of both cash flow and average q. We also derive a structural equation for investment from the first‐order conditions of our model, and show that this can be estimated directly.

19.
Considering joint replenishment and delivery decisions together, this paper studies an integrated optimization model for multi-firm, multi-product joint replenishment and delivery under stochastic demand with backorders permitted. A Hybrid Differential Evolution (HDE) algorithm is designed to solve the model, and numerical examples comparing it with a genetic algorithm and the standard DE algorithm confirm that HDE is efficient and stable. In addition, a two-stage replenish-then-deliver optimization model is designed for comparison; the results show that under supply chain coordination the replenishment cost is higher, the delivery cost is lower, and the total cost is lower. Finally, a sensitivity analysis of the relevant parameters shows that variations in the demand rate and the inventory holding cost affect the total cost far more than the minor ordering cost does.

20.
We have studied the sensitivity of health impacts from nuclear reactor accidents, as predicted by the CRAC2 computer code, to the following sources of uncertainty: (1) the model for plume rise, (2) the model for wet deposition, (3) the meteorological bin-sampling procedure for selecting weather sequences with rain, (4) the dose conversion factors for inhalation as affected by uncertainties in the particle size of the carrier aerosol and the clearance rates of radionuclides from the respiratory tract, (5) the weathering half-time for external ground-surface exposure, and (6) the transfer coefficients for terrestrial foodchain pathways. Predicted health impacts usually showed little sensitivity to the use of an alternative plume-rise model or a modified rain-bin structure in bin-sampling. Health impacts often were quite sensitive to the use of an alternative wet-deposition model in single-trial runs with rain during plume passage, but were less sensitive to the model in bin-sampling runs. Uncertainties in the inhalation dose conversion factors had important effects on early injuries in single-trial runs. Latent cancer fatalities were moderately sensitive to uncertainties in the weathering half-time for ground-surface exposure, but showed little sensitivity to the transfer coefficients for terrestrial foodchain pathways. The sensitivities of CRAC2 predictions to uncertainties in the models and parameters also depended on the magnitude of the source term, and some of the impacts on early health effects were comparable to those due solely to the selection of different sets of weather sequences in bin-sampling.
