Similar Documents (20 results)
1.
The bounding analysis methodology described by Ha-Duong et al. (this issue) is logically incomplete and invites serious misuse and misinterpretation, as their own example and interpretation illustrate. A key issue is the extent to which these problems are inherent in their methodology and resolvable by a logically complete assessment (such as Monte Carlo or Bayesian risk assessment), as opposed to being general problems in any risk-assessment methodology. I here attempt to apportion the problems between those inherent in the proposed bounding analysis and those that are more general, such as reliance on questionable expert elicitations. I conclude that the specific methodology of Ha-Duong et al. suffers from logical gaps in the definition and construction of inputs, and hence should not be used in the form proposed. Furthermore, the labor required to do a sound bounding analysis is great enough that one may as well skip that analysis and carry out a more logically complete probabilistic analysis, one that will better inform the consumer of the appropriate level of uncertainty. If analysts insist on carrying out a bounding analysis in place of more thorough assessments, extensive analyses of sensitivity to inputs and assumptions will be essential to display uncertainties, arguably more essential than they would be in full probabilistic analyses.

2.
The field of comparative risk analysis of electrical energy alternatives has traditionally been plagued by highly uncertain estimates of risk rates, and consequently by conflicting judgements of relative risk. To the extent that this uncertainty arises from traditional sources (imperfect observations or actual variance in the data), it can be brought within a Bayesian statistical framework which allows policy conclusions to be formulated and tested at different levels of confidence. It is shown that there are important methodological or "artifactual" sources of uncertainty, however, that cannot be treated by statistical means; these require conceptual advances for their resolution. By identifying these sources of uncertainty in simple thought experiments and examples, it is shown in what ways the concept of attributable risk, which is the policy-maker's chief concern, must be sharpened and refined to have unambiguous meaning. The conventional "multilinear" formula for calculating risk indices is challenged as a measure of attributable risk, and directions for further research to improve the methodological foundations of comparative risk analysis are identified.

3.
This article demonstrates application of sensitivity analysis to risk assessment models with two-dimensional probabilistic frameworks that distinguish between variability and uncertainty. A microbial food safety process risk (MFSPR) model is used as a test bed. The process of identifying key controllable inputs and key sources of uncertainty using sensitivity analysis is challenged by typical characteristics of MFSPR models such as nonlinearity, thresholds, interactions, and categorical inputs. Among many available sensitivity analysis methods, analysis of variance (ANOVA) is evaluated in comparison to commonly used methods based on correlation coefficients. In a two-dimensional risk model, the identification of key controllable inputs that can be prioritized with respect to risk management is confounded by uncertainty. However, as shown here, ANOVA provided robust insights regarding controllable inputs most likely to lead to effective risk reduction despite uncertainty. ANOVA appropriately selected the top six important inputs, while correlation-based methods provided misleading insights. Bootstrap simulation is used to quantify uncertainty in ranks of inputs due to sampling error. For the selected sample size, differences in F values of 60% or more were associated with clear differences in rank order between inputs. Sensitivity analysis results identified inputs related to the storage of ground beef servings at home as the most important. Risk management recommendations are suggested in the form of a consumer advisory for better handling and storage practices.
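A sketch of the ANOVA screening described above, on a toy model (the inputs, threshold, and coefficients here are invented for illustration, not taken from the MFSPR model): each input is cut into equal-count bins and a one-way F statistic of the risk output across bins is computed, which picks up threshold and interaction effects that plain correlation coefficients can miss.

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(1)
n = 5000
# Hypothetical inputs: temperature abuse, storage time, initial dose
temp = rng.uniform(0, 10, n)
time = rng.uniform(0, 48, n)
dose = rng.lognormal(0, 1, n)
# Toy risk model with a threshold: growth only above 5 degrees of abuse
risk = np.where(temp > 5, 0.01 * temp * time, 0.001 * time) + 0.01 * dose

def f_value(x, y, bins=10):
    """One-way ANOVA F statistic of y across equal-count bins of x."""
    edges = np.quantile(x, np.linspace(0, 1, bins + 1))
    groups = np.digitize(x, edges[1:-1])
    return f_oneway(*(y[groups == g] for g in range(bins))).statistic

scores = {name: f_value(x, risk)
          for name, x in [("temp", temp), ("time", time), ("dose", dose)]}
print(sorted(scores, key=scores.get, reverse=True))
```

In this toy setup the storage-related inputs dominate the dose input, mirroring the kind of ranking the abstract reports.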

4.
Jan F. Van Impe, Risk Analysis, 2011, 31(8): 1295-1307
The aim of quantitative microbiological risk assessment is to estimate the risk of illness caused by the presence of a pathogen in a food type, and to study the impact of interventions. Because of inherent variability and uncertainty, risk assessments are generally conducted stochastically, and where possible it is advised to characterize variability separately from uncertainty. Sensitivity analysis indicates to which of the input variables the outcome of a quantitative microbiological risk assessment is most sensitive. Although a number of methods exist for applying sensitivity analysis to a risk assessment with probabilistic input variables (such as contamination, storage temperature, storage duration, etc.), it is challenging to perform sensitivity analysis when a risk assessment includes a separate characterization of variability and uncertainty of input variables. A procedure is proposed that focuses on the relation between risk estimates obtained by Monte Carlo simulation and the location of pseudo-randomly sampled input variables within the uncertainty and variability distributions. Within this procedure, two methods, an ANOVA-like model and Sobol sensitivity indices, are used to obtain and compare the impact of variability and of uncertainty of all input variables, and of model uncertainty and scenario uncertainty. As a case study, this methodology is applied to a risk assessment estimating the risk of contracting listeriosis from consumption of deli meats.
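The Sobol part of such a procedure can be sketched with a standard pick-freeze (Saltelli-type) first-order estimator on a toy model; the model and its inputs below are hypothetical, not the listeriosis assessment.

```python
import numpy as np

rng = np.random.default_rng(0)

def model(x):
    # Hypothetical: initial load grows with temperature; minor dilution term
    return x[:, 0] * np.exp(0.3 * x[:, 1]) + 0.1 * x[:, 2]

n, d = 20000, 3
A = rng.uniform(0, 1, (n, d))   # two independent sample matrices
B = rng.uniform(0, 1, (n, d))
fA, fB = model(A), model(B)
var_y = np.var(np.concatenate([fA, fB]))

S1 = []
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]         # "freeze" column i from B into A
    # first-order Sobol index estimator for input i
    S1.append(np.mean(fB * (model(ABi) - fA)) / var_y)
print(np.round(S1, 2))
```

The first input carries nearly all of the first-order output variance here, which is the kind of attribution the procedure compares across variability and uncertainty distributions.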

5.
Scott Janzwood, Risk Analysis, 2023, 43(10): 2004-2016
Outside of the field of risk analysis, an important theoretical conversation on the slippery concept of uncertainty has unfolded over the last 40 years within the adjacent field of environmental risk. This literature has become increasingly standardized behind the tripartite distinction between uncertainty location, the nature of uncertainty, and uncertainty level, popularized by the "W&H framework." This article introduces risk theorists and practitioners to the conceptual literature on uncertainty with the goal of catalyzing further development and clarification of the uncertainty concept within the field of risk analysis. It presents two critiques of the W&H framework's dimension of uncertainty level—the dimension that attempts to define the characteristics separating greater uncertainties from lesser uncertainties. First, I argue the framework's conceptualization of uncertainty level lacks a clear and consistent epistemological position and fails to acknowledge or reconcile the tensions between Bayesian and frequentist perspectives present within the framework. This article reinterprets the dimension of uncertainty level from a Bayesian perspective, which understands uncertainty as a mental phenomenon arising from "confidence deficits" as opposed to the ill-defined notion of "knowledge deficits" present in the framework. And second, I elaborate the undertheorized concept of uncertainty "reducibility." These critiques inform a clarified conceptualization of uncertainty level that can be integrated with risk analysis concepts and usefully applied by modelers and decisionmakers engaged in model-based decision support.

6.
The role of the risk analyst is critical in understanding and managing uncertainty. However, there is another type of uncertainty that is rarely discussed: the legal, social, and reputational liabilities of the risk analyst. Recent events have shown that professionals participating in risk analysis can be held personally liable. It is therefore timely and important to ask: how can risk science guide risk analysis with consideration of those liabilities, particularly in response to emerging and unprecedented risks? This paper studies the topic by (1) categorizing how professionals with risk analysis responsibilities have historically been held liable, and (2) developing a framework to address uncertainty related to those potential liabilities. The framework will enable individual analysts and organizations to investigate and manage the expectations of risk analysts and others as they apply risk principles and methods. This paper will be of interest to risk researchers, risk professionals, and industry professionals who seek maturity within their risk programs.

7.
Several major risk studies have been performed in recent years in the maritime transportation domain. These studies have had significant impact on management practices in the industry. The first, the Prince William Sound risk assessment, was reviewed by the National Research Council and found to be promising but incomplete, as the uncertainty in its results was not assessed. The difficulty in assessing this uncertainty is the different techniques that need to be used to model risk in this dynamic and data-scarce application area. In previous articles, we have developed the two pieces of methodology necessary to assess uncertainty in maritime risk assessment, a Bayesian simulation of the occurrence of situations with accident potential and a Bayesian multivariate regression analysis of the relationship between factors describing these situations and expert judgments of accident risk. In this article, we combine the methods to perform a full-scale assessment of risk and uncertainty for two case studies. The first is an assessment of the effects of proposed ferry service expansions in San Francisco Bay. The second is an assessment of risk for the Washington State Ferries, the largest ferry system in the United States.

8.
Methods for Uncertainty Analysis: A Comparative Survey
This paper presents a survey and comparative evaluation of methods which have been developed for the determination of uncertainties in accident consequences and probabilities, for use in probabilistic risk assessment. The methods considered are: analytic techniques, Monte Carlo simulation, response surface approaches, differential sensitivity techniques, and evaluation of classical statistical confidence bounds. It is concluded that only the response surface and differential sensitivity approaches are sufficiently general and flexible for use as overall methods of uncertainty analysis in probabilistic risk assessment. The other methods considered, however, are very useful in particular problems.

9.
Behavioral economics has captured the interest of scholars and the general public by demonstrating ways in which individuals make decisions that appear irrational. While increasing attention is being focused on the implications of this research for the design of risk‐reducing policies, less attention has been paid to how it affects the economic valuation of policy consequences. This article considers the latter issue, reviewing the behavioral economics literature and discussing its implications for the conduct of benefit‐cost analysis, particularly in the context of environmental, health, and safety regulations. We explore three concerns: using estimates of willingness to pay or willingness to accept compensation for valuation, considering the psychological aspects of risk when valuing mortality‐risk reductions, and discounting future consequences. In each case, we take the perspective that analysts should avoid making judgments about whether values are "rational" or "irrational." Instead, they should make every effort to rely on well‐designed studies, using ranges, sensitivity analysis, or probabilistic modeling to reflect uncertainty. More generally, behavioral research has led some to argue for a more paternalistic approach to policy analysis. We argue instead for continued focus on describing the preferences of those affected, while working to ensure that these preferences are based on knowledge and careful reflection.

10.
Bayesian networks (BNs) are graphical modeling tools that are generally recommended for exploring what‐if scenarios, visualizing systems and problems, and for communication between stakeholders during decision making. In this article, we investigate their potential for exploring different perspectives in trade disputes. To do so, we draw on a specific case study that was arbitrated by the World Trade Organization (WTO): the Australia‐New Zealand apples dispute. The dispute centered on disagreement about judgments contained within Australia's 2006 import risk analysis (IRA). We built a range of BNs of increasing complexity that modeled various approaches to undertaking IRAs, from the basic qualitative and semi‐quantitative risk analyses routinely performed in government agencies, to the more complex quantitative simulation undertaken by Australia in the apples dispute. We found the BNs useful for exploring disagreements under uncertainty because they are probabilistic and transparently represent steps in the analysis. Different scenarios and evidence can easily be entered. Specifically, we explore the sensitivity of the risk output to different judgments (particularly volume of trade). Thus, we explore how BNs could usefully aid WTO dispute settlement. We conclude that BNs are preferable to basic qualitative and semi‐quantitative risk analyses because they offer an accessible interface and are mathematically sound. However, most current BN modeling tools are limited compared with complex simulations, as was used in the 2006 apples IRA. Although complex simulations may be more accurate, they are a black box for stakeholders. BNs have the potential to be a transparent aid to complex decision making, but they are currently computationally limited. Recent technological software developments are promising.
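A minimal sketch of the kind of what-if propagation a BN supports (all probabilities below are invented, not taken from the apples IRA): the output risk is the volume-weighted mixture of conditional risks, so sensitivity to the disputed trade-volume judgment can be read off directly.

```python
# Hypothetical two-node fragment of a BN:
# P(risk) = sum over volume states of P(volume) * P(risk | volume)
p_risk_given_volume = {"low": 0.001, "high": 0.020}

def p_risk(p_high):
    """Marginal risk probability as a function of the trade-volume judgment."""
    return ((1 - p_high) * p_risk_given_volume["low"]
            + p_high * p_risk_given_volume["high"])

for p_high in (0.1, 0.5, 0.9):   # vary the disputed judgment
    print(p_high, round(p_risk(p_high), 4))
```

Because every step is an explicit conditional table, a stakeholder can see exactly which judgment moves the answer, which is the transparency argument the abstract makes against black-box simulation.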

11.
In risk assessment, moment-independent sensitivity analysis (SA) techniques for reducing model uncertainty have attracted a great deal of attention from analysts and practitioners. They aim at measuring the relative importance of an individual input, or a set of inputs, in determining the uncertainty of the model output by looking at the entire distribution range of the model output. In this article, along the lines of Plischke et al., we point out that the original moment-independent SA index (also called the delta index) can also be interpreted as a dependence measure between model output and input variables, and we introduce another moment-independent SA index (the extended delta index) based on copulas. Nonparametric methods for estimating the delta and extended delta indices are then proposed. Both methods need only a single set of samples to compute all the indices; thus, they overcome the "curse of dimensionality." Finally, an analytical test example, a risk assessment model, and the Level E model are employed to compare the delta and extended delta indices and to test the two calculation methods. Results show that the delta and extended delta indices produce the same importance ranking in these three test examples. It is also shown that the two proposed calculation methods dramatically reduce the computational burden.
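One possible given-data estimate of the delta index, using histogram density estimates on a single sample set (the test model below is a toy, not one of the article's examples): delta is the input-weighted average L1 distance between the unconditional output density and the output density conditional on a bin of the input.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100_000
x1, x2 = rng.uniform(0, 1, n), rng.uniform(0, 1, n)
y = x1 ** 2 + 0.1 * x2          # toy model: x1 dominates the output

def delta(x, y, mx=20, my=30):
    """Histogram estimate of the moment-independent (delta) index of x."""
    ybins = np.quantile(y, np.linspace(0, 1, my + 1))
    f_y = np.histogram(y, ybins)[0] / len(y)       # unconditional density
    xedges = np.quantile(x, np.linspace(0, 1, mx + 1))
    groups = np.digitize(x, xedges[1:-1])          # equal-count bins of x
    d = 0.0
    for g in range(mx):
        yg = y[groups == g]
        f_g = np.histogram(yg, ybins)[0] / len(yg)  # conditional density
        d += (len(yg) / len(y)) * 0.5 * np.abs(f_g - f_y).sum()
    return d

print(round(delta(x1, y), 2), round(delta(x2, y), 2))
```

Both indices come from the same sample, with no re-sampling per input, which is the "single set of samples" economy the abstract emphasizes.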

12.
Many different techniques have been proposed for performing uncertainty and sensitivity analyses on computer models for complex processes. The objective of the present study is to investigate the applicability of three widely used techniques to three computer models having large uncertainties and varying degrees of complexity in order to highlight some of the problem areas that must be addressed in actual applications. The following approaches to uncertainty and sensitivity analysis are considered: (1) response surface methodology based on input determined from a fractional factorial design; (2) Latin hypercube sampling with and without regression analysis; and (3) differential analysis. These techniques are investigated with respect to (1) ease of implementation, (2) flexibility, (3) estimation of the cumulative distribution function of the output, and (4) adaptability to different methods of sensitivity analysis. With respect to these criteria, the technique using Latin hypercube sampling and regression analysis had the best overall performance. The models used in the investigation are well documented, thus making it possible for researchers to make comparisons of other techniques with the results in this study.
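The best-performing combination, Latin hypercube sampling plus regression, can be sketched as follows (the input scalings and response model are invented): stratified uniforms are mapped to inputs, and standardized regression coefficients serve as the sensitivity measure.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 3

def latin_hypercube(n, d, rng):
    """One point per equal-probability stratum in every dimension."""
    strata = rng.permuted(np.tile(np.arange(n), (d, 1)), axis=1).T
    return (strata + rng.uniform(size=(n, d))) / n

U = latin_hypercube(n, d, rng)
X = np.column_stack([10 * U[:, 0], 50 * U[:, 1], U[:, 2]])  # hypothetical scalings
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1, n)     # toy response

# Standardized regression coefficients as the sensitivity measure
Z = np.column_stack([np.ones(n), (X - X.mean(0)) / X.std(0)])
src = np.linalg.lstsq(Z, (y - y.mean()) / y.std(), rcond=None)[0][1:]
print(np.round(src, 2))
```

The stratification keeps the output's cumulative distribution well estimated at modest sample sizes, which is criterion (3) in the study's comparison.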

13.
In a quantitative model with uncertain inputs, the uncertainty of the output can be summarized by a risk measure. We propose a sensitivity analysis method based on derivatives of the output risk measure, in the direction of model inputs. This produces a global sensitivity measure, explicitly linking sensitivity and uncertainty analyses. We focus on the case of distortion risk measures, defined as weighted averages of output percentiles, and prove a representation of the sensitivity measure that can be evaluated on a Monte Carlo sample, as a weighted average of gradients over the input space. When the analytical model is unknown or hard to work with, nonparametric techniques are used for gradient estimation. This process is demonstrated through the example of a nonlinear insurance loss model. Furthermore, the proposed framework is extended in order to measure sensitivity to constant model parameters, uncertain statistical parameters, and random factors driving dependence between model inputs.

14.
Recent work in the assessment of risk in maritime transportation systems has used simulation-based probabilistic risk assessment techniques. In the Prince William Sound and Washington State Ferries risk assessments, the studies' recommendations were backed up by estimates of their impact made using such techniques and all recommendations were implemented. However, the level of uncertainty about these estimates was not available, leaving the decisionmakers unsure whether the evidence was sufficient to assess specific risks and benefits. The first step toward assessing the impact of uncertainty in maritime risk assessments is to model the uncertainty in the simulation models used. In this article, a study of the impact of proposed ferry service expansions in San Francisco Bay is used as a case study to demonstrate the use of Bayesian simulation techniques to propagate uncertainty throughout the analysis. The conclusions drawn in the original study are shown, in this case, to be robust to the inherent uncertainties. The main intellectual merit of this work is the development of Bayesian simulation technique to model uncertainty in the assessment of maritime risk. However, Bayesian simulations have been implemented only as theoretical demonstrations. Their use in a large, complex system may be considered state of the art in the field of computational sciences.

15.
M. C. Kennedy, Risk Analysis, 2011, 31(10): 1597-1609
Two‐dimensional Monte Carlo simulation is frequently used to implement probabilistic risk models, as it allows for uncertainty and variability to be quantified separately. In many cases, we are interested in the proportion of individuals from a variable population exceeding a critical threshold, together with uncertainty about this proportion. In this article we introduce a new method that can accurately estimate these quantities much more efficiently than conventional algorithms. We also show how those model parameters having the greatest impact on the probabilities of rare events can be quickly identified via this method. The algorithm combines elements from well‐established statistical techniques in extreme value theory and Bayesian analysis of computer models. We demonstrate the practical application of these methods with a simple example, in which the true distributions are known exactly, and also with a more realistic model of microbial contamination of milk with seven parameters. For the latter, sensitivity analysis (SA) is shown to identify the two inputs explaining the majority of variation in distribution tail behavior. In the subsequent prediction of probabilities of large contamination events, similar results are obtained using the new approach taking 43 seconds or the conventional simulation that requires more than 3 days.
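For contrast, a bare-bones conventional two-dimensional Monte Carlo (all distributions below are hypothetical) makes the two loops explicit: uncertain population parameters outside, individual variability inside. It is this inner loop that becomes prohibitively expensive for rare events, motivating the abstract's faster method.

```python
import numpy as np

rng = np.random.default_rng(42)
threshold = 2.0   # hypothetical critical level (log10 cfu per serving)

n_unc, n_var = 200, 2000
exceed = np.empty(n_unc)
for j in range(n_unc):
    # Outer loop: one draw of the uncertain population parameters
    mu = rng.normal(0.0, 0.3)
    sd = abs(rng.normal(1.0, 0.1))
    # Inner loop: variability across individual servings
    logC = rng.normal(mu, sd, n_var)
    exceed[j] = np.mean(logC > threshold)

lo, med, hi = np.percentile(exceed, [2.5, 50, 97.5])
print(f"exceedance proportion: median {med:.4f}, 95% interval [{lo:.4f}, {hi:.4f}]")
```

Each outer draw yields one exceedance proportion, so the spread of `exceed` is pure uncertainty about a variability statistic, exactly the quantity the new method targets.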

16.
Currently, there is a trend away from the use of single (often conservative) estimates of risk to summarize the results of risk analyses in favor of stochastic methods which provide a more complete characterization of risk. The use of such stochastic methods leads to a distribution of possible values of risk, taking into account both uncertainty and variability in all of the factors affecting risk. In this article, we propose a general framework for the analysis of uncertainty and variability for use in the commonly encountered case of multiplicative risk models, in which risk may be expressed as a product of two or more risk factors. Our analytical methods facilitate the evaluation of overall uncertainty and variability in risk assessment, as well as the contributions of individual risk factors to both uncertainty and variability, an evaluation that is cumbersome using Monte Carlo methods. The use of these methods is illustrated in the analysis of potential cancer risks due to the ingestion of radon in drinking water.
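When each factor is lognormal, the analytic route is immediate because log-risk is a sum of independent normals: each factor's share of the total log-variance quantifies its contribution, with no simulation needed. The factor names and values below are invented for illustration.

```python
import numpy as np

# Hypothetical multiplicative model: risk = concentration * intake * potency,
# each factor lognormal with geometric mean gm and geometric sd gsd
factors = {"concentration": (1.0, 2.0), "intake": (2.0, 1.5), "potency": (0.01, 3.0)}

mu = sum(np.log(gm) for gm, gsd in factors.values())        # mean of log-risk
var = sum(np.log(gsd) ** 2 for gm, gsd in factors.values()) # variance of log-risk
shares = {k: np.log(gsd) ** 2 / var for k, (gm, gsd) in factors.items()}

# Risk is itself lognormal, so percentiles follow in closed form
p95 = np.exp(mu + 1.645 * np.sqrt(var))
print({k: round(v, 2) for k, v in shares.items()}, round(p95, 4))
```

A Monte Carlo run would recover the same 95th percentile, but the variance decomposition by factor falls out of the sum directly here.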

17.
Comparative risk projects can provide broad policy guidance but they rarely have adequate scientific foundations to support precise risk rankings. Many extant projects report rankings anyway, with limited attention to uncertainty. Stochastic uncertainty, structural uncertainty, and ignorance are types of incertitude that afflict risk comparisons. The recently completed New Jersey Comparative Risk Project was innovative in trying to acknowledge and accommodate some historically ignored uncertainties in a substantive manner. This article examines the methods used and lessons learned from the New Jersey project. Monte Carlo techniques were used to characterize stochastic uncertainty, and sensitivity analysis helped to manage structural uncertainty. A deliberative process and a sorting technique helped manage ignorance. Key findings are that stochastic rankings can be calculated but they reveal such an alarming degree of imprecision that the rankings are no longer useful, whereas sorting techniques are helpful in spite of uncertainty. A deliberative process is helpful to counter analytical overreaching.
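The imprecision of stochastic rankings is easy to reproduce (the hazard scores below are invented): even with well-separated geometric means, no hazard is ranked first with anything near certainty once the uncertainty spreads are realistic.

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical geometric means and geometric sds for five hazards
gm = np.array([5.0, 4.0, 3.0, 1.0, 0.9])
gsd = np.array([3.0, 3.0, 2.0, 1.5, 3.0])

# Monte Carlo draws of the uncertain scores (lognormal)
draws = np.exp(np.log(gm) + np.log(gsd) * rng.standard_normal((10000, 5)))
top = draws.argmax(axis=1)                     # which hazard ranks first per draw
top_prob = np.bincount(top, minlength=5) / len(draws)
print(np.round(top_prob, 2))
```

Sorting hazards into a few broad bins, rather than reporting a total order, is robust to exactly this kind of rank instability.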

18.
The transfer ratio of bacteria from one surface to another is often estimated from laboratory experiments and quantified by dividing the expected number of bacteria on the recipient surface by the expected number of bacteria on the donor surface. Yet, the expected number of bacteria on each surface is uncertain due to the limited number of colonies that are counted and/or samples that can be analyzed. The expected transfer ratio is, therefore, also uncertain and its estimate may exceed 1 if real transfer is close to 100%. In addition, the transferred fractions vary over experiments but it is unclear, using this approach, how to combine uncertainty and variability into one estimate for the transfer ratio. A Bayesian network model was proposed that allows the combination of uncertainty within one experiment and variability over multiple experiments and prevents inappropriate values for the transfer ratio. Model functionality was shown using data from a laboratory experiment in which the transfer of Salmonella was determined from contaminated pork meat to a butcher's knife, and vice versa. Recovery efficiency of bacteria from both surfaces was also determined and accounted for in the analysis. Transfer ratio probability distributions showed a large variability, with a mean value of 0.19 for the transfer of Salmonella from pork meat to the knife and 0.58 for the transfer of Salmonella from the knife to pork meat. The proposed Bayesian model can be used for analyzing data from similar study designs in which uncertainty should be combined with variability.
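A conjugate sketch of the core idea (colony counts invented; the article's full model also corrects for recovery efficiency, which is omitted here): a Beta posterior per experiment keeps every transfer ratio inside [0, 1], and pooling posterior draws across experiments combines within-experiment uncertainty with between-experiment variability.

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical counts per experiment: (colonies transferred, colonies on donor)
experiments = [(12, 60), (45, 70), (5, 55), (30, 62)]

pooled = []
for k, n in experiments:
    # Beta(1, 1) prior + binomial counts -> Beta posterior for this experiment
    pooled.append(rng.beta(1 + k, 1 + n - k, 5000))
samples = np.concatenate(pooled)   # mixture over experiments adds variability

print(round(samples.mean(), 2), round(np.percentile(samples, 97.5), 2))
```

Unlike the plug-in ratio of expected counts, no posterior draw can exceed 1, however close real transfer is to 100%.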

19.
Three methods for making a consumer product safety decision were evaluated on scales relating to their perceived acceptability, logical soundness, completeness, and sensitivity to moral and ethical concerns. Two of the methods were formalized techniques: cost-benefit analysis and risk analysis. The third method involved abiding by standard industry practices. Other factors in the decision-making context were also varied. The results indicated that formalized techniques were preferred over the standard practices method. Within the formalized methods, cost-benefit analysis was judged less acceptable than a comparable method that did not involve making explicit value tradeoffs. All methods were judged more acceptable when they led to improved product safety. Knowledge of consequences did not exert a direct effect on judgments, though it did interact significantly with other variables. The results are discussed in terms of judgmental processes that people apply when evaluating decision methods.

20.
A unique multidisciplinary perspective on the risk literature is used to establish a fresh and provocative argument regarding the epistemological understanding and definition of risk. Building on economic conceptualizations that distinguish risk from uncertainty and argue that risk is an ordered application of knowledge to the unknown, the survey identifies each of the disciplines as having a particular knowledge approach with which they confront the unknown so as to order its randomness and convert it into a risk proposition. This epistemological approach suggests the concept of risk can act as a mirror, reflecting the preoccupations, strengths, and weaknesses of each discipline as they grapple with uncertainty. The conclusion suggests that the different disciplines can, and desirably should, act in concert toward a cumulative appreciation of risk that advances our understanding of the concept. One way in which the article challenges risk experts to join disciplinary forces in a collaborative effort is to holistically appreciate and articulate the concept of political risk calculation.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号