Similar Literature
20 similar documents found (search time: 15 ms)
1.
A wide range of uncertainties is inevitably introduced during the process of performing a safety assessment of engineering systems. The impact of all these uncertainties must be addressed if the analysis is to serve as a tool in the decision-making process. Uncertainties present in the components of the model (input parameters or basic events) are propagated to the model output to quantify their impact on the final results. Several methods are available in the literature, namely, the method of moments, discrete probability analysis, Monte Carlo simulation, fuzzy arithmetic, and Dempster-Shafer theory. These methods differ in how uncertainty is characterized at the component level and in how it is propagated to the system level, and each has desirable and undesirable features that make it more or less useful in different situations. In the probabilistic framework, which is the most widely used, probability distributions are used to characterize uncertainty. However, in situations in which one cannot specify (1) parameter values for input distributions, (2) precise probability distributions (shape), and (3) dependencies between input parameters, these methods have limitations and are not effective. In order to address some of these limitations, the article presents uncertainty analysis in the context of level-1 probabilistic safety assessment (PSA) based on a probability bounds (PB) approach. PB analysis combines probability theory and interval arithmetic to produce probability boxes (p-boxes), structures that allow uncertainty to be propagated through calculations comprehensively and rigorously. A practical case study is also carried out with a code developed on the PB approach, and the results are compared with those of a two-phase Monte Carlo simulation.
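The p-box idea in this abstract can be illustrated with a minimal sketch: when a distribution's shape is assumed known but one of its parameters is only known to lie in an interval, the CDF is bounded by an envelope of CDFs. The specific numbers (a normal model with an interval-valued mean) are illustrative assumptions, not taken from the article:

```python
from statistics import NormalDist

def pbox_bounds(x, mu_interval, sigma):
    """Bounds on P(X <= x) when X ~ Normal(mu, sigma) and mu is known only
    to lie in an epistemic interval (a simple p-box)."""
    mu_lo, mu_hi = mu_interval
    upper = NormalDist(mu_lo, sigma).cdf(x)  # CDF is largest for the smallest mean
    lower = NormalDist(mu_hi, sigma).cdf(x)  # and smallest for the largest mean
    return lower, upper

# illustrative numbers: a failure-rate-like quantity with an interval-valued mean
lo, up = pbox_bounds(1.0e-3, (8.0e-4, 1.2e-3), 2.0e-4)
print(f"P(X <= 1e-3) can only be bounded: [{lo:.3f}, {up:.3f}]")
```

Rigorous p-box arithmetic on dependent inputs is considerably more involved; this only shows the envelope construction for a single quantity.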

2.
Terje Aven 《Risk analysis》2010,30(3):354-360
It is a common perspective in risk analysis that there are two kinds of uncertainty: (i) variability resulting from heterogeneity and stochasticity (aleatory uncertainty) and (ii) partial ignorance or epistemic uncertainty resulting from systematic measurement error and lack of knowledge. Probability theory is recognized as the proper tool for treating aleatory uncertainty, but there are different views on the best approach for describing partial ignorance and epistemic uncertainty. Subjective probabilities are often used to represent this type of ignorance and uncertainty, but several alternative approaches have been suggested, including interval analysis, probability bounds analysis, and bounds based on evidence theory. It is argued that probability theory generates too-precise results when the background knowledge of the probabilities is poor. In this article, we look more closely into this issue. We argue that this critique of probability theory rests on a conception of risk assessment as a tool for objectively reporting the true risk and variabilities. If risk assessment is seen instead as a method for describing the analysts' (and possibly other stakeholders') uncertainties about unknown quantities, the alternative approaches (such as interval analysis) often fail to provide the necessary decision support.

3.
In risk analysis, the treatment of the epistemic uncertainty associated with the probability of occurrence of an event is fundamental. Traditionally, probability distributions have been used to characterize the epistemic uncertainty due to imprecise knowledge of the parameters in risk models. On the other hand, it has been argued that in certain instances such uncertainty may be best accounted for by fuzzy or possibilistic distributions. This seems to be the case in particular for parameters for which the available information is scarce and of a qualitative nature. In practice, it is to be expected that a risk model contains some parameters affected by uncertainties that may be best represented by probability distributions and other parameters that may be more properly described by fuzzy or possibilistic distributions. In this article, a hybrid method that jointly propagates probabilistic and possibilistic uncertainties is considered and compared with pure probabilistic and pure fuzzy methods for uncertainty propagation. The analyses are carried out on a case study concerning the uncertainties in the probabilities of occurrence of accident sequences in an event tree analysis of a nuclear power plant.
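A minimal sketch of the hybrid propagation described above, under invented assumptions: a failure-probability model with one probabilistic input (a lognormal rate) and one possibilistic input (a triangular fuzzy mission time). An outer Monte Carlo loop samples the probabilistic variable, while alpha-cuts propagate the fuzzy variable as intervals (the model is monotone, so interval endpoints map to endpoints):

```python
import math
import random

random.seed(1)

def tri_alpha_cut(a, m, b, alpha):
    # alpha-cut of a triangular fuzzy number (a, m, b) -> interval
    return a + alpha * (m - a), b - alpha * (b - m)

def failure_prob(lam, t):
    # illustrative model: P(failure before t) for a constant-rate process
    return 1.0 - math.exp(-lam * t)

alphas = (0.0, 0.5, 1.0)
N = 2000
cuts = {a: [] for a in alphas}
for _ in range(N):
    lam = random.lognormvariate(math.log(1e-3), 0.3)       # probabilistic input
    for a in alphas:
        t_lo, t_hi = tri_alpha_cut(80.0, 100.0, 130.0, a)  # fuzzy input
        # model is increasing in t, so the interval endpoints map to endpoints
        cuts[a].append((failure_prob(lam, t_lo), failure_prob(lam, t_hi)))

band = {}
for a in alphas:
    lo = sum(c[0] for c in cuts[a]) / N
    hi = sum(c[1] for c in cuts[a]) / N
    band[a] = (lo, hi)
    print(f"alpha={a:.1f}: mean failure probability in [{lo:.4f}, {hi:.4f}]")
```

The nested bands narrow as alpha increases and collapse to a point at alpha = 1, which is the qualitative signature of this kind of hybrid output.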

4.
A simple and useful characterization of many predictive models is in terms of model structure and model parameters. Accordingly, uncertainties in model predictions arise from uncertainties in the values assumed by the model parameters (parameter uncertainty) and from the uncertainties and errors associated with the structure of the model (model uncertainty). When assessing uncertainty, one is interested in identifying, at some level of confidence, the range of possible and then probable values of the unknown of interest. All sources of uncertainty and variability need to be considered. Although parameter uncertainty assessment has been extensively discussed in the literature, model uncertainty is a relatively new topic of discussion in the scientific community, despite often being the major contributor to the overall uncertainty. This article describes a Bayesian methodology for the assessment of model uncertainties in which models are treated as sources of information on the unknown of interest. The general framework is then specialized for the case where models provide point estimates about a single-valued unknown and where information about the models is available in the form of homogeneous and nonhomogeneous performance data (pairs of experimental observations and model predictions). Several example applications to physical models used in fire risk analysis are also provided.
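The Bayesian treatment of performance data can be sketched in a simplified form: treat the log of the observation/prediction ratio as normal with an unknown mean (the model's multiplicative bias) and apply a conjugate normal update. The data pairs, prior, and known-variance assumption are all hypothetical simplifications, not the article's actual methodology:

```python
import math

def posterior_log_bias(pairs, prior_mu=0.0, prior_var=0.5 ** 2, noise_var=0.3 ** 2):
    """Conjugate normal update on the log multiplicative bias of a model,
    from (observation, prediction) performance pairs; variances assumed known."""
    n = len(pairs)
    xbar = sum(math.log(o / p) for o, p in pairs) / n
    post_var = 1.0 / (1.0 / prior_var + n / noise_var)
    post_mu = post_var * (prior_mu / prior_var + n * xbar / noise_var)
    return post_mu, post_var

# hypothetical performance data, e.g. measured vs. predicted gas temperatures
pairs = [(510.0, 450.0), (620.0, 560.0), (480.0, 430.0)]
mu, var = posterior_log_bias(pairs)
factor = math.exp(mu)                  # posterior-mean bias factor
print(f"bias factor ~ {factor:.3f}; adjusted prediction {500.0 * factor:.1f}")
```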

5.
In this work, we study the effect of epistemic uncertainty on the ranking and categorization of elements of probabilistic safety assessment (PSA) models. We show that, while in a deterministic setting a PSA element belongs to a given category univocally, in the presence of epistemic uncertainty an element belongs to a given category only with a certain probability. We propose an approach to estimate these probabilities, showing that their knowledge allows one to appreciate "the sensitivity of component categorizations to uncertainties in the parameter values" (U.S. NRC Regulatory Guide 1.174). We investigate the meaning and utilization of an assignment method based on the expected value of importance measures, and we discuss the problem of evaluating changes in quality assurance, maintenance activity prioritization, etc., in the presence of epistemic uncertainty. We show that the inclusion of epistemic uncertainty in the evaluation makes it necessary to evaluate changes through their effect on PSA model parameters. We propose a categorization of parameters based on the Fussell-Vesely and differential importance (DIM) measures. In addition, issues arise in the calculation of the expected value of the joint importance measure when evaluating changes that affect groups of components; we illustrate that this problem can be solved using DIM. A numerical application to a case study concludes the work.
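The idea of category membership becoming probabilistic can be illustrated with a toy sketch: a hypothetical two-cut-set fault tree, lognormal epistemic uncertainty on the basic-event probabilities, and a Fussell-Vesely threshold for "high significance." The fault tree, medians, and threshold are invented for illustration and are not the article's case study:

```python
import math
import random

random.seed(42)

def system_unavail(qa, qb, qc):
    # hypothetical fault tree: TOP = A OR (B AND C)
    return qa + qb * qc - qa * qb * qc

def fussell_vesely(q, name):
    """FV importance: fractional drop in top-event probability when the
    basic event `name` is removed (set to zero)."""
    base = system_unavail(q["a"], q["b"], q["c"])
    qz = dict(q)
    qz[name] = 0.0
    return (base - system_unavail(qz["a"], qz["b"], qz["c"])) / base

N, threshold = 5000, 0.005
medians = {"a": 1e-4, "b": 1e-2, "c": 1e-4}   # illustrative unavailabilities
counts = {n: 0 for n in medians}
for _ in range(N):
    # epistemic uncertainty: lognormal distributions on the event probabilities
    q = {n: random.lognormvariate(math.log(m), 0.8) for n, m in medians.items()}
    for n in counts:
        if fussell_vesely(q, n) > threshold:
            counts[n] += 1

for n in medians:
    print(f"P(event {n} categorized 'high significance') = {counts[n] / N:.2f}")
```

Event a ends up in the high-significance category in essentially every epistemic sample, while b and c straddle the threshold, so their categorization is only probabilistic.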

6.
A recent report by the National Academy of Sciences estimates that the radiation dose to the bronchial epithelium, per working level month (WLM) of radon daughter exposure, is about 30% lower for residential exposures than for exposures received in underground mines. Adjusting the previously published BEIR IV radon risk model accordingly, the unit risk for indoor exposures of the general population is about 2.2 x 10^-4 lung cancer deaths (lcd)/WLM. Using results from EPA's National Residential Radon Survey, the average radon level is estimated to be about 1.25 pCi/L, and the annual average exposure about 0.242 WLM. Based on these estimates, 13,600 radon-induced lcd/yr are projected for the United States. A quantitative uncertainty analysis was performed that considers: statistical uncertainties in the epidemiological studies of radon-exposed miners; the dependence of risk on age at, and time since, exposure; the extrapolation of risk estimates from mines to homes based on comparative dosimetry; and uncertainties in radon daughter levels in homes and in average residential occupancy. Based on this assessment of the uncertainties in the unit risk and exposure estimates, an uncertainty range of 7,000-30,000 lcd/yr is derived.
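The central projection in this abstract is a simple product, which can be checked directly. The unit risk and average exposure come from the abstract; the roughly 250 million U.S. population figure is an assumption supplied here:

```python
# Reconstruction of the population-risk arithmetic in the abstract.
unit_risk = 2.2e-4     # lung cancer deaths per WLM (adjusted BEIR IV)
exposure = 0.242       # average WLM per person-year (from 1.25 pCi/L)
population = 2.5e8     # assumed early-1990s U.S. population

deaths_per_year = unit_risk * exposure * population
print(f"projected radon-induced lung cancer deaths/yr: {deaths_per_year:,.0f}")
```

This reproduces the order of the quoted 13,600 lcd/yr; the exact figure depends on the population assumed.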

7.
There are many uncertainties in a probabilistic risk analysis (PRA). We identify the different types of uncertainties and describe their implications. We then summarize the uncertainty analyses that have been performed in current PRAs and characterize the results that have been obtained. We draw conclusions regarding interpretations of uncertainties, the areas having the largest uncertainties, and the needs that exist in uncertainty analysis. Finally, we characterize the robustness of various uses of PRA results.

8.
The treatment of uncertainties associated with modeling and risk assessment has recently attracted significant attention. The methodology and guidance for dealing with parameter uncertainty have been fairly well developed, and quantitative tools such as Monte Carlo modeling are often recommended. However, the issue of model uncertainty is still rarely addressed in practical applications of risk assessment. The use of several alternative models to derive a range of model outputs or risks is one of the few available techniques. This article addresses the often-overlooked issue of what we call "modeler uncertainty," i.e., differences in problem formulation, model implementation, and parameter selection originating from subjective interpretation of the problem at hand. This study uses results from the Fruit Working Group, which was created under the International Atomic Energy Agency (IAEA) BIOMASS program (BIOsphere Modeling and ASSessment). The model-model and model-data intercomparisons reviewed in this study were conducted by the working group for three different scenarios. The greatest uncertainty was found to result from modelers' interpretation of the scenarios and the approximations they made. In scenarios that were unclear to modelers, the initial differences in model predictions were as high as seven orders of magnitude. Only after several meetings and discussions about specific assumptions did the predictions of the various models converge. Our study shows that parameter uncertainty (as evaluated by a probabilistic Monte Carlo assessment) may have contributed over one order of magnitude to the overall modeling uncertainty. The final model predictions ranged over one to three orders of magnitude, depending on the specific scenario. This study illustrates the importance of problem formulation and of implementing an analytic-deliberative process in risk characterization.

9.
Using an analogy with the history of physical measurements, population projections, and energy projections, I analyze the trends in several data sets to quantify the overconfidence of experts in the reliability of their uncertainty estimates. The data sets include (i) time trends in sequential measurements of the same physical quantity; (ii) national population projections; and (iii) projections for the U.S. energy sector. Probabilities of large deviations of the true values are parametrized by an exponential distribution with the slope determined by the data. Statistics of past errors can be used in probabilistic risk assessment to hedge against unsuspected uncertainties and to include the possibility of human error in the framework of uncertainty analysis. By means of a sample Monte Carlo simulation of cancer risk caused by ingestion of benzene in soil, I demonstrate how the upper 95th percentiles of risk change when unsuspected uncertainties are included. I recommend inflating the estimated uncertainties by default safety factors determined from the relevant historical data sets.
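The abstract's exponential parametrization of large deviations suggests a simple recipe for a default safety factor, sketched here under invented assumptions: the historical normalized errors below are illustrative, not from the article's data sets, and the exponential-tail fit is a crude stand-in for the author's analysis:

```python
import math

# hypothetical normalized errors |true - estimate| / reported_sigma from
# historical records; the values below are invented for illustration
hist_z = [0.4, 0.9, 1.1, 1.6, 2.3, 2.9, 3.5, 4.2, 5.0, 7.1]

slope = sum(hist_z) / len(hist_z)   # MLE scale of an exponential tail model
t95 = slope * math.log(20)          # |z| bound covering 95% under that model
factor = t95 / 1.96                 # vs. the nominal Gaussian 95% bound
print(f"default safety factor: inflate reported sigmas by ~{factor:.1f}x")
```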

10.
Projecting losses associated with hurricanes is a complex and difficult undertaking that is fraught with uncertainties. Hurricane Charley, which struck southwest Florida on August 13, 2004, illustrates the uncertainty of forecasting damages from these storms. Due to shifts in the track and the rapid intensification of the storm, real-time estimates grew from $2-3 billion in losses late on August 12 to a brief peak of $50 billion as the storm appeared to be headed for the Tampa Bay area. The storm hit the resort areas of Charlotte Harbor near Punta Gorda and then went on to Orlando in the central part of the state, with early poststorm estimates converging on a damage estimate in the $28-31 billion range. Comparable damage to central Florida had not been seen since Hurricane Donna in 1960. The Florida Commission on Hurricane Loss Projection Methodology (FCHLPM) has recognized the role of computer models in projecting losses from hurricanes. The FCHLPM established a professional team to perform onsite (confidential) audits of computer models developed by several different companies in the United States that seek to have their models approved for use in insurance rate filings in Florida. The team's members represent the fields of actuarial science, computer science, meteorology, statistics, and wind and structural engineering. An important part of the auditing process requires uncertainty and sensitivity analyses to be performed with the applicant's proprietary model. To inform future such analyses, an uncertainty and sensitivity analysis has been completed for loss projections arising from use of a Holland B parameter hurricane wind field model. The uncertainty analysis quantifies the expected percentage reduction in the uncertainty of wind speed and loss that is attributable to each input variable.
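The Holland B wind field model referenced above is based on the Holland (1980) gradient-wind profile, which is compact enough to sketch directly. The storm parameters (central pressure deficit, radius of maximum winds, B, latitude) are illustrative values, not the study's inputs:

```python
import math

def holland_wind(r_km, delta_p_hpa, r_max_km, b_param, lat_deg=25.0, rho=1.15):
    """Gradient wind speed (m/s) at radius r for a Holland (1980) B-parameter
    pressure profile. Storm parameters below are illustrative only."""
    r, r_max = r_km * 1e3, r_max_km * 1e3
    dp = delta_p_hpa * 100.0                              # hPa -> Pa
    f = 2 * 7.292e-5 * math.sin(math.radians(lat_deg))    # Coriolis parameter
    a = (r_max / r) ** b_param
    cyclo = b_param * dp / rho * a * math.exp(-a)
    return math.sqrt(cyclo + (r * f / 2) ** 2) - r * f / 2

v_max = holland_wind(30.0, 60.0, 30.0, 1.4)    # at the radius of maximum winds
v_far = holland_wind(100.0, 60.0, 30.0, 1.4)
print(f"wind speed: {v_max:.1f} m/s at 30 km, {v_far:.1f} m/s at 100 km")
```

A sensitivity analysis of the kind the abstract describes would vary delta_p, r_max, and B over their uncertainty ranges and attribute the wind-speed (and hence loss) variance to each.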

11.
12.
Yakov Ben‐Haim 《Risk analysis》2012,32(10):1638-1646
Risk analysis is challenged in three ways by uncertainty. Our understanding of the world and its uncertainties is evolving; indeterminism is an inherent part of the open universe in which we live; and learning from experience involves untestable assumptions. We discuss several concepts of robustness as tools for responding to these epistemological challenges. The use of models is justified, even though they are known to err. A concept of robustness is illustrated in choosing between a conventional technology and an innovative, promising, but more uncertain technology. We explain that nonprobabilistic robust decisions are sometimes good probabilistic bets. Info‐gap and worst‐case concepts of robustness are compared. Finally, we examine the exploitation of favorable but uncertain opportunities and its relation to robust decision making.
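The robustness concept used in the technology-choice illustration can be sketched in a toy info-gap form: robustness is the largest horizon of uncertainty a decision can absorb while still meeting a performance requirement. The linear gain model and all numbers are invented for illustration, not from the article:

```python
def robustness(gain_est, required):
    """Info-gap robustness under fractional uncertainty: the largest horizon h
    such that the worst case gain_est * (1 - h) still meets `required`."""
    return max(0.0, 1.0 - required / gain_est)

required = 100.0
conventional = robustness(gain_est=120.0, required=required)  # modest, familiar
innovative = robustness(gain_est=150.0, required=required)    # promising, newer
print(f"robustness: conventional {conventional:.2f}, innovative {innovative:.2f}")
```

In richer info-gap models the uncertainty horizon can differ between options, so the robustness curves may cross and the "more promising" option is not automatically the more robust one.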

13.
Terje Aven 《Risk analysis》2013,33(12):2082-2091
Recently, several authors have presented interesting contributions on how to meet deep or severe uncertainties in a risk analysis setting. In this article, we reflect on some of the foundational pillars this work is based on, including the meaning of concepts such as deep uncertainty, known probabilities, and correct models, the aim being to strengthen the scientific platform of the work and to provide new insights on how best to implement management policies that meet these uncertainties. We also provide perspectives on the boundaries and limitations of analytical approaches for supporting decision making in cases of deep uncertainty. A main conclusion of the article is that deep uncertainties call for managerial review and judgment that sees beyond the analytical frameworks studied in risk assessment and risk management contexts, including those now often suggested for use, such as robust optimization techniques. This managerial review and judgment should be seen as a basic element of risk management.

14.
Probabilistic risk analysis (PRA) can be an effective tool to assess risks and uncertainties and to set priorities among safety policy options. Based on systems analysis and Bayesian probability, PRA has been applied to a wide range of cases, three of which are briefly presented here: the maintenance of the tiles of the space shuttle, the management of patient risk in anesthesia, and the choice of seismic provisions of building codes for the San Francisco Bay Area. In the quantification of a risk, a number of problems arise in the public sector where multiple stakeholders are involved. In this article, I describe different approaches to the treatment of uncertainties in risk analysis, their implications for risk ranking, and the role of risk analysis results in the context of a safety decision process. I also discuss the implications of adopting conservative hypotheses before proceeding to what is, in essence, a conditional uncertainty analysis, and I explore some implications of different levels of "conservatism" for the ranking of risk mitigation measures.

15.
Scott Jordan 《Decision Sciences》1988,19(3):672-681
Production lines are often modeled as queueing networks with finite inventory between each stage. Little is known, however, about the average production rate and inventory levels when the service distribution at each stage is normal. This paper approximates the service distribution using iterative methods rather than simulation. The results show that iterative methods are useful when the problem is small and that approximation of the service distribution, by another distribution with the same mean and variance, is valid for steady-state results such as average production rate or average inventory level.
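The system being approximated can be sketched with a small simulation (the paper itself uses iterative methods, not simulation): a two-stage line with a finite intermediate buffer, truncated-normal service times, and the standard departure-time recursion for blocking after service. All parameter values are illustrative:

```python
import random

random.seed(0)

def line_throughput(n_jobs, buffer, mean=(1.0, 1.0), sd=(0.3, 0.3)):
    """Two-stage line, finite intermediate buffer, truncated-normal service
    times, blocking after service; standard departure-time recursion."""
    d1 = [0.0] * (n_jobs + 1)   # departure times from stage 1
    d2 = [0.0] * (n_jobs + 1)   # departure times from stage 2
    for k in range(1, n_jobs + 1):
        s1 = max(0.0, random.gauss(mean[0], sd[0]))
        s2 = max(0.0, random.gauss(mean[1], sd[1]))
        # job k leaves stage 1 only once job k-buffer-1 frees space downstream
        freed = d2[k - buffer - 1] if k - buffer - 1 >= 1 else 0.0
        d1[k] = max(d1[k - 1] + s1, freed)
        d2[k] = max(d1[k], d2[k - 1]) + s2
    return n_jobs / d2[n_jobs]

rates = {}
for b in (0, 1, 4):
    rates[b] = line_throughput(20000, b)
    print(f"buffer={b}: average production rate ~ {rates[b]:.3f}")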

16.
This article discusses recent experiences with the Numeral Unit Spread Assessment Pedigree (NUSAP) system for multidimensional uncertainty assessment, based on four case studies that vary in complexity. We show that the NUSAP method is applicable not only to relatively simple calculation schemes but also to complex models in a meaningful way and that NUSAP is useful to assess not only parameter uncertainty but also (model) assumptions. A diagnostic diagram can be used to synthesize results of quantitative analysis of parameter sensitivity and qualitative review (pedigree analysis) of parameter strength. It provides an analytic tool to prioritize uncertainties according to quantitative and qualitative insights in the limitations of available knowledge. We show that extension of the pedigree scheme to include societal dimensions of uncertainty, such as problem framing and value-laden assumptions, further promotes reflexivity and collective learning. When used in a deliberative setting, NUSAP pedigree assessment has the potential to foster a deeper social debate and a negotiated management of complex environmental problems.
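The diagnostic diagram's logic reduces to crossing a quantitative axis (sensitivity) with a qualitative one (pedigree strength) and prioritizing parameters that are influential but weakly underpinned. The parameter names, scores, and thresholds below are invented for illustration:

```python
# illustrative parameter scores: (sensitivity share, pedigree strength in [0, 1])
params = {
    "emission_factor": (0.55, 0.35),
    "stack_height":    (0.10, 0.90),
    "wind_speed":      (0.25, 0.70),
    "deposition_rate": (0.10, 0.25),
}

# the diagnostic diagram's "danger zone": influential but weakly underpinned
flagged = [n for n, (sens, ped) in params.items() if sens > 0.2 and ped < 0.5]

for name, (sens, ped) in sorted(params.items(), key=lambda p: -p[1][0]):
    note = "PRIORITIZE" if name in flagged else ""
    print(f"{name:16s} sensitivity={sens:.2f} pedigree={ped:.2f} {note}")
```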

17.
For the multimodal transport route selection problem under a carbon-trading policy, this article considers the case in which transport times and unit freight rates are uncertain with unknown probability distributions, and applies robust optimization to the problem. Box uncertainty sets are first used to characterize the transport times and freight rates whose distributions are unknown. Then, building on a deterministic model under the carbon-trading policy, a route selection model with adjustable robustness is constructed, and a relatively tractable robust counterpart is obtained through dual transformation. A numerical example shows that the robust model handles the route selection problem with unknown parameter distributions well, allowing decisionmakers to adjust the uncertainty budget level according to their preferences. Uncertainty in transport times and in unit freight rates both affect multimodal route decisions, but through different mechanisms. Extending the model from the carbon-trading policy to other low-carbon policies shows that a combination of low-carbon policies achieves multimodal transport emission reductions more effectively.
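The budget-of-uncertainty idea behind the adjustable-robustness model can be sketched without the full dual formulation: each leg's time lies in a box around its nominal value, and at most Gamma legs take their worst-case value. The routes and all numbers are invented for illustration, not the article's data:

```python
# candidate routes as lists of legs: (nominal time, max deviation)
routes = {
    "rail-road":  [(10.0, 1.0), (6.0, 2.0)],
    "road-only":  [(9.0, 5.0), (5.0, 5.0)],
    "rail-water": [(12.0, 0.5), (7.0, 0.5), (2.0, 0.5)],
}

def robust_time(legs, gamma):
    """Worst-case time when at most `gamma` legs hit their deviation bound
    (a box uncertainty set with a budget, in the spirit of Bertsimas-Sim)."""
    nominal = sum(t for t, _ in legs)
    worst = sorted((d for _, d in legs), reverse=True)[:gamma]
    return nominal + sum(worst)

choice = {}
for gamma in (0, 1, 2):
    choice[gamma] = min(routes, key=lambda r: robust_time(routes[r], gamma))
    print(f"Gamma={gamma}: choose {choice[gamma]}")
```

Raising the budget flips the decision from the cheapest nominal route to a route with smaller deviations, which is exactly the preference-dependent adjustability the abstract describes.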

18.
Many different techniques have been proposed for performing uncertainty and sensitivity analyses on computer models for complex processes. The objective of the present study is to investigate the applicability of three widely used techniques to three computer models having large uncertainties and varying degrees of complexity in order to highlight some of the problem areas that must be addressed in actual applications. The following approaches to uncertainty and sensitivity analysis are considered: (1) response surface methodology based on input determined from a fractional factorial design; (2) Latin hypercube sampling with and without regression analysis; and (3) differential analysis. These techniques are investigated with respect to (1) ease of implementation, (2) flexibility, (3) estimation of the cumulative distribution function of the output, and (4) adaptability to different methods of sensitivity analysis. With respect to these criteria, the technique using Latin hypercube sampling and regression analysis had the best overall performance. The models used in the investigation are well documented, thus making it possible for researchers to make comparisons of other techniques with the results in this study.

19.
A probabilistic risk analysis (PRA) for a high-level radioactive waste repository is very important since it gives an estimate of its health impacts, allowing comparisons to be made with the health impacts of competing technologies. However, it is extremely difficult to develop a credible PRA for a specific repository site because of large uncertainties in future climate, hydrology, geological processes, etc. At best, such a PRA would not be understandable to the public. An alternative proposed here is to develop a PRA for an average U.S. site, taking all properties of the site to be the U.S. average. The results are equivalent to the average results for numerous randomly selected sites. Such a PRA is presented here; it is easy to understand, and it is not susceptible to substantial uncertainty. Applying the results to a specific repository site then requires only a simple, intuitively acceptable "leap of faith" in assuming that with large expenditures of effort and money, experts can select a site that would be at least as secure as a randomly selected site.

20.
At present, the weighted average is a common method for aggregating satisfaction survey results, but it presupposes that the decisionmaker's preference structure satisfies additive independence; otherwise, a nonlinear aggregation method is needed. This article considers the case in which the decisionmaker's preferences do not satisfy additive independence. It jointly treats the confidence levels and confidence intervals produced by the sampling process of user satisfaction surveys and the uncertain evaluations that respondents give in the questionnaires, and it applies an evidential reasoning method whose mass function values are interval numbers to aggregate satisfaction survey results expressed as confidence intervals obtained from the sampling survey. An empirical analysis of a user satisfaction survey at a network information center concludes the article.
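The evidential reasoning machinery can be illustrated in a simplified, point-valued form (the article's method uses interval-valued masses, which is more involved): two hypothetical survey strata assign mass to "satisfied," "dissatisfied," and the whole frame (for "not sure" answers), and Dempster's rule combines them into belief/plausibility bounds:

```python
from itertools import product

def dempster(m1, m2):
    """Dempster's rule of combination for mass functions whose focal
    elements are frozensets."""
    combined, conflict = {}, 0.0
    for (a, pa), (b, pb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + pa * pb
        else:
            conflict += pa * pb
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

S, D = frozenset({"sat"}), frozenset({"dissat"})
U = S | D                      # "not sure" answers put mass on the whole frame
m1 = {S: 0.6, D: 0.2, U: 0.2}  # two illustrative survey strata
m2 = {S: 0.5, D: 0.3, U: 0.2}

m = dempster(m1, m2)
belief_sat = m.get(S, 0.0)
plaus_sat = sum(v for k, v in m.items() if k & S)
print(f"satisfaction: belief={belief_sat:.3f}, plausibility={plaus_sat:.3f}")
```

The belief/plausibility gap carries the residual ignorance forward instead of collapsing it into a single weighted average, which is the point of using evidence theory here.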


Copyright©北京勤云科技发展有限公司  京ICP备09084417号