20 similar records found; search took 15 ms.
1.
Treatment of Uncertainty in Performance Assessments for Complex Systems (Times cited: 13; self-citations: 0; citations by others: 13)
Jon C. Helton, Risk Analysis, 1994, 14(4): 483-511
When viewed at a high level, performance assessments (PAs) for complex systems involve two types of uncertainty: stochastic uncertainty, which arises because the system under study can behave in many different ways, and subjective uncertainty, which arises from a lack of knowledge about quantities required within the computational implementation of the PA. Stochastic uncertainty is typically incorporated into a PA with an experimental design based on importance sampling and leads to the final results of the PA being expressed as a complementary cumulative distribution function (CCDF). Subjective uncertainty is usually treated with Monte Carlo techniques and leads to a distribution of CCDFs. This presentation discusses the use of the Kaplan/Garrick ordered triple representation for risk in maintaining a distinction between stochastic and subjective uncertainty in PAs for complex systems. The topics discussed include (1) the definition of scenarios and the calculation of scenario probabilities and consequences, (2) the separation of subjective and stochastic uncertainties, (3) the construction of CCDFs required in comparisons with regulatory standards (e.g., 40 CFR Part 191, Subpart B for the disposal of radioactive waste), and (4) the performance of uncertainty and sensitivity studies. Results obtained in a preliminary PA for the Waste Isolation Pilot Plant, an uncertainty and sensitivity analysis of the MACCS reactor accident consequence analysis model, and the NUREG-1150 probabilistic risk assessments are used for illustration.
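The empirical CCDF construction described in this abstract can be sketched in a few lines. The sampling scheme and consequence values below are purely illustrative, not taken from the paper:

```python
import random

def ccdf(samples, thresholds):
    """Empirical complementary CDF: P(consequence > c) for each threshold c."""
    n = len(samples)
    return [sum(1 for s in samples if s > c) / n for c in thresholds]

random.seed(0)
# Hypothetical stochastic consequence model: lognormally distributed releases
releases = [random.lognormvariate(0.0, 1.0) for _ in range(10_000)]
curve = ccdf(releases, thresholds=[0.5, 1.0, 2.0, 5.0])
# A CCDF is non-increasing in the threshold by construction
assert all(a >= b for a, b in zip(curve, curve[1:]))
```

In the full two-stage analysis sketched by the abstract, the outer Monte Carlo loop over subjective (epistemic) inputs would repeat this construction, yielding a family of such curves rather than a single one.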
2.
Risk Analysis, 2018, 38(1): 163-176
The U.S. Environmental Protection Agency (EPA) uses health risk assessment to help inform its decisions in setting national ambient air quality standards (NAAQS). EPA's standard approach is to make epidemiologically‐based risk estimates based on a single statistical model selected from the scientific literature, called the “core” model. The uncertainty presented for “core” risk estimates reflects only the statistical uncertainty associated with that one model's concentration‐response function parameter estimate(s). However, epidemiologically‐based risk estimates are also subject to “model uncertainty,” which is a lack of knowledge about which of many plausible model specifications and data sets best reflects the true relationship between health and ambient pollutant concentrations. In 2002, a National Academies of Sciences (NAS) committee recommended that model uncertainty be integrated into EPA's standard risk analysis approach. This article discusses how model uncertainty can be taken into account with an integrated uncertainty analysis (IUA) of health risk estimates. It provides an illustrative numerical example based on risk of premature death from respiratory mortality due to long‐term exposures to ambient ozone, which is a health risk considered in the 2015 ozone NAAQS decision. This example demonstrates that use of IUA to quantitatively incorporate key model uncertainties into risk estimates produces a substantially altered understanding of the potential public health gain of a NAAQS policy decision, and that IUA can also produce more helpful insights to guide that decision, such as evidence of decreasing incremental health gains from progressive tightening of a NAAQS.
3.
Monte Carlo Techniques for Quantitative Uncertainty Analysis in Public Health Risk Assessments (Times cited: 2; self-citations: 0; citations by others: 2)
Most public health risk assessments assume and combine a series of average, conservative, and worst-case values to derive a conservative point estimate of risk. This procedure has major limitations. This paper demonstrates a new methodology for extended uncertainty analyses in public health risk assessments using Monte Carlo techniques. The extended method begins, as some conventional methods do, with the preparation of a spreadsheet to estimate exposure and risk. This method, however, continues by modeling key inputs as random variables described by probability density functions (PDFs). Overall, the technique provides a quantitative way to estimate the probability distributions for exposure and health risks within the validity of the model used. As an example, this paper presents a simplified case study for children playing in soils contaminated with benzene and benzo(a)pyrene (BaP).
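The replacement of point estimates by input PDFs that the abstract describes can be sketched with a toy exposure model. All distributions and parameter values below are hypothetical placeholders, not the paper's case-study values:

```python
import random
import statistics

random.seed(1)

def one_realization():
    """One Monte Carlo draw through a toy spreadsheet-style exposure model."""
    conc = random.lognormvariate(1.0, 0.5)       # soil concentration, mg/kg (illustrative)
    ingestion = random.triangular(20, 200, 100)  # soil ingestion rate, mg/day (illustrative)
    bw = random.normalvariate(20, 3)             # child body weight, kg (illustrative)
    slope = 0.01                                 # cancer slope factor, held fixed here
    dose = conc * ingestion * 1e-6 / max(bw, 1)  # mg/kg-day, guard against tiny bw draws
    return slope * dose

risks = [one_realization() for _ in range(5_000)]
p50 = statistics.median(risks)
p95 = statistics.quantiles(risks, n=20)[18]  # 95th percentile of the risk distribution
assert p95 > p50 > 0
```

Rather than a single conservative point estimate, the output is a full distribution, from which any percentile of interest can be read.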
4.
Uncertainty and Variability in Health-Related Damages from Coal-Fired Power Plants in the United States (Times cited: 1; self-citations: 0; citations by others: 1)
The health‐related damages associated with emissions from coal‐fired power plants can vary greatly across facilities as a function of plant, site, and population characteristics, but the degree of variability and the contributing factors have not been formally evaluated. In this study, we modeled the monetized damages associated with 407 coal‐fired power plants in the United States, focusing on premature mortality from fine particulate matter (PM2.5). We applied a reduced‐form chemistry‐transport model accounting for primary PM2.5 emissions and the influence of sulfur dioxide (SO2) and nitrogen oxide (NOx) emissions on secondary particulate formation. Outputs were linked with a concentration‐response function for PM2.5‐related mortality that incorporated nonlinearities and model uncertainty. We valued mortality with a value of statistical life approach, characterizing and propagating uncertainties in all model elements. At the median of the plant‐specific uncertainty distributions, damages across plants ranged from $30,000 to $500,000 per ton of PM2.5, $6,000 to $50,000 per ton of SO2, $500 to $15,000 per ton of NOx, and $0.02 to $1.57 per kilowatt‐hour of electricity generated. Variability in damages per ton of emissions was almost entirely explained by population exposure per unit emissions (intake fraction), which itself was related to atmospheric conditions and the population size at various distances from the power plant. Variability in damages per kilowatt‐hour was highly correlated with SO2 emissions, related to fuel and control technology characteristics, but was also correlated with atmospheric conditions and population size at various distances. Our findings emphasize that control strategies that consider variability in damages across facilities would yield more efficient outcomes.
5.
Mixed Levels of Uncertainty in Complex Policy Models (Times cited: 3; self-citations: 0; citations by others: 3)
The characterization and treatment of uncertainty poses special challenges when modeling indeterminate or complex coupled systems such as those involved in the interactions between human activity, climate, and the ecosystem. Uncertainty about model structure may become as important as, or more important than, uncertainty about parameter values. When uncertainty grows so large that prediction or optimization no longer makes sense, it may still be possible to use the model as a behavioral test bed to examine the relative robustness of alternative observational and behavioral strategies. When models must be run into portions of their phase space that are not well understood, different submodels may become unreliable at different rates. A common example involves running a time-stepped model far into the future. Several strategies can be used to deal with such situations. The probability of model failure can be reported as a function of time. Possible alternative surprises can be assigned probabilities, modeled separately, and combined. Finally, through the use of subjective judgments, one may be able to combine, and over time shift between, models, moving from more detailed to progressively simpler order-of-magnitude models, and perhaps ultimately on to simple bounding analysis.
6.
This article presents a new importance analysis framework, called the parametric moment ratio function, for measuring the reduction of model output uncertainty when the distribution parameters of inputs are changed; the emphasis is put on the mean and variance ratio functions with respect to the variances of model inputs. The proposed concepts efficiently guide the analyst to achieve a targeted reduction in the model output mean and variance by operating on the variances of model inputs. The unbiased and progressive unbiased Monte Carlo estimators are also derived for the parametric mean and variance ratio functions, respectively. Only a single set of samples is needed for implementing the proposed importance analysis with the proposed estimators, so the computational cost is independent of the input dimensionality. An analytical test example with highly nonlinear behavior is introduced for illustrating the engineering significance of the proposed importance analysis technique and verifying the efficiency and convergence of the derived Monte Carlo estimators. Finally, the moment ratio function is applied to a planar 10‐bar structure for achieving a targeted 50% reduction of the model output variance.
7.
A sequence of linear, monotonic, and nonmonotonic test problems is used to illustrate sampling-based uncertainty and sensitivity analysis procedures. Uncertainty results obtained with replicated random and Latin hypercube samples are compared, with the Latin hypercube samples tending to produce more stable results than the random samples. Sensitivity results obtained with the following procedures and/or measures are illustrated and compared: correlation coefficients (CCs), rank correlation coefficients (RCCs), common means (CMNs), common locations (CLs), common medians (CMDs), statistical independence (SI), standardized regression coefficients (SRCs), partial correlation coefficients (PCCs), standardized rank regression coefficients (SRRCs), partial rank correlation coefficients (PRCCs), stepwise regression analysis with raw and rank-transformed data, and examination of scatter plots. The effectiveness of a given procedure and/or measure depends on the characteristics of the individual test problems, with (1) linear measures (i.e., CCs, PCCs, SRCs) performing well on the linear test problems, (2) measures based on rank transforms (i.e., RCCs, PRCCs, SRRCs) performing well on the monotonic test problems, and (3) measures predicated on searches for nonrandom patterns (i.e., CMNs, CLs, CMDs, SI) performing well on the nonmonotonic test problems. 
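Two of the ingredients this abstract compares, one-dimensional Latin hypercube sampling and a rank-based sensitivity measure, can be sketched as follows. The monotonic test response `y = x**3` is an illustrative stand-in, not one of the paper's test problems:

```python
import random

def lhs(n, rng):
    """1-D Latin hypercube sample on [0, 1): exactly one point per equal-width stratum."""
    pts = [(i + rng.random()) / n for i in range(n)]
    rng.shuffle(pts)
    return pts

def ranks(xs):
    """1-based ranks of the values in xs (assumes no ties)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0] * len(xs)
    for rank, i in enumerate(order):
        r[i] = rank + 1
    return r

def rank_corr(xs, ys):
    """Spearman rank correlation coefficient (no tie correction)."""
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

rng = random.Random(2)
x = lhs(200, rng)
y = [v ** 3 for v in x]  # monotonic but strongly nonlinear response
# A rank measure recovers the monotonic relationship perfectly,
# where a raw linear correlation coefficient would fall short of 1.
assert abs(rank_corr(x, y) - 1.0) < 1e-9
```

This illustrates the abstract's point (2): rank transforms linearize any monotonic input-output relationship, which is why RCCs, PRCCs, and SRRCs perform well on the monotonic test problems.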
8.
Terje Aven, Risk Analysis, 2010, 30(3): 354-360
It is a common perspective in risk analysis that there are two kinds of uncertainty: (i) variability resulting from heterogeneity and stochasticity (aleatory uncertainty) and (ii) partial ignorance or epistemic uncertainty resulting from systematic measurement error and lack of knowledge. Probability theory is recognized as the proper tool for treating the aleatory uncertainties, but there are different views on what is the best approach for describing partial ignorance and epistemic uncertainties. Subjective probabilities are often used for representing this type of ignorance and uncertainty, but several alternative approaches have been suggested, including interval analysis, probability bound analysis, and bounds based on evidence theory. It is argued that probability theory generates too precise results when the background knowledge of the probabilities is poor. In this article, we look more closely into this issue. We argue that this critique of probability theory is based on a conception of risk assessment as a tool to objectively report on the true risk and variabilities. If risk assessment is seen instead as a method for describing the analysts' (and possibly other stakeholders') uncertainties about unknown quantities, the alternative approaches (such as interval analysis) often fail to provide the necessary decision support.
9.
Safety systems are important components of high-consequence systems that are intended to prevent the unintended operation of the system and thus the potentially significant negative consequences that could result from such an operation. This presentation investigates and illustrates formal procedures for assessing the uncertainty in the probability that a safety system will fail to operate as intended in an accident environment. Probability theory and evidence theory are introduced as possible mathematical structures for the representation of the epistemic uncertainty associated with the performance of safety systems, and a representation of this type is illustrated with a hypothetical safety system involving one weak link and one strong link that is exposed to a high temperature fire environment. Topics considered include (1) the nature of diffuse uncertainty information involving a system and its environment, (2) the conversion of diffuse uncertainty information into the mathematical structures associated with probability theory and evidence theory, and (3) the propagation of these uncertainty structures through a model for a safety system to obtain representations in the context of probability theory and evidence theory of the uncertainty in the probability that the safety system will fail to operate as intended. The results suggest that evidence theory provides a potentially valuable representational tool for the display of the implications of significant epistemic uncertainty in inputs to complex analyses.
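The evidence-theory machinery mentioned in this abstract rests on belief and plausibility measures computed from basic probability assignments over focal sets. The following sketch uses a made-up discretized frame for a failure probability, not the paper's weak-link/strong-link example:

```python
def belief(focal, masses, event):
    """Bel(A): total mass of focal sets wholly contained in A."""
    return sum(m for s, m in zip(focal, masses) if s <= event)

def plausibility(focal, masses, event):
    """Pl(A): total mass of focal sets that intersect A."""
    return sum(m for s, m in zip(focal, masses) if s & event)

# Hypothetical body of evidence about a failure probability, discretized into
# bins: 0 = [0, 1e-4), 1 = [1e-4, 1e-2), 2 = [1e-2, 1].
focal = [{0}, {0, 1}, {0, 1, 2}]  # focal elements (imprecise expert statements)
masses = [0.5, 0.3, 0.2]          # basic probability assignments; must sum to 1
event = {0, 1}                    # "failure probability is below 1e-2"

bel = belief(focal, masses, event)        # 0.5 + 0.3 = 0.8
pl = plausibility(focal, masses, event)   # all three focal sets intersect: 1.0
assert bel <= pl  # the [Bel, Pl] interval brackets any consistent probability
```

The gap between Bel and Pl, which collapses to zero when every focal set is a singleton, is exactly the display of epistemic uncertainty that the abstract argues evidence theory makes visible.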
10.
A better understanding of the uncertainty that exists in models used for seismic risk assessment is critical to improving risk-based decisions pertaining to earthquake safety. Current models estimating the probability of collapse of a building do not consider comprehensively the nature and impact of uncertainty. This article presents a model framework to enhance seismic risk assessment and thus gives decisionmakers a fuller understanding of the nature and limitations of the estimates. This can help ensure that risks are not over- or underestimated and the value of acquiring accurate data is appreciated fully. The methodology presented provides a novel treatment of uncertainties in input variables, their propagation through the model, and their effect on the results. The study presents ranges of possible annual collapse probabilities for different case studies on buildings in different parts of the world, exposed to different levels of seismicity, and with different vulnerabilities. A global sensitivity analysis was conducted to determine the significance of uncertain variables. Two key outcomes are (1) that the uncertainty in ground-motion conversion equations has the largest effect on the uncertainty in the calculation of annual collapse probability; and (2) the vulnerability of a building appears to have an effect on the range of annual collapse probabilities produced, i.e., the level of uncertainty in the estimate of annual collapse probability, with less vulnerable buildings having a smaller uncertainty.
11.
Fayerweather, William E.; Collins, James J.; Schnatter, A. Robert; Hearne, F. Terry; Menning, Reo A.; Reyner, Daniel P. Risk Analysis, 1999, 19(6): 1077-1090
A call for risk assessment approaches that better characterize and quantify uncertainty has been made by the scientific and regulatory community. This paper responds to that call by demonstrating a distributional approach that draws upon human data to derive potency estimates and to identify and quantify important sources of uncertainty. The approach is rooted in the science of decision analysis and employs an influence diagram, a decision tree, probabilistic weights, and a distribution of point estimates of carcinogenic potency. Its results estimate the likelihood of different carcinogenic risks (potencies) for a chemical under a specific scenario. For this exercise, human data on formaldehyde were employed to demonstrate the approach. Sensitivity analyses were performed to determine the relative impact of specific levels and alternatives on the potency distribution. The resulting potency estimates are compared with the results of an exercise using animal data on formaldehyde. The paper demonstrates that distributional risk assessment is readily adapted to situations in which epidemiologic data serve as the basis for potency estimates. Strengths and weaknesses of the distributional approach are discussed. Areas for further application and research are recommended.
12.
Indirect exposures to 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD) and other toxic materials released in incinerator emissions have been identified as a significant concern for human health. As a result, regulatory agencies and researchers have developed specific approaches for evaluating exposures from indirect pathways. This paper presents a quantitative assessment of the effect of uncertainty and variation in exposure parameters on the resulting estimates of TCDD dose rates received by individuals indirectly exposed to incinerator emissions through the consumption of home-grown beef. The assessment uses a nested Monte Carlo model that separately characterizes uncertainty and variation in dose rate estimates. Uncertainty resulting from limited data on the fate and transport of TCDD is evaluated, and variations in estimated dose rates in the exposed population that result from location-specific parameters and individuals' behaviors are characterized. The analysis indicates that lifetime average daily dose rates for individuals living within 10 km of a hypothetical incinerator range over three orders of magnitude. In contrast, the uncertainty in the dose rate distribution appears to vary by less than one order of magnitude, based on the sources of uncertainty included in this analysis. Current guidance for predicting exposures from indirect exposure pathways was found to overestimate the intakes for typical and high-end individuals.
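The nested (two-dimensional) Monte Carlo structure described here, with an outer loop over uncertain quantities and an inner loop over population variability, can be sketched with a toy dose model. The distributions and the `dose_rate` function are illustrative assumptions, not the paper's fate-and-transport model:

```python
import random
import statistics

rng = random.Random(3)

def dose_rate(fate_factor, intake):
    """Toy dose model: an uncertain fate/transport factor times a variable intake."""
    return fate_factor * intake

uncertain_medians = []
for _ in range(200):                     # outer loop: epistemic uncertainty
    fate = rng.lognormvariate(0.0, 0.3)  # uncertain fate/transport factor (illustrative)
    doses = [dose_rate(fate, rng.lognormvariate(0.0, 1.2))  # inner loop: variability
             for _ in range(500)]
    uncertain_medians.append(statistics.median(doses))

# The spread of the population median across outer iterations reflects uncertainty;
# the dose spread within any single outer iteration reflects inter-individual variability.
lo, hi = min(uncertain_medians), max(uncertain_medians)
assert hi / lo < 100  # with these inputs, uncertainty spans far less than the variability
```

Keeping the two loops separate is what allows the abstract's key contrast: a three-order-of-magnitude spread from variability alongside less than one order of magnitude of uncertainty.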
13.
Uncertainty Analysis in Multiplicative Models (Times cited: 3; self-citations: 0; citations by others: 3)
Wout Slob, Risk Analysis, 1994, 14(4): 571-576
Uncertainties are usually evaluated by Monte Carlo analysis. However, multiplicative models with lognormal uncertainties, which are ubiquitous in risk assessments, allow for a simple and quick analytical uncertainty analysis. The necessary formulae are given, which may be evaluated by a desk calculator. Two examples illustrate the method.
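The analytical shortcut this abstract refers to rests on a standard property: a product or quotient of independent lognormal factors is itself lognormal, with the log-scale means adding (with sign) and the log-scale variances adding. A minimal sketch, with hypothetical parameter values rather than the paper's worked examples:

```python
import math

def combine(terms):
    """Combine independent lognormal factors of a multiplicative model.

    terms: list of (mu, sigma, sign) on the log scale,
    sign = +1 for a multiplied factor, -1 for a divided one.
    Returns (mu, sigma) of the lognormal result: mus add with sign,
    variances add regardless of sign.
    """
    mu = sum(s * m for m, _, s in terms)
    var = sum(sd ** 2 for _, sd, _ in terms)
    return mu, math.sqrt(var)

# Hypothetical example: dose = concentration * intake / body weight
mu, sigma = combine([(math.log(10.0), 0.5, +1),   # concentration
                     (math.log(0.2), 0.3, +1),    # intake
                     (math.log(70.0), 0.2, -1)])  # body weight
median = math.exp(mu)                 # geometric mean = median of the result
p95 = math.exp(mu + 1.645 * sigma)    # 95th percentile
assert p95 > median > 0
```

The whole analysis reduces to a few additions and one square root, which is exactly why it is feasible on a desk calculator.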
14.
In the analysis of the risk associated with rare events that may lead to catastrophic consequences with large uncertainty, it is questionable whether the knowledge and information available for the analysis can be reflected properly by probabilities. Approaches other than purely probabilistic have been suggested, for example, using interval probabilities, possibilistic measures, or qualitative methods. In this article, we look into the problem and identify a number of issues that are foundational for its treatment. The foundational issues addressed reflect on the position that “probability is perfect” and take into open consideration the need for an extended framework for risk assessment that reflects the separation that practically exists between analyst and decisionmaker.
15.
Ronald L. Iman, Risk Analysis, 1987, 7(1): 21-33
System unavailabilities for large complex systems such as nuclear power plants are often evaluated through use of fault tree analysis. The system unavailability is obtained from a Boolean representation of a system fault tree. Even after truncation of higher order terms these expressions can be quite large, involving thousands of terms. A general matrix notation is proposed for the representation of Boolean expressions which facilitates uncertainty and sensitivity analysis calculations.
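The underlying fault-tree calculation can be illustrated with the standard minimal-cut-set upper bound on system unavailability. This is a generic textbook formulation, not the paper's matrix notation, and the three-component system is hypothetical:

```python
def system_unavailability(cut_sets, q):
    """Min-cut-set upper bound: 1 - prod_i(1 - Q_i), where Q_i is the
    product of basic-event unavailabilities in minimal cut set i."""
    u = 1.0
    for cs in cut_sets:
        q_i = 1.0
        for event in cs:
            q_i *= q[event]
        u *= 1.0 - q_i
    return 1.0 - u

# Hypothetical system: fails if component A fails, or if B and C both fail.
q = {"A": 1e-3, "B": 1e-2, "C": 5e-2}   # basic-event unavailabilities (illustrative)
cut_sets = [("A",), ("B", "C")]          # minimal cut sets of the Boolean expression
u = system_unavailability(cut_sets, q)
assert 1e-3 < u < 2e-3  # dominated by the single-event cut set {A}
```

For an uncertainty analysis in the spirit of the abstract, the basic-event unavailabilities `q` would themselves be sampled from distributions and this evaluation repeated, which is where a compact representation of the thousands of cut-set terms pays off.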
16.
Environmental tobacco smoke (ETS) is a major contributor to indoor human exposures to fine particulate matter of 2.5 μm or smaller (PM2.5). The Stochastic Human Exposure and Dose Simulation for Particulate Matter (SHEDS‐PM) Model developed by the U.S. Environmental Protection Agency estimates distributions of outdoor and indoor PM2.5 exposure for a specified population based on ambient concentrations and indoor emissions sources. A critical assessment was conducted of the methodology and data used in SHEDS‐PM for estimation of indoor exposure to ETS. For the residential microenvironment, SHEDS uses a mass‐balance approach, which is comparable to best practices. The default inputs in SHEDS‐PM were reviewed and more recent and extensive data sources were identified. Sensitivity analysis was used to determine which inputs should be prioritized for updating. Data regarding the proportion of smokers and “other smokers” and cigarette emission rate were found to be important. SHEDS‐PM does not currently account for in‐vehicle ETS exposure; however, in‐vehicle ETS‐related PM2.5 levels can exceed those in residential microenvironments by a factor of 10 or more. Therefore, a mass‐balance‐based methodology for estimating in‐vehicle ETS PM2.5 concentration is evaluated. Recommendations are made regarding updating of input data and algorithms related to ETS exposure in the SHEDS‐PM model. Interindividual variability for ETS exposure was quantified. Geographic variability in ETS exposure was quantified based on the varying prevalence of smokers in five selected locations in the United States.
17.
A Combined Monte Carlo and Possibilistic Approach to Uncertainty Propagation in Event Tree Analysis (Times cited: 1; self-citations: 0; citations by others: 1)
In risk analysis, the treatment of the epistemic uncertainty associated with the probability of occurrence of an event is fundamental. Traditionally, probabilistic distributions have been used to characterize the epistemic uncertainty due to imprecise knowledge of the parameters in risk models. On the other hand, it has been argued that in certain instances such uncertainty may be best accounted for by fuzzy or possibilistic distributions. This seems to be the case in particular for parameters for which the information available is scarce and of qualitative nature. In practice, it is to be expected that a risk model contains some parameters affected by uncertainties that may be best represented by probability distributions and some other parameters that may be more properly described in terms of fuzzy or possibilistic distributions. In this article, a hybrid method that jointly propagates probabilistic and possibilistic uncertainties is considered and compared with pure probabilistic and pure fuzzy methods for uncertainty propagation. The analyses are carried out on a case study concerning the uncertainties in the probabilities of occurrence of accident sequences in an event tree analysis of a nuclear power plant.
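One common way to realize the hybrid propagation described here is to sample the probabilistic inputs by Monte Carlo while carrying the possibilistic inputs as alpha-cut intervals. The sketch below uses a simplified scheme in which one alpha level is sampled per Monte Carlo iteration; the triangular possibility distribution and all numerical values are illustrative assumptions, not the paper's case-study data:

```python
import random

def triangular_alpha_cut(a, m, b, alpha):
    """Alpha-cut interval of a triangular possibility distribution (a, m, b)."""
    return (a + alpha * (m - a), b - alpha * (b - m))

rng = random.Random(4)
lows, highs = [], []
for _ in range(1000):                   # Monte Carlo loop over probabilistic inputs
    p1 = rng.lognormvariate(-7.0, 0.5)  # probabilistically described event probability
    alpha = rng.random()                # alpha level for the possibilistic input
    lo2, hi2 = triangular_alpha_cut(1e-4, 1e-3, 5e-3, alpha)  # fuzzy event probability
    lows.append(p1 * lo2)               # interval arithmetic on the sequence probability
    highs.append(p1 * hi2)

# Each iteration yields an interval [low, high] for the accident-sequence probability;
# the ensemble of intervals characterizes the mixed probabilistic/possibilistic uncertainty.
assert all(l <= h for l, h in zip(lows, highs))
```

A pure probabilistic treatment would collapse each interval to a point; a pure fuzzy treatment would replace the sampled `p1` by an interval as well, which is exactly the comparison the article carries out.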
18.
Adam M. Finkel, Risk Analysis, 1994, 14(5): 751-761
Advocates of quantitative uncertainty analysis (QUA) have invested substantial effort in explaining why uncertainty is a crucial aspect of risk and yet have devoted much less effort to explaining how QUA can improve the risk manager's performance. This paper develops a teaching example, using a personal decision problem with subtle parallels to societal risk management, to show how choices made with increasing appreciation of uncertainty are superior ones. In the hypothetical, five analysts explain the same uncertain prospect (whether to invest in a volatile stock issue), with increasing attention to the nuances of uncertainty. The path through these different perspectives on the decision demonstrates four general points applicable to environmental risk management: (1) Various point estimates with equal claim to being "best estimates" can differ markedly from each other and lead to diametrically different choices; (2) "conservatism" has both relative and absolute meanings, with different implications for decision-making; (3) both inattention to and fixation on "outliers" in the uncertainty distribution can lead the manager astray; and (4) the best QUA is one that helps discriminate among real options, that points to optimum pathways toward new information, and that spurs on the iterative search for new decision options that may outperform any of the initial ones offered.
19.
Benefit–cost analysis is widely used to evaluate alternative courses of action that are designed to achieve policy objectives. Although many analyses take uncertainty into account, they typically only consider uncertainty about cost estimates and physical states of the world, whereas uncertainty about individual preferences, thus the benefit of policy intervention, is ignored. Here, we propose a strategy to integrate individual uncertainty about preferences into benefit–cost analysis using societal preference intervals, which are ranges of values over which it is unclear whether society as a whole should accept or reject an option. To illustrate the method, we use preferences for implementing a smart grid technology to sustain critical electricity demand during a 24‐hour regional power blackout on a hot summer weekend. Preferences were elicited from a convenience sample of residents in Allegheny County, Pennsylvania. This illustrative example shows that uncertainty in individual preferences, when aggregated to form societal preference intervals, can substantially change society's decision. We conclude with a discussion of where preference uncertainty comes from, how it might be reduced, and why incorporating unresolved preference uncertainty into benefit–cost analyses can be important.
20.
Probabilistic seismic risk analysis is a well‐established method in the insurance industry for modeling portfolio losses from earthquake events. In this context, precise exposure locations are often unknown. However, location uncertainty has so far not been the focus of much research. In this article, we propose a novel framework for the treatment of location uncertainty. As a case study, a large number of synthetic portfolios resembling typical real‐world cases were created. We investigate the effect of portfolio characteristics such as value distribution, portfolio size, or proportion of risk items with unknown coordinates on the variability of loss frequency estimations. The results indicate that due to loss aggregation effects and spatial hazard variability, location uncertainty in isolation and in conjunction with ground motion uncertainty can induce significant variability in probabilistic loss results, especially for portfolios with a small number of risks. After quantifying its effect, we conclude that location uncertainty should not be neglected when assessing probabilistic seismic risk, but should be treated stochastically, and the resulting variability should be visualized and interpreted carefully.