Found 20 similar documents (search time: 15 ms)
1.
The Repeatability of Uncertainty and Sensitivity Analyses for Complex Probabilistic Risk Assessments (cited: 2, self: 0, others: 2)
The performance of a probabilistic risk assessment (PRA) for a nuclear power plant is a complex undertaking, involving the assembly of an accident frequency analysis, an accident progression analysis, a source term analysis, and a consequence analysis. Each of these analyses is, in itself, quite complex. Uncertainties enter into a PRA from each of these analyses. An important focus in recent PRAs has been to incorporate these uncertainties at each stage of the analysis, propagate the subsequent uncertainties through the entire analysis, and include uncertainty in the final results. Monte Carlo procedures based on Latin hypercube sampling provide one way to perform propagations of this type. In this paper, the results of two complete and independent Monte Carlo calculations for a recently completed PRA for a nuclear power plant are compared as a means of providing empirical evidence on the repeatability of uncertainty and sensitivity analyses for large-scale PRA calculations. These calculations use the same variables and analysis structure with two independently generated Latin hypercube samples. The results of the two calculations show a high degree of repeatability for the analysis of a very complex system.
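The replicated-sample design described above can be sketched in a few lines. The model below is a toy stand-in, not the PRA from the study, and `latin_hypercube` is a minimal illustrative implementation of the sampling scheme:

```python
import random
import statistics

def latin_hypercube(n, d, rng):
    """One LHS sample of size n in d dimensions on [0, 1)^d: each variable's
    range is split into n equal strata, one point is drawn per stratum, and
    strata are paired across variables by random permutation."""
    columns = []
    for _ in range(d):
        strata = [(i + rng.random()) / n for i in range(n)]
        rng.shuffle(strata)
        columns.append(strata)
    return [[columns[j][i] for j in range(d)] for i in range(n)]

def model(x):
    # Hypothetical response, standing in for a PRA consequence model.
    return 3.0 * x[0] + x[1] ** 2 + 0.5 * x[2]

def propagate(seed, n=200, d=3):
    rng = random.Random(seed)
    return [model(x) for x in latin_hypercube(n, d, rng)]

# Two complete, independent propagations of the same analysis structure,
# differing only in the independently generated Latin hypercube sample.
run_a = propagate(seed=1)
run_b = propagate(seed=2)
mean_a = statistics.mean(run_a)
mean_b = statistics.mean(run_b)
```

Because each input range is stratified, the two independently seeded runs give closely agreeing summary statistics, which is the repeatability property the paper examines empirically.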
2.
An Investigation of Uncertainty and Sensitivity Analysis Techniques for Computer Models (cited: 12, self: 0, others: 12)
Many different techniques have been proposed for performing uncertainty and sensitivity analyses on computer models for complex processes. The objective of the present study is to investigate the applicability of three widely used techniques to three computer models having large uncertainties and varying degrees of complexity in order to highlight some of the problem areas that must be addressed in actual applications. The following approaches to uncertainty and sensitivity analysis are considered: (1) response surface methodology based on input determined from a fractional factorial design; (2) Latin hypercube sampling with and without regression analysis; and (3) differential analysis. These techniques are investigated with respect to (1) ease of implementation, (2) flexibility, (3) estimation of the cumulative distribution function of the output, and (4) adaptability to different methods of sensitivity analysis. With respect to these criteria, the technique using Latin hypercube sampling and regression analysis had the best overall performance. The models used in the investigation are well documented, thus making it possible for researchers to make comparisons of other techniques with the results in this study.
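The sampling-plus-regression approach that the study favors reduces, for a linear model, to computing standardized regression coefficients (SRCs) from the sample. A minimal sketch, using a hypothetical two-input linear model and a hand-rolled two-variable regression (Cramer's rule on the normal equations):

```python
import random
import statistics

def sample_inputs(n, rng):
    # Crude two-dimensional LHS: stratify each input, pair by random shuffle.
    def column():
        vals = [(i + rng.random()) / n for i in range(n)]
        rng.shuffle(vals)
        return vals
    return column(), column()

def src_two_inputs(x1, x2, y):
    """Standardized regression coefficients for a two-input linear fit:
    SRC_j = b_j * sd(x_j) / sd(y), with b_j from the normal equations."""
    m1, m2, my = statistics.mean(x1), statistics.mean(x2), statistics.mean(y)
    s11 = sum((a - m1) ** 2 for a in x1)
    s22 = sum((b - m2) ** 2 for b in x2)
    s12 = sum((a - m1) * (b - m2) for a, b in zip(x1, x2))
    s1y = sum((a - m1) * (c - my) for a, c in zip(x1, y))
    s2y = sum((b - m2) * (c - my) for b, c in zip(x2, y))
    det = s11 * s22 - s12 ** 2
    b1 = (s22 * s1y - s12 * s2y) / det
    b2 = (s11 * s2y - s12 * s1y) / det
    sy = statistics.pstdev(y)
    return b1 * statistics.pstdev(x1) / sy, b2 * statistics.pstdev(x2) / sy

rng = random.Random(3)
x1, x2 = sample_inputs(500, rng)
y = [4.0 * a + b for a, b in zip(x1, x2)]  # hypothetical linear model
src1, src2 = src_two_inputs(x1, x2, y)
```

For this noise-free linear model the squared SRCs sum to roughly one, and the 4:1 coefficient ratio shows up directly in the SRC ranking.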
3.
Monte Carlo Techniques for Quantitative Uncertainty Analysis in Public Health Risk Assessments (cited: 2, self: 0, others: 2)
Most public health risk assessments assume and combine a series of average, conservative, and worst-case values to derive a conservative point estimate of risk. This procedure has major limitations. This paper demonstrates a new methodology for extended uncertainty analyses in public health risk assessments using Monte Carlo techniques. The extended method begins, as do some conventional methods, with the preparation of a spreadsheet to estimate exposure and risk. This method, however, continues by modeling key inputs as random variables described by probability density functions (PDFs). Overall, the technique provides a quantitative way to estimate the probability distributions for exposure and health risks within the validity of the model used. As an example, this paper presents a simplified case study for children playing in soils contaminated with benzene and benzo(a)pyrene (BaP).
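The move from a deterministic spreadsheet to a Monte Carlo one amounts to replacing each point value with a draw from a PDF. A minimal sketch of a soil-exposure dose calculation; every parameter value here is an illustrative placeholder, not a number from the paper:

```python
import math
import random

rng = random.Random(11)

def dose_sample():
    """One Monte Carlo draw of a child's average daily dose (mg/kg-day).
    All distributions and values are illustrative assumptions."""
    conc = rng.lognormvariate(math.log(10.0), 0.5)    # soil concentration, mg/kg
    intake = rng.lognormvariate(math.log(1e-4), 0.4)  # soil ingestion, kg/day
    bw = rng.lognormvariate(math.log(15.0), 0.2)      # body weight, kg
    return conc * intake / bw

n = 5000
doses = sorted(dose_sample() for _ in range(n))
p50 = doses[n // 2]
p95 = doses[int(0.95 * n)]
# Instead of a single conservative point estimate, the analysis yields a
# distribution; e.g., report the median and 95th percentile of dose.
```

The output distribution replaces the stacked-conservatism point estimate: a risk manager can read off any percentile rather than a single worst-case number.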
4.
Methods for Uncertainty Analysis: A Comparative Survey (cited: 1, self: 0, others: 1)
This paper presents a survey and comparative evaluation of methods which have been developed for the determination of uncertainties in accident consequences and probabilities, for use in probabilistic risk assessment. The methods considered are: analytic techniques, Monte Carlo simulation, response surface approaches, differential sensitivity techniques, and evaluation of classical statistical confidence bounds. It is concluded that only the response surface and differential sensitivity approaches are sufficiently general and flexible for use as overall methods of uncertainty analysis in probabilistic risk assessment. The other methods considered, however, are very useful in particular problems.
5.
A sequence of linear, monotonic, and nonmonotonic test problems is used to illustrate sampling-based uncertainty and sensitivity analysis procedures. Uncertainty results obtained with replicated random and Latin hypercube samples are compared, with the Latin hypercube samples tending to produce more stable results than the random samples. Sensitivity results obtained with the following procedures and/or measures are illustrated and compared: correlation coefficients (CCs), rank correlation coefficients (RCCs), common means (CMNs), common locations (CLs), common medians (CMDs), statistical independence (SI), standardized regression coefficients (SRCs), partial correlation coefficients (PCCs), standardized rank regression coefficients (SRRCs), partial rank correlation coefficients (PRCCs), stepwise regression analysis with raw and rank-transformed data, and examination of scatter plots. The effectiveness of a given procedure and/or measure depends on the characteristics of the individual test problems, with (1) linear measures (i.e., CCs, PCCs, SRCs) performing well on the linear test problems, (2) measures based on rank transforms (i.e., RCCs, PRCCs, SRRCs) performing well on the monotonic test problems, and (3) measures predicated on searches for nonrandom patterns (i.e., CMNs, CLs, CMDs, SI) performing well on the nonmonotonic test problems.
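The contrast between the linear measures (CCs) and the rank-transform measures (RCCs) can be demonstrated on a single monotonic-but-nonlinear test function. A minimal sketch with a hypothetical test problem y = exp(5x):

```python
import math
import random

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(vx * vy)

def ranks(xs):
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    for rank, i in enumerate(order):
        r[i] = float(rank)
    return r

def spearman(xs, ys):
    # Rank correlation: Pearson correlation of the rank-transformed data.
    return pearson(ranks(xs), ranks(ys))

rng = random.Random(5)
x = [rng.random() for _ in range(2000)]
y = [math.exp(5.0 * v) for v in x]  # monotonic but strongly nonlinear

cc = pearson(x, y)
rcc = spearman(x, y)
# The raw correlation understates the (perfect) monotonic dependence,
# while the rank correlation detects it exactly.
```

This is exactly the pattern the test problems exhibit: on monotonic responses the rank-based measure saturates at 1 while the raw linear measure does not.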
6.
Measures of sensitivity and uncertainty have become an integral part of risk analysis. Many such measures have a conditional probabilistic structure, for which a straightforward Monte Carlo estimation procedure has a double-loop form. Recently, a more efficient single-loop procedure has been introduced, and consistency of this procedure has been demonstrated separately for particular measures, such as those based on variance, density, and information value. In this work, we give a unified proof of single-loop consistency that applies to any measure satisfying a common rationale. This proof is not only more general but invokes less restrictive assumptions than heretofore in the literature, allowing for the presence of correlations among model inputs and of categorical variables. We examine numerical convergence of such an estimator under a variety of sensitivity measures. We also examine its application to a published medical case study.
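The double-loop form would draw a fresh conditional sample of Y for every value of X_i; the single-loop ("given data") form reuses one joint sample, binning it on X_i. A minimal sketch of a single-loop estimator for the variance-based first-order index, on a hypothetical linear model where the true S_1 is 4/5:

```python
import random
import statistics

def first_order_index(x, y, bins=20):
    """Single-loop ('given data') estimate of S_i = Var(E[Y|X_i]) / Var(Y):
    sort the one joint (X_i, Y) sample into equal-count bins of X_i and take
    the variance of the per-bin means of Y. No inner conditional loop."""
    pairs = sorted(zip(x, y))
    n = len(pairs)
    size = n // bins
    bin_means = [
        statistics.mean(yy for _, yy in pairs[b * size:(b + 1) * size])
        for b in range(bins)
    ]
    return statistics.pvariance(bin_means) / statistics.pvariance(y)

rng = random.Random(9)
n = 4000
x1 = [rng.random() for _ in range(n)]
x2 = [rng.random() for _ in range(n)]
y = [2.0 * a + b for a, b in zip(x1, x2)]  # hypothetical model; S_1 = 0.8

s1 = first_order_index(x1, y)
s2 = first_order_index(x2, y)
```

One sample of size n serves every input, which is the efficiency gain over the nested double loop; the paper's contribution is proving consistency of this kind of estimator in a unified way.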
7.
Variance-Based Importance Analysis Applied to a Complex Probabilistic Performance Assessment (cited: 1, self: 0, others: 1)
Randall D. Manteufel, Risk Analysis, 1996, 16(4):587-598
The most important input parameters in a complex probabilistic performance assessment are identified using a variance-based method and compared with those identified using a regression-based method. The variance-based method has the advantage of not requiring assumptions about the functional relationship between input and output parameters. However, it has the drawback of requiring heuristic assessments of threshold variance ratios above which a parameter is considered important, and it also requires numerous executions of the computer program, which may be computationally expensive. Both methods identified the same 5 most important parameters, and 7 of the same top 10, for a system having 195 inputs. Although no distinct advantage for the variance-based approach was identified, the ideas which motivate the new approach are sound and suggest new avenues for exploring the relationships between the inputs and the output of a complex system.
8.
J. C. Helton, D. R. Anderson, H.-N. Jow, M. G. Marietta, G. Basabilvazo, Risk Analysis, 1999, 19(5):959-986
The conceptual and computational structure of a performance assessment (PA) for the Waste Isolation Pilot Plant (WIPP) is described. Important parts of this structure are (1) maintenance of a separation between stochastic (i.e., aleatory) and subjective (i.e., epistemic) uncertainty, with stochastic uncertainty arising from the many possible disruptions that could occur over the 10,000-year regulatory period that applies to the WIPP, and subjective uncertainty arising from the imprecision with which many of the quantities required in the analysis are known, (2) use of Latin hypercube sampling to incorporate the effects of subjective uncertainty, (3) use of Monte Carlo (i.e., random) sampling to incorporate the effects of stochastic uncertainty, and (4) efficient use of the necessarily limited number of mechanistic calculations that can be performed to support the analysis. The WIPP is under development by the U.S. Department of Energy (DOE) for the geologic (i.e., deep underground) disposal of transuranic (TRU) waste, with the indicated PA supporting a Compliance Certification Application (CCA) by the DOE to the U.S. Environmental Protection Agency (EPA) in October 1996 for the necessary certifications for the WIPP to begin operation. The EPA certified the WIPP for the disposal of TRU waste in May 1998, with the result that the WIPP will be the first operational facility in the United States for the geologic disposal of radioactive waste.
9.
Measuring Uncertainty Importance: Investigation and Comparison of Alternative Approaches (cited: 2, self: 0, others: 2)
Emanuele Borgonovo, Risk Analysis, 2006, 26(5):1349-1361
Uncertainty importance measures are quantitative tools aiming at identifying the contribution of uncertain inputs to output uncertainty. Their application ranges from food safety (Frey & Patil (2002)) to hurricane losses (Iman et al. (2005a, 2005b)). The results and indications an analyst derives depend on the method selected for the study. In this work, we investigate the assumptions at the basis of various indicator families to discuss the information they convey to the analyst/decision maker. We start with nonparametric techniques, and then present variance-based methods. By means of an example we show that output variance does not always reflect a decision maker's state of knowledge of the inputs. We then examine the use of moment-independent approaches to global sensitivity analysis, i.e., techniques that look at the entire output distribution without a specific reference to its moments. Numerical results demonstrate that both moment-independent and variance-based indicators agree in identifying noninfluential parameters. However, differences in the ranking of the most relevant factors show that inputs that influence variance the most are not necessarily the ones that influence the output uncertainty distribution the most.
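A moment-independent measure of the kind discussed above compares the whole conditional output density against the marginal density. A crude histogram-based sketch of such a delta-style measure (this is an illustrative estimator on a toy model with a dummy input, not the paper's computations):

```python
import random

def density(vals, m):
    """Histogram density (cell probabilities) on [0, 1] with m equal cells."""
    p = [0.0] * m
    for v in vals:
        p[min(int(v * m), m - 1)] += 1.0 / len(vals)
    return p

def delta(x, y, x_bins=10, y_bins=10):
    """Crude histogram estimate of a moment-independent importance measure:
    delta_i = 0.5 * E_{X_i}[ L1 distance between f_Y and f_{Y|X_i} ],
    for x and y scaled to [0, 1]."""
    marginal = density(y, y_bins)
    pairs = sorted(zip(x, y))
    n = len(pairs)
    size = n // x_bins
    total = 0.0
    for b in range(x_bins):
        cond = density([yy for _, yy in pairs[b * size:(b + 1) * size]], y_bins)
        total += 0.5 * sum(abs(pc - pm) for pc, pm in zip(cond, marginal))
    return total / x_bins

rng = random.Random(17)
n = 5000
x1 = [rng.random() for _ in range(n)]
x2 = [rng.random() for _ in range(n)]
y = x1[:]  # output driven entirely by x1; x2 is a dummy input

d1 = delta(x1, y)
d2 = delta(x2, y)
```

As the abstract notes for noninfluential parameters, the measure pushes the dummy input toward zero while flagging the driving input, without referring to any particular output moment.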
10.
Nicolas Miconnet, Marie Cornu, Annie Beaufort, Laurent Rosso, Jean-Baptiste Denis, Risk Analysis, 2005, 25(1):39-48
The uncertainty associated with estimates should be taken into account in quantitative risk assessment. Each input's uncertainty can be characterized through a probabilistic distribution for use under Monte Carlo simulations. In this study, the sampling uncertainty associated with estimating a low proportion on the basis of a small sample size was considered. A common application in microbial risk assessment is the estimation of a prevalence, the proportion of contaminated food products, on the basis of few tested units. Three Bayesian approaches (based on beta(0, 0), beta(1/2, 1/2), and beta(1, 1)) and one frequentist approach (based on the frequentist confidence distribution) were compared and evaluated on the basis of simulations. For small samples, we demonstrated some differences between the four tested methods. We concluded that the best method depends on the true proportion of contaminated products, which is by definition unknown in common practice. When no prior information is available, we recommend the beta(1/2, 1/2) prior or the confidence distribution. To illustrate the importance of these differences, the four methods were used in an applied example. We performed two-dimensional Monte Carlo simulations to estimate the proportion of cold smoked salmon packs contaminated by Listeria monocytogenes, one dimension representing within-factory uncertainty, modeled by each of the four studied methods, and the other dimension representing variability between companies. 相似文献
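The three Bayesian options reduce to conjugate beta posteriors, so the sensitivity to the prior is easy to see for small counts. A minimal sketch with illustrative counts (k positives in n tested; these numbers are placeholders, not the paper's data):

```python
import random
import statistics

k, n = 1, 20  # e.g., one contaminated pack found among 20 tested (illustrative)

priors = {
    "beta(0,0)": (0.0, 0.0),      # improper; posterior mean equals k/n
    "beta(1/2,1/2)": (0.5, 0.5),  # Jeffreys prior
    "beta(1,1)": (1.0, 1.0),      # uniform prior
}
# Conjugacy: prior beta(a, b) + data (k, n) -> posterior beta(a + k, b + n - k).
posterior_mean = {name: (a + k) / (a + b + n) for name, (a, b) in priors.items()}

# For a two-dimensional Monte Carlo, the uncertainty dimension can draw
# prevalence values directly from the chosen posterior:
rng = random.Random(7)
draws = [rng.betavariate(0.5 + k, 0.5 + n - k) for _ in range(20000)]
```

With k = 1 and n = 20 the three posterior means (0.050, 0.071, 0.091) differ by almost a factor of two, which is exactly the small-sample sensitivity the study demonstrates.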
11.
Principles of Good Practice for the Use of Monte Carlo Techniques in Human Health and Ecological Risk Assessments (cited: 6, self: 1, others: 6)
We propose 14 principles of good practice to assist people in performing and reviewing probabilistic or Monte Carlo risk assessments, especially in the context of the federal and state statutes concerning chemicals in the environment. Monte Carlo risk assessments for hazardous waste sites that follow these principles will be easier to understand, will explicitly distinguish assumptions from data, and will consider and quantify effects that could otherwise lead to misinterpretation of the results. The proposed principles are neither mutually exclusive nor collectively exhaustive. We think and hope that these principles will evolve as new ideas arise and come into practice.
12.
Probabilistic seismic risk analysis is a well-established method in the insurance industry for modeling portfolio losses from earthquake events. In this context, precise exposure locations are often unknown. However, so far, location uncertainty has not been the focus of much research. In this article, we propose a novel framework for the treatment of location uncertainty. As a case study, a large number of synthetic portfolios resembling typical real-world cases were created. We investigate the effect of portfolio characteristics such as value distribution, portfolio size, or proportion of risk items with unknown coordinates on the variability of loss frequency estimations. The results indicate that due to loss aggregation effects and spatial hazard variability, location uncertainty in isolation and in conjunction with ground motion uncertainty can induce significant variability in probabilistic loss results, especially for portfolios with a small number of risks. After quantifying its effect, we conclude that location uncertainty should not be neglected when assessing probabilistic seismic risk, but should be treated stochastically, and the resulting variability should be visualized and interpreted carefully.
13.
Uncertainty Analysis in Multiplicative Models (cited: 3, self: 0, others: 3)
Wout Slob, Risk Analysis, 1994, 14(4):571-576
Uncertainties are usually evaluated by Monte Carlo analysis. However, multiplicative models with lognormal uncertainties, which are ubiquitous in risk assessments, allow for a simple and quick analytical uncertainty analysis. The necessary formulae are given, which may be evaluated by a desk calculator. Two examples illustrate the method.
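The analytical shortcut rests on the closure of lognormals under multiplication: the product has median equal to the product of the medians, and ln(GSD) of the product equals the root sum of squares of the factors' ln(GSD)s. A sketch comparing the desk-calculator formulae against Monte Carlo, with illustrative factor values:

```python
import math
import random

# Hypothetical multiplicative model: risk = A * B * C, each factor lognormal,
# specified by (median, geometric standard deviation); values are illustrative.
factors = [(2.0, 1.5), (5.0, 2.0), (0.3, 1.2)]

# Analytic result: the product is lognormal with median = product of medians
# and ln(GSD_Y) = sqrt(sum of ln(GSD_i)^2).
median_y = math.prod(m for m, _ in factors)
sigma_y = math.sqrt(sum(math.log(g) ** 2 for _, g in factors))
p95_y = median_y * math.exp(1.6449 * sigma_y)

# Monte Carlo check of the same model.
rng = random.Random(13)
n = 20000
sims = sorted(
    math.prod(rng.lognormvariate(math.log(m), math.log(g)) for m, g in factors)
    for _ in range(n)
)
mc_median = sims[n // 2]
mc_p95 = sims[int(0.95 * n)]
```

The two routes agree, but the analytic one needs only a handful of logarithms and one exponential, which is the paper's point.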
14.
Anne E. Smith, Risk Analysis, 2020, 40(3):442-449
Regulatory impact analyses (RIAs), required for new major federal regulations, are often criticized for not incorporating epistemic uncertainties into their quantitative estimates of benefits and costs. “Integrated uncertainty analysis,” which relies on subjective judgments about epistemic uncertainty to quantitatively combine epistemic and statistical uncertainties, is often prescribed. This article identifies an additional source for subjective judgment regarding a key epistemic uncertainty in RIAs for National Ambient Air Quality Standards (NAAQS)—the regulator's degree of confidence in continuation of the relationship between pollutant concentration and health effects at varying concentration levels. An illustrative example is provided based on the 2013 decision on the NAAQS for fine particulate matter (PM2.5). It shows how the regulator's justification for setting that NAAQS was structured around the regulator's subjective confidence in the continuation of health risks at different concentration levels, and it illustrates how such expressions of uncertainty might be directly incorporated into the risk reduction calculations used in the rule's RIA. The resulting confidence-weighted quantitative risk estimates are found to be substantially different from those in the RIA for that rule. This approach for accounting for an important source of subjective uncertainty also offers the advantage of establishing consistency between the scientific assumptions underlying RIA risk and benefit estimates and the science-based judgments developed when deciding on the relevant standards for important air pollutants such as PM2.5.
15.
Human health risk assessments use point values to develop risk estimates and thus impart a deterministic character to risk, which, by definition, is a probability phenomenon. The risk estimates are calculated based on individuals and then, using uncertainty factors (UFs), are extrapolated to the population that is characterized by variability. Regulatory agencies have recommended the quantification of the impact of variability in risk assessments through the application of probabilistic methods. In the present study, a framework that deals with the quantitative analysis of uncertainty (U) and variability (V) in target tissue dose in the population was developed by applying probabilistic analysis to physiologically-based toxicokinetic models. The mechanistic parameters that determine kinetics were described with probability density functions (PDFs). Since each PDF depicts the frequency of occurrence of all expected values of each parameter in the population, the combined effects of multiple sources of U/V were accounted for in the estimated distribution of tissue dose in the population, and a unified (adult and child) intraspecies toxicokinetic uncertainty factor UFH-TK was determined. The results show that the proposed framework accounts effectively for U/V in population toxicokinetics. The ratio of the 95th percentile to the 50th percentile of the annual average concentration of the chemical at the target tissue organ (i.e., the UFH-TK) varies with age. The ratio is equivalent to a unified intraspecies toxicokinetic UF, and it is one of the UFs by which the NOAEL can be divided to obtain the RfC/RfD. The 10-fold intraspecies UF is intended to account for uncertainty and variability in toxicokinetics (3.2x) and toxicodynamics (3.2x). This article deals exclusively with the toxicokinetic component of the UF.
The framework provides an alternative to the default methodology and is advantageous in that the evaluation of toxicokinetic variability is based on the distribution of the effective target tissue dose, rather than applied dose. It allows for the replacement of the default adult and children intraspecies UF with toxicokinetic data-derived values and provides accurate chemical-specific estimates for their magnitude. It shows that proper application of probability and toxicokinetic theories can reduce uncertainties when establishing exposure limits for specific compounds and provide better assurance that established limits are adequately protective. It contributes to the development of a probabilistic noncancer risk assessment framework and will ultimately lead to the unification of cancer and noncancer risk assessment methodologies.
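The central quantity above, the UFH-TK as a P95/P50 ratio of target tissue dose, can be sketched with a deliberately simplified one-parameter toxicokinetic model. The steady-state relation, the clearance distribution, and all numbers below are illustrative assumptions, not values from the study:

```python
import math
import random

# Sketch: population variability in target-tissue dose from a one-compartment
# steady-state relation dose_tissue = intake / clearance, with clearance
# lognormally distributed across the population (GSD = 1.6, illustrative).
rng = random.Random(21)
intake = 1.0  # constant external exposure (arbitrary units)
n = 20000
doses = sorted(intake / rng.lognormvariate(math.log(4.0), math.log(1.6))
               for _ in range(n))
p50 = doses[n // 2]
p95 = doses[int(0.95 * n)]
uf_h_tk = p95 / p50
# Data-derived toxicokinetic intraspecies factor, to be compared with the
# default toxicokinetic half of the 10-fold UF (about 3.2x).
```

Under these assumptions the data-derived factor comes out near 2.2, below the 3.2x default, which illustrates how distribution-based tissue-dose analysis can replace (and here reduce) the default factor.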
16.
Comparative risk projects can provide broad policy guidance but they rarely have adequate scientific foundations to support precise risk rankings. Many extant projects report rankings anyway, with limited attention to uncertainty. Stochastic uncertainty, structural uncertainty, and ignorance are types of incertitude that afflict risk comparisons. The recently completed New Jersey Comparative Risk Project was innovative in trying to acknowledge and accommodate some historically ignored uncertainties in a substantive manner. This article examines the methods used and lessons learned from the New Jersey project. Monte Carlo techniques were used to characterize stochastic uncertainty, and sensitivity analysis helped to manage structural uncertainty. A deliberative process and a sorting technique helped manage ignorance. Key findings are that stochastic rankings can be calculated but they reveal such an alarming degree of imprecision that the rankings are no longer useful, whereas sorting techniques are helpful in spite of uncertainty. A deliberative process is helpful to counter analytical overreaching.
17.
Uncertainty Analysis Based on Probability Bounds (P-Box) Approach in Probabilistic Safety Assessment (cited: 4, self: 0, others: 4)
Durga Rao Karanki, Hari Shankar Kushwaha, Ajit Kumar Verma, Srividya Ajit, Risk Analysis, 2009, 29(5):662-675
A wide range of uncertainties will be introduced inevitably during the process of performing a safety assessment of engineering systems. The impact of all these uncertainties must be addressed if the analysis is to serve as a tool in the decision-making process. Uncertainties present in the model components (input parameters or basic events) are propagated to quantify their impact on the final results. There are several methods available in the literature, namely, method of moments, discrete probability analysis, Monte Carlo simulation, fuzzy arithmetic, and Dempster-Shafer theory. The methods differ in how they characterize uncertainty at the component level and in how they propagate it to the system level. All these methods have different desirable and undesirable features, making them more or less useful in different situations. In the probabilistic framework, which is most widely used, a probability distribution is used to characterize uncertainty. However, in situations in which one cannot specify (1) parameter values for input distributions, (2) precise probability distributions (shape), and (3) dependencies between input parameters, these methods have limitations and are found to be not effective. In order to address some of these limitations, the article presents uncertainty analysis in the context of level-1 probabilistic safety assessment (PSA) based on a probability bounds (PB) approach. PB analysis combines probability theory and interval arithmetic to produce probability boxes (p-boxes), structures that allow comprehensive and rigorous propagation through the calculation. A practical case study is also carried out with the developed code based on the PB approach and compared with the two-phase Monte Carlo simulation results.
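A p-box can be represented discretely as a set of focal intervals, each carrying equal mass; combining probability theory with interval arithmetic then amounts to interval convolution of these sets. A minimal sketch for the sum of two independent quantities (an illustrative toy, not the paper's PSA code):

```python
def pbox_from_bounds(lo_quantiles, hi_quantiles):
    """A p-box as m focal intervals, each carrying mass 1/m."""
    return list(zip(lo_quantiles, hi_quantiles))

def add_independent(p, q):
    """Interval convolution for the sum of two independent quantities:
    Cartesian sums of focal intervals, each with mass 1/(m_p * m_q)."""
    return [(a + c, b + d) for (a, b) in p for (c, d) in q]

def cdf_bounds(pbox, y):
    """Lower/upper CDF bounds at y: fractions of focal intervals lying
    entirely at or below y / starting at or below y."""
    m = len(pbox)
    lower = sum(1 for _, hi in pbox if hi <= y) / m
    upper = sum(1 for lo, _ in pbox if lo <= y) / m
    return lower, upper

m = 20
# Parameter A: precisely known uniform(0, 1); its focal "intervals" are points.
mids = [(i + 0.5) / m for i in range(m)]
box_a = pbox_from_bounds(mids, mids)
# Parameter B: only known to lie in [1, 2], distribution shape unspecified,
# so every focal interval spans the whole range.
box_b = pbox_from_bounds([1.0] * m, [2.0] * m)

box_sum = add_independent(box_a, box_b)
lo, up = cdf_bounds(box_sum, 2.0)
# The envelope [lo, up] brackets every CDF consistent with the inputs.
```

For a precisely known input the two bounds coincide with the ordinary CDF; for the imprecise input they spread apart, and the spread is carried rigorously through the sum rather than being averaged away.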
18.
Projecting losses associated with hurricanes is a complex and difficult undertaking that is wrought with uncertainties. Hurricane Charley, which struck southwest Florida on August 13, 2004, illustrates the uncertainty of forecasting damages from these storms. Due to shifts in the track and the rapid intensification of the storm, real-time estimates grew from 2 to 3 billion dollars in losses late on August 12 to a peak of 50 billion dollars for a brief time as the storm appeared to be headed for the Tampa Bay area. The storm hit the resort areas of Charlotte Harbor near Punta Gorda and then went on to Orlando in the central part of the state, with early poststorm estimates converging on a damage estimate in the 28 to 31 billion dollars range. Comparable damage to central Florida had not been seen since Hurricane Donna in 1960. The Florida Commission on Hurricane Loss Projection Methodology (FCHLPM) has recognized the role of computer models in projecting losses from hurricanes. The FCHLPM established a professional team to perform onsite (confidential) audits of computer models developed by several different companies in the United States that seek to have their models approved for use in insurance rate filings in Florida. The team's members represent the fields of actuarial science, computer science, meteorology, statistics, and wind and structural engineering. An important part of the auditing process requires uncertainty and sensitivity analyses to be performed with the applicant's proprietary model. To influence future such analyses, an uncertainty and sensitivity analysis has been completed for loss projections arising from use of a Holland B parameter hurricane wind field model. Uncertainty analysis quantifies the expected percentage reduction in the uncertainty of wind speed and loss that is attributable to each of the input variables.
19.
20.
A Combined Monte Carlo and Possibilistic Approach to Uncertainty Propagation in Event Tree Analysis (cited: 1, self: 0, others: 1)
In risk analysis, the treatment of the epistemic uncertainty associated to the probability of occurrence of an event is fundamental. Traditionally, probabilistic distributions have been used to characterize the epistemic uncertainty due to imprecise knowledge of the parameters in risk models. On the other hand, it has been argued that in certain instances such uncertainty may be best accounted for by fuzzy or possibilistic distributions. This seems the case in particular for parameters for which the information available is scarce and of qualitative nature. In practice, it is to be expected that a risk model contains some parameters affected by uncertainties that may be best represented by probability distributions and some other parameters that may be more properly described in terms of fuzzy or possibilistic distributions. In this article, a hybrid method that jointly propagates probabilistic and possibilistic uncertainties is considered and compared with pure probabilistic and pure fuzzy methods for uncertainty propagation. The analyses are carried out on a case study concerning the uncertainties in the probabilities of occurrence of accident sequences in an event tree analysis of a nuclear power plant.
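The hybrid propagation described above can be sketched with one probabilistic and one possibilistic input: an outer Monte Carlo loop samples the probabilistic parameter, and for each draw the fuzzy parameter is pushed through the model by interval arithmetic on its alpha-cuts. The model f(p, q) = p * q, the lognormal choice, and the triangular possibility distribution are all illustrative assumptions:

```python
import random

def alpha_cut(tri, alpha):
    """Alpha-cut [lo, hi] of a triangular possibility distribution (a, b, c)."""
    a, b, c = tri
    return (a + alpha * (b - a), c - alpha * (c - b))

def hybrid_propagate(n_mc, alphas, rng):
    """Hybrid propagation for f(p, q) = p * q: outer Monte Carlo loop over the
    probabilistic input p, inner interval propagation of the possibilistic
    input q at each alpha level. Since f is increasing in q for p > 0, the
    image of an alpha-cut is the interval between f at the cut's endpoints."""
    q_fuzzy = (0.5, 1.0, 2.0)  # triangular possibility for q (illustrative)
    results = []
    for _ in range(n_mc):
        p = rng.lognormvariate(0.0, 0.3)
        cuts = {}
        for alpha in alphas:
            q_lo, q_hi = alpha_cut(q_fuzzy, alpha)
            cuts[alpha] = (p * q_lo, p * q_hi)
        results.append(cuts)
    return results

rng = random.Random(2)
samples = hybrid_propagate(1000, (0.0, 0.5, 1.0), rng)
# Each Monte Carlo realization carries a nested family of intervals (a fuzzy
# number); summarizing the interval endpoints across realizations yields
# belief/plausibility-style bounds on the output rather than a single
# probability distribution.
```

The nesting of the alpha-cuts (the alpha = 1 cut sits inside the alpha = 0 cut) is preserved through the model, which is what lets the random fuzzy outputs be summarized consistently.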
In risk analysis, the treatment of the epistemic uncertainty associated to the probability of occurrence of an event is fundamental. Traditionally, probabilistic distributions have been used to characterize the epistemic uncertainty due to imprecise knowledge of the parameters in risk models. On the other hand, it has been argued that in certain instances such uncertainty may be best accounted for by fuzzy or possibilistic distributions. This seems the case in particular for parameters for which the information available is scarce and of qualitative nature. In practice, it is to be expected that a risk model contains some parameters affected by uncertainties that may be best represented by probability distributions and some other parameters that may be more properly described in terms of fuzzy or possibilistic distributions. In this article, a hybrid method that jointly propagates probabilistic and possibilistic uncertainties is considered and compared with pure probabilistic and pure fuzzy methods for uncertainty propagation. The analyses are carried out on a case study concerning the uncertainties in the probabilities of occurrence of accident sequences in an event tree analysis of a nuclear power plant. 相似文献