Similar Documents (20 results)
1.
Estimation of uncertainties associated with model predictions is an important component of the application of environmental and biological models. "Traditional" methods for propagating uncertainty, such as standard Monte Carlo and Latin Hypercube Sampling, however, often require performing a prohibitive number of model simulations, especially for complex, computationally intensive models. Here, a computationally efficient method for uncertainty propagation, the Stochastic Response Surface Method (SRSM), is coupled with the Automatic Differentiation of FORTRAN (ADIFOR) method. The SRSM is based on series expansions of model inputs and outputs in terms of a set of "well-behaved" standard random variables. ADIFOR is used to transform the model code into one that calculates the derivatives of the model outputs with respect to inputs or transformed inputs. The calculated model outputs and derivatives at a set of sample points are used to approximate the unknown coefficients in the series expansions of the outputs. A framework for the coupling of the SRSM and ADIFOR is developed and presented here. Two case studies are presented, involving (1) a physiologically based pharmacokinetic model for perchloroethylene in humans, and (2) an atmospheric photochemical model, the Reactive Plume Model. The results obtained agree closely with those of traditional Monte Carlo and Latin hypercube sampling methods, while reducing the required number of model simulations by about two orders of magnitude.
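To make the SRSM/ADIFOR coupling concrete, the sketch below uses a toy one-input model, a second-order Hermite expansion, and central finite differences standing in for ADIFOR-generated derivative code; all of these choices are illustrative assumptions rather than the models or expansion orders used in the study. The expansion coefficients are fit from model values and derivatives at a few collocation points, and uncertainty is then propagated through the inexpensive surrogate.

```python
# Minimal sketch of SRSM-style uncertainty propagation with derivative information.
# Assumptions: a toy one-input model, a second-order Hermite expansion, and central
# finite differences standing in for ADIFOR-generated derivative code.
import numpy as np

def model(x):
    # Toy scalar model standing in for a complex simulation code.
    return np.log(1.0 + x) + 0.1 * x**2

def model_derivative(x, h=1e-6):
    # Stand-in for an automatically differentiated code: df/dx by central differences.
    return (model(x + h) - model(x - h)) / (2.0 * h)

# Uncertain input: lognormal, expressed through a standard normal variable xi.
mu_ln, sigma_ln = 0.5, 0.3

def x_of_xi(xi):
    return np.exp(mu_ln + sigma_ln * xi)

# Output expansion y ~ a0 + a1*xi + a2*(xi**2 - 1) (Hermite basis, 2nd order).
def basis(xi):
    return np.array([1.0, xi, xi**2 - 1.0])

def basis_dxi(xi):
    return np.array([0.0, 1.0, 2.0 * xi])

# A few collocation points supply both model values and derivatives, so fewer
# model runs are needed than if values alone were fit.
rows, rhs = [], []
for xi in (-1.5, 0.0, 1.5):
    x = x_of_xi(xi)
    dx_dxi = sigma_ln * x                     # chain rule for the input transform
    rows.append(basis(xi));     rhs.append(model(x))
    rows.append(basis_dxi(xi)); rhs.append(model_derivative(x) * dx_dxi)
coeffs, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)

# Propagate with the inexpensive surrogate and compare to brute-force Monte Carlo.
xi_mc = np.random.default_rng(0).standard_normal(100_000)
y_srsm = coeffs[0] + coeffs[1] * xi_mc + coeffs[2] * (xi_mc**2 - 1.0)
y_mc = model(x_of_xi(xi_mc))
print("SRSM surrogate mean/std:", y_srsm.mean(), y_srsm.std())
print("Monte Carlo    mean/std:", y_mc.mean(), y_mc.std())
```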

2.
Many different techniques have been proposed for performing uncertainty and sensitivity analyses on computer models for complex processes. The objective of the present study is to investigate the applicability of three widely used techniques to three computer models having large uncertainties and varying degrees of complexity in order to highlight some of the problem areas that must be addressed in actual applications. The following approaches to uncertainty and sensitivity analysis are considered: (1) response surface methodology based on input determined from a fractional factorial design; (2) Latin hypercube sampling with and without regression analysis; and (3) differential analysis. These techniques are investigated with respect to (1) ease of implementation, (2) flexibility, (3) estimation of the cumulative distribution function of the output, and (4) adaptability to different methods of sensitivity analysis. With respect to these criteria, the technique using Latin hypercube sampling and regression analysis had the best overall performance. The models used in the investigation are well documented, thus making it possible for researchers to make comparisons of other techniques with the results in this study.
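A minimal sketch of the second approach (Latin hypercube sampling followed by regression-based sensitivity analysis) is given below, assuming a toy three-input model, uniform input ranges, and SciPy's quasi-Monte Carlo module for the design; none of these correspond to the actual models of the study.

```python
# Minimal sketch of Latin hypercube sampling plus regression-based sensitivity analysis.
# Assumptions: a toy three-input model, uniform input ranges, and SciPy's
# quasi-Monte Carlo module for the design.
import numpy as np
from scipy.stats import qmc

def model(x1, x2, x3):
    # Toy model standing in for a complex simulation code.
    return 2.0 * x1 + 0.5 * x2**2 + 0.1 * x1 * x3

bounds = np.array([[0.0, 1.0], [0.0, 2.0], [0.0, 5.0]])    # input ranges
n = 100
design = qmc.LatinHypercube(d=3, seed=1).random(n)          # stratified [0,1)^3 sample
x = qmc.scale(design, bounds[:, 0], bounds[:, 1])
y = model(x[:, 0], x[:, 1], x[:, 2])

# Empirical CDF of the output, directly from the LHS results.
y_sorted, cdf = np.sort(y), np.arange(1, n + 1) / n

# Standardized regression coefficients as sensitivity measures.
X = (x - x.mean(axis=0)) / x.std(axis=0)
Y = (y - y.mean()) / y.std()
src, *_ = np.linalg.lstsq(X, Y, rcond=None)
print("standardized regression coefficients:", src)
print("median output (from the empirical CDF):", y_sorted[np.searchsorted(cdf, 0.5)])
```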

3.
Comprehensive uncertainty analyses of complex models of environmental and biological systems are essential but often not feasible due to the computational resources they require. "Traditional" methods, such as standard Monte Carlo and Latin Hypercube Sampling, for propagating uncertainty and developing probability densities of model outputs, may in fact require performing a prohibitive number of model simulations. An alternative is offered, for a wide range of problems, by the computationally efficient "Stochastic Response Surface Methods (SRSMs)" for uncertainty propagation. These methods extend the classical response surface methodology to systems with stochastic inputs and outputs. This is accomplished by approximating both inputs and outputs of the uncertain system through stochastic series of "well-behaved" standard random variables; the series expansions of the outputs contain unknown coefficients which are calculated by a method that uses the results of a limited number of model simulations. Two case studies are presented here involving (a) a physiologically-based pharmacokinetic (PBPK) model for perchloroethylene (PERC) for humans, and (b) an atmospheric photochemical model, the Reactive Plume Model (RPM-IV). The results obtained agree closely with those of traditional Monte Carlo and Latin Hypercube Sampling methods, while significantly reducing the required number of model simulations.

4.
The conceptual and computational structure of a performance assessment (PA) for the Waste Isolation Pilot Plant (WIPP) is described. Important parts of this structure are (1) maintenance of a separation between stochastic (i.e., aleatory) and subjective (i.e., epistemic) uncertainty, with stochastic uncertainty arising from the many possible disruptions that could occur over the 10,000-year regulatory period that applies to the WIPP, and subjective uncertainty arising from the imprecision with which many of the quantities required in the analysis are known, (2) use of Latin hypercube sampling to incorporate the effects of subjective uncertainty, (3) use of Monte Carlo (i.e., random) sampling to incorporate the effects of stochastic uncertainty, and (4) efficient use of the necessarily limited number of mechanistic calculations that can be performed to support the analysis. The WIPP is under development by the U.S. Department of Energy (DOE) for the geologic (i.e., deep underground) disposal of transuranic (TRU) waste, with the indicated PA supporting a Compliance Certification Application (CCA) by the DOE to the U.S. Environmental Protection Agency (EPA) in October 1996 for the necessary certifications for the WIPP to begin operation. The EPA certified the WIPP for the disposal of TRU waste in May 1998, with the result that the WIPP will be the first operational facility in the United States for the geologic disposal of radioactive waste.

5.
A general discussion of knowledge dependence in risk calculations shows that the assumption of independence underlying standard Monte Carlo simulation in uncertainty analysis is frequently violated. A model is presented for performing Monte Carlo simulation when the variabilities of the component failure probabilities are either negatively or positively coupled. The model is applied to examples in human reliability analysis and the results are compared to the results of Sandia Laboratories as published in the Peer Review Study and to recalculations using more recent methods of uncertainty analysis.
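The coupling idea can be sketched as follows, assuming two lognormally distributed component failure probabilities, a parallel (both-must-fail) system, and a Gaussian copula to impose positive or negative dependence; the copula is an illustrative device, not necessarily the coupling model proposed in the article.

```python
# Minimal sketch of Monte Carlo simulation with coupled component failure probabilities.
# Assumptions: two lognormally distributed failure probabilities, a parallel
# (both-must-fail) system, and a Gaussian copula to impose the coupling.
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
mu, sigma = np.log(1e-3), 0.8             # lognormal parameters for both components

def mean_system_failure(rho):
    # Correlated standard normals via a Gaussian copula with correlation rho.
    z1 = rng.standard_normal(n)
    z2 = rho * z1 + np.sqrt(1.0 - rho**2) * rng.standard_normal(n)
    p1 = np.exp(mu + sigma * z1)          # component failure probabilities
    p2 = np.exp(mu + sigma * z2)
    return (p1 * p2).mean()               # parallel system: both components fail

for rho in (-0.8, 0.0, 0.8):
    print(f"rho = {rho:+.1f}:  mean system failure probability = {mean_system_failure(rho):.2e}")
```

With positive coupling the mean system failure probability exceeds the value obtained under the usual independence assumption, and with negative coupling it falls below it, which is the effect the article's model is designed to capture.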

6.
This paper presents the results of a study that identified how often a probabilistic risk assessment (PRA) should be updated to accommodate the changes that take place at nuclear power plants. Based on a 7-year analysis of design and procedural changes at one plant, we consider 5 years to be the maximum interval for updating PRAs. This conclusion is preliminary because it is based on the review of changes that occurred at a single plant, and it addresses only PRAs that involve a Level 1 analysis (i.e., a PRA limited to the calculation of core damage frequency). Nevertheless, this conclusion indicates that maintaining a useful PRA requires periodic updating efforts. The need for periodic updates stems only partly from the number of changes that can be expected to take place at nuclear power plants; these changes individually have only a moderate to minor impact on the PRA, but their combined impact is substantial and necessitates an update. Additionally, a comparison of two generations of PRAs performed about 5 years apart indicates that PRAs must be periodically updated to reflect the evolution of PRA methods. The most desirable updating interval depends on these two technical considerations as well as the cost of updating the PRA. (Cost considerations, however, were beyond the scope of this study.)

7.
Methods to Approximate Joint Uncertainty and Variability in Risk
As interest in quantitative analysis of joint uncertainty and interindividual variability (JUV) in risk grows, so does the need for related computational shortcuts. To quantify JUV in risk, Monte Carlo methods typically require nested sampling of JUV in distributed inputs, which is cumbersome and time-consuming. Two approximation methods proposed here allow simpler and more rapid analysis. The first consists of new upper-bound JUV estimators that involve only uncertainty or variability, not both, and so never require nested sampling to calculate. The second is a discrete-probability-calculus procedure that uses only the mean and one upper-tail mean for each input in order to estimate mean and upper-bound risk; this procedure is simpler and more intuitive than similar ones in use. Application of these methods is illustrated in an assessment of cancer risk from residential exposures to chloroform in the Kanawha Valley, West Virginia. Because each of the multiple exposure pathways considered in this assessment had separate modeled sources of uncertainty and variability, the assessment illustrates a realistic case where a standard Monte Carlo approach to JUV analysis requires nested sampling. In the illustration, the first proposed method quantified JUV in cancer risk much more efficiently than corresponding nested Monte Carlo calculations. The second proposed method also nearly duplicated JUV-related and other estimates of risk obtained using Monte Carlo methods. Both methods were thus found adequate to obtain basic risk estimates accounting for JUV in a realistically complex risk assessment. These methods make routine JUV analysis more convenient and practical.
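For reference, the nested ("two-dimensional") Monte Carlo scheme that the proposed shortcuts are meant to avoid looks roughly like the sketch below, with a hypothetical one-pathway dose model, lognormal interindividual variability, and a normally distributed uncertain potency factor as stand-ins for the assessment's inputs.

```python
# Minimal sketch of nested ("two-dimensional") Monte Carlo for joint uncertainty
# and variability (JUV).
# Assumptions: a hypothetical one-pathway dose model, lognormal person-to-person
# variability in intake, and a normally distributed uncertain potency factor.
import numpy as np

rng = np.random.default_rng(3)
n_unc, n_var = 200, 2000                  # outer (uncertainty) x inner (variability)

p95_risks = []
for _ in range(n_unc):
    potency = rng.normal(1e-2, 2e-3)      # epistemic: one draw per outer iteration
    intake = rng.lognormal(mean=0.0, sigma=0.6, size=n_var)  # interindividual variability
    risk = potency * intake
    p95_risks.append(np.percentile(risk, 95))   # 95th-percentile individual's risk

# Epistemic uncertainty about that variability percentile.
lo, hi = np.percentile(p95_risks, [5, 95])
print(f"90% uncertainty interval on the 95th-percentile risk: [{lo:.2e}, {hi:.2e}]")
```

The cost grows as the product of the two sample sizes and multiplies again with each additional exposure pathway, which is why the article's non-nested upper-bound estimators are attractive.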

8.
The elements of societal risk from a nuclear power plant accident are clearly illustrated by the Fukushima accident: land contamination, long-term relocation of large numbers of people, loss of productive farm area, loss of industrial production, and significant loss of electric capacity. NUREG-1150 and other studies have provided compelling evidence that the individual health risk of nuclear power plant accidents is effectively negligible relative to other comparable risks, even for people living in close proximity to a plant. The objective of this study is to compare the societal risk of nuclear power plant accidents to that of other events to which the public is exposed. We have characterized the monetized societal risk in the United States from major societally disruptive events, such as hurricanes, in the form of a complementary cumulative distribution function. These risks are compared with nuclear power plant risks, based on NUREG-1150 analyses and new MACCS code calculations to account for differences in source terms determined in the more recent SOARCA study. A candidate quantitative societal objective is discussed for potential adoption by the NRC. The results are also interpreted with regard to the acceptability of nuclear power as a major source of future energy supply.

9.
In the nuclear power industry, Level 3 probabilistic risk assessment (PRA) is used to estimate damage to public health and the environment if a severe accident leads to a large radiological release. Current Level 3 PRA does not explicitly include social factors and, therefore, it is not possible to perform importance ranking of social factors for risk-informing emergency preparedness, planning, and response (EPPR). This article offers a methodology for adapting the concept of social vulnerability, commonly used in natural hazard research, in the context of a severe nuclear power plant accident. The methodology has four steps: (1) calculating a hazard-independent social vulnerability index for the local population; (2) developing a location-specific representation of the maximum radiological hazard estimated from current Level 3 PRA, in a geographic information system (GIS) environment; (3) developing a GIS-based socio-technical risk map by combining the social vulnerability index and the location-specific radiological hazard; and (4) conducting a risk importance measure analysis to rank the criticality of social factors based on their contribution to the socio-technical risk. The methodology is applied using results from the 2012 Surry Power Station state-of-the-art reactor consequence analysis. A radiological hazard model is generated from the MELCOR accident consequence code system, translated into a GIS environment, and combined with the Centers for Disease Control and Prevention (CDC) social vulnerability index (SVI). This research creates an opportunity to explicitly consider and rank the criticality of location-specific SVI themes based on their influence on risk, providing input for EPPR.

10.
Risk Analysis, 2018, 38(8): 1576-1584
Fault trees are used in reliability modeling to create logical models of fault combinations that can lead to undesirable events. The output of a fault tree analysis (the top event probability) is expressed in terms of the failure probabilities of basic events that are input to the model. Typically, the basic event probabilities are not known exactly, but are modeled as probability distributions; therefore, the top event probability is also represented as an uncertainty distribution. Monte Carlo methods are generally used for evaluating the uncertainty distribution, but such calculations are computationally intensive and do not readily reveal the dominant contributors to the uncertainty. In this article, a closed-form approximation for the fault tree top event uncertainty distribution is developed, which is applicable when the uncertainties in the basic events of the model are lognormally distributed. The results of the approximate method are compared with results from two sampling-based methods: namely, the Monte Carlo method and the Wilks method based on order statistics. It is shown that the closed-form expression can provide a reasonable approximation to results obtained by Monte Carlo sampling, without incurring the computational expense. The Wilks method is found to be a useful means of providing an upper bound for the percentiles of the uncertainty distribution while being computationally inexpensive compared with full Monte Carlo sampling. The lognormal approximation method and the Wilks method appear to be attractive, practical alternatives for the evaluation of uncertainty in the output of fault trees and similar multilinear models.
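The flavor of the comparison can be illustrated with a generic moment-matching lognormal approximation (not necessarily the exact closed form derived in the article), Monte Carlo sampling, and a standard Wilks 95/95 bound, for a hypothetical two-cut-set fault tree with independent lognormal basic events.

```python
# Minimal sketch: moment-matching lognormal approximation for a fault tree top
# event vs. Monte Carlo sampling and a Wilks-style percentile bound.
# Assumptions: a hypothetical two-cut-set tree (C1 = {A, B}, C2 = {C}),
# independent lognormal basic events, and a generic moment-matching scheme
# (not necessarily the exact closed form derived in the article).
import numpy as np

rng = np.random.default_rng(4)

def ln_params(median, error_factor):
    # Lognormal parameters from the PRA-style median / 95th-percentile error factor.
    return np.log(median), np.log(error_factor) / 1.645

params = {"A": ln_params(1e-3, 3), "B": ln_params(5e-3, 3), "C": ln_params(1e-6, 10)}

def lognormal_mean_var(mu, s):
    m = np.exp(mu + s**2 / 2)
    return m, (np.exp(s**2) - 1.0) * m**2

# Rare-event approximation: Q_top ~ Q_A*Q_B + Q_C.  A product of independent
# lognormals is lognormal, so each cut set is lognormal; match the mean and
# variance of their sum with a single lognormal.
m1, v1 = lognormal_mean_var(params["A"][0] + params["B"][0],
                            np.hypot(params["A"][1], params["B"][1]))
m2, v2 = lognormal_mean_var(*params["C"])
m, v = m1 + m2, v1 + v2
s_top = np.sqrt(np.log(1.0 + v / m**2))
mu_top = np.log(m) - s_top**2 / 2
print("approx. 95th percentile :", np.exp(mu_top + 1.645 * s_top))

# Monte Carlo reference.
n = 100_000
q = {name: rng.lognormal(mu, s, n) for name, (mu, s) in params.items()}
top = q["A"] * q["B"] + q["C"]
print("Monte Carlo 95th pct.   :", np.percentile(top, 95))

# Wilks bound: with 59 random samples, the sample maximum is a 95%-confidence
# upper bound on the 95th percentile (since 1 - 0.95**59 > 0.95).
print("Wilks 95/95 upper bound :", top[:59].max())
```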

11.
Treatment of Uncertainty in Performance Assessments for Complex Systems
When viewed at a high level, performance assessments (PAs) for complex systems involve two types of uncertainty: stochastic uncertainty, which arises because the system under study can behave in many different ways, and subjective uncertainty, which arises from a lack of knowledge about quantities required within the computational implementation of the PA. Stochastic uncertainty is typically incorporated into a PA with an experimental design based on importance sampling and leads to the final results of the PA being expressed as a complementary cumulative distribution function (CCDF). Subjective uncertainty is usually treated with Monte Carlo techniques and leads to a distribution of CCDFs. This presentation discusses the use of the Kaplan/Garrick ordered triple representation for risk in maintaining a distinction between stochastic and subjective uncertainty in PAs for complex systems. The topics discussed include (1) the definition of scenarios and the calculation of scenario probabilities and consequences, (2) the separation of subjective and stochastic uncertainties, (3) the construction of CCDFs required in comparisons with regulatory standards (e.g., 40 CFR Part 191, Subpart B for the disposal of radioactive waste), and (4) the performance of uncertainty and sensitivity studies. Results obtained in a preliminary PA for the Waste Isolation Pilot Plant, an uncertainty and sensitivity analysis of the MACCS reactor accident consequence analysis model, and the NUREG-1150 probabilistic risk assessments are used for illustration.
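As a small illustration of point (3), the sketch below builds a CCDF from hypothetical Kaplan/Garrick (scenario, probability, consequence) triples for a single epistemic sample; in a full PA this calculation would sit inside a Latin hypercube loop over subjective uncertainty, producing a distribution of CCDFs.

```python
# Minimal sketch: building a complementary cumulative distribution function
# (CCDF) from Kaplan/Garrick-style (scenario, probability, consequence) triples
# for a single epistemic sample.
# Assumptions: illustrative scenario probabilities and consequence values; a
# full PA would nest this inside a Latin hypercube loop over subjective
# uncertainty, yielding a distribution of CCDFs.
import numpy as np

# (scenario, probability over the regulatory period, consequence measure)
triples = [("undisturbed performance", 0.90, 0.001),
           ("single intrusion",        0.08, 0.05),
           ("double intrusion",        0.02, 0.30)]

prob = np.array([p for _, p, _ in triples])
cons = np.array([c for _, _, c in triples])

def ccdf(threshold):
    # Probability that the consequence exceeds the threshold.
    return prob[cons > threshold].sum()

for c in (0.0, 0.01, 0.1, 0.5):
    print(f"P(consequence > {c:<4}) = {ccdf(c):.3f}")
```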

12.
Significant research work has been completed in the development of risk-based inservice inspection (ISI) and testing (IST) technology for nuclear power plant applications through the ASME Center For Research and Technology Development. This paper provides technology that has been developed for these engineering applications. The technology includes risk-based ranking methods, beginning with the use of plant probabilistic risk assessment (PRA), for the determination of risk-significant and less risk-significant components for inspection and the determination of similar populations for pumps and valves for inservice testing. Decision analysis methods are outlined for developing ISI and IST programs. This methodology integrates nondestructive examination data, structural reliability/risk assessment results, PRA results, failure data, and expert opinion to evaluate the effectiveness of ISI programs. Similarly, decision analysis uses the output of failure mode and causes analysis in combination with data, expert opinion, and PRA results to evaluate the effectiveness of IST programs. Results of pilot applications of these ASME methods to actual nuclear plant systems and components are summarized. The results of this work are already being used to develop recommended changes in ISI and IST requirements by the ASME Section XI and the ASME Operation and Maintenance Code organizations. A perspective on Code and regulatory adoption is also outlined. Finally, the potential benefits to the nuclear industry in terms of safety, person-rem exposure, and costs are summarized.

13.
A Systematic Uncertainty Analysis of an Evaluative Fate and Exposure Model
Multimedia fate and exposure models are widely used to regulate the release of toxic chemicals, to set cleanup standards for contaminated sites, and to evaluate emissions in life-cycle assessment. CalTOX, one of these models, is used to calculate the potential dose, an outcome that is combined with the toxicity of the chemical to determine the Human Toxicity Potential (HTP), used to aggregate and compare emissions. The comprehensive assessment of the uncertainty in the potential dose calculation in this article serves to provide the information necessary to evaluate the reliability of decisions based on the HTP. A framework for uncertainty analysis in multimedia risk assessment is proposed and evaluated with four types of uncertainty. Parameter uncertainty is assessed through Monte Carlo analysis. The variability in landscape parameters is assessed through a comparison of potential dose calculations for different regions in the United States. Decision rule uncertainty is explored through a comparison of the HTP values under open and closed system boundaries. Model uncertainty is evaluated through two case studies, one using alternative formulations for calculating the plant concentration and the other testing the steady state assumption for wet deposition. This investigation shows that steady state conditions for the removal of chemicals from the atmosphere are not appropriate and result in an underestimate of the potential dose for 25% of the 336 chemicals evaluated.

14.
System unavailabilities for large complex systems such as nuclear power plants are often evaluated through use of fault tree analysis. The system unavailability is obtained from a Boolean representation of a system fault tree. Even after truncation of higher order terms these expressions can be quite large, involving thousands of terms. A general matrix notation is proposed for the representation of Boolean expressions which facilitates uncertainty and sensitivity analysis calculations.
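A matrix representation along these general lines can be sketched as follows, using the rare-event approximation and an illustrative three-event, two-cut-set system; the article's specific notation may differ, but the example shows how the top event probability and its sensitivities follow directly from a cut-set/basic-event incidence matrix.

```python
# Minimal sketch of a matrix representation of a Boolean (minimal cut set)
# expression that streamlines sensitivity calculations.
# Assumptions: rare-event approximation and an illustrative three-event,
# two-cut-set system; the article's specific matrix notation may differ.
import numpy as np

# Rows = minimal cut sets, columns = basic events A, B, C.
# Cut sets: {A, B} and {C}.
M = np.array([[1, 1, 0],
              [0, 0, 1]])

q = np.array([1e-3, 5e-3, 1e-6])          # basic event unavailabilities

cut_probs = np.prod(np.where(M == 1, q, 1.0), axis=1)   # product over each row
q_top = cut_probs.sum()                    # rare-event approximation
print("top event unavailability:", q_top)

# Sensitivities dQ_top/dq_i (Birnbaum-style importance) come straight from M:
# for each event, sum the probabilities of the cut sets containing it, divided by q_i.
sens = (M * cut_probs[:, None] / q).sum(axis=0)
print("dQ_top/dq_i for A, B, C:", sens)
```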

15.
Currently, there is a trend away from the use of single (often conservative) estimates of risk to summarize the results of risk analyses in favor of stochastic methods which provide a more complete characterization of risk. The use of such stochastic methods leads to a distribution of possible values of risk, taking into account both uncertainty and variability in all of the factors affecting risk. In this article, we propose a general framework for the analysis of uncertainty and variability for use in the commonly encountered case of multiplicative risk models, in which risk may be expressed as a product of two or more risk factors. Our analytical methods facilitate the evaluation of overall uncertainty and variability in risk assessment, as well as the contributions of individual risk factors to both uncertainty and variability, which is cumbersome using Monte Carlo methods. The use of these methods is illustrated in the analysis of potential cancer risks due to the ingestion of radon in drinking water.
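For independent lognormal risk factors, the analytic propagation is straightforward: the log-variances add, so the geometric standard deviation of the product and each factor's share of the total log-variance are available in closed form. The sketch below uses hypothetical values for a radon-in-drinking-water style calculation and illustrates only this general lognormal result, not the paper's specific framework.

```python
# Minimal sketch of analytic uncertainty propagation through a multiplicative
# risk model with independent lognormal factors.
# Assumptions: hypothetical geometric means / geometric standard deviations for
# a radon-in-drinking-water style calculation.
import numpy as np

# Each factor: (geometric mean, geometric standard deviation).
factors = {"radon concentration": (100.0, 2.0),   # e.g., Bq/L
           "water intake":        (1.0,   1.5),   # L/day
           "risk factor":         (1e-7,  2.5)}   # risk per unit exposure

gm = np.prod([g for g, _ in factors.values()])
log_vars = {name: np.log(s)**2 for name, (_, s) in factors.items()}
gsd = np.exp(np.sqrt(sum(log_vars.values())))

print(f"risk: geometric mean = {gm:.2e}, geometric SD = {gsd:.2f}")
print(f"95th percentile risk ~ {gm * gsd**1.645:.2e}")

# Contribution of each factor to the variance of ln(risk):
total = sum(log_vars.values())
for name, lv in log_vars.items():
    print(f"  {name}: {lv / total:.1%} of the log-variance")
```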

16.
Assessments of public perceptions of the characteristics of a nuclear power plant accident and affective responses to its likelihood were conducted 5 months before and 1 month after the Chernobyl accident. Analyses of data from 69 residents of southwestern Washington showed significant test-retest correlations for only 10 of 18 variables: accident likelihood, three measures of impact characteristics, three measures of affective reactions, and hazard knowledge by governmental sources. Of these variables, only two had significant changes in mean ratings; frequency of thought and frequency of discussion about a nearby nuclear power plant both increased. There were also significant changes for two personal consequences (expectations of cancer and genetic effects), both of which decreased. The results of this study indicate that more attention should be given to assessing the stability of risk perceptions over time. Moreover, the data demonstrate that experience with a major accident can actually decrease rather than increase perceptions of threat.

17.
In risk analysis, the treatment of the epistemic uncertainty associated with the probability of occurrence of an event is fundamental. Traditionally, probabilistic distributions have been used to characterize the epistemic uncertainty due to imprecise knowledge of the parameters in risk models. On the other hand, it has been argued that in certain instances such uncertainty may be best accounted for by fuzzy or possibilistic distributions. This seems to be the case in particular for parameters for which the available information is scarce and qualitative in nature. In practice, it is to be expected that a risk model contains some parameters affected by uncertainties that may be best represented by probability distributions and some other parameters that may be more properly described in terms of fuzzy or possibilistic distributions. In this article, a hybrid method that jointly propagates probabilistic and possibilistic uncertainties is considered and compared with pure probabilistic and pure fuzzy methods for uncertainty propagation. The analyses are carried out on a case study concerning the uncertainties in the probabilities of occurrence of accident sequences in an event tree analysis of a nuclear power plant.
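A simplified version of such a hybrid scheme is sketched below for a toy risk model r = p1 * p2, where p1 is sampled from a lognormal distribution and p2 is described by a triangular possibility distribution whose alpha-cuts are propagated by interval analysis inside the Monte Carlo loop; the distributions and the model are illustrative assumptions, not the event tree of the case study.

```python
# Minimal sketch of hybrid probabilistic/possibilistic uncertainty propagation.
# Assumptions: toy risk model r = p1 * p2; p1 lognormal (probabilistic), p2
# triangular possibility distribution; alpha-cuts propagated by interval
# analysis inside a Monte Carlo loop over p1.
import numpy as np

rng = np.random.default_rng(7)
alphas = np.linspace(0.0, 1.0, 11)
low, mode, high = 1e-4, 5e-4, 2e-3            # triangular possibility for p2

def alpha_cut(a):
    # Interval of p2 values whose possibility is at least a.
    return low + a * (mode - low), high - a * (high - mode)

n = 1000
lower = np.empty((n, alphas.size))
upper = np.empty((n, alphas.size))
for i in range(n):
    p1 = rng.lognormal(np.log(1e-3), 0.5)     # probabilistic parameter
    for j, a in enumerate(alphas):
        lo, hi = alpha_cut(a)
        lower[i, j] = p1 * lo                 # r is increasing in p2, so the
        upper[i, j] = p1 * hi                 # interval endpoints map directly

# Epistemic summary: bounds on the 95th percentile of risk at two alpha levels.
for j, a in [(0, 0.0), (-1, 1.0)]:
    lo95, hi95 = np.percentile(lower[:, j], 95), np.percentile(upper[:, j], 95)
    print(f"alpha = {a:.1f}: 95th-percentile risk in [{lo95:.2e}, {hi95:.2e}]")
```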

18.
Mitchell J. Small. Risk Analysis, 2011, 31(10): 1561-1575
A methodology is presented for assessing the information value of an additional dosage experiment in existing bioassay studies. The analysis demonstrates the potential reduction in the uncertainty of toxicity metrics derived from expanded studies, providing insights for future studies. Bayesian methods are used to fit alternative dose-response models using Markov chain Monte Carlo (MCMC) simulation for parameter estimation and Bayesian model averaging (BMA) is used to compare and combine the alternative models. BMA predictions for benchmark dose (BMD) are developed, with uncertainty in these predictions used to derive the lower bound BMDL. The MCMC and BMA results provide a basis for a subsequent Monte Carlo analysis that backcasts the dosage where an additional test group would have been most beneficial in reducing the uncertainty in the BMD prediction, along with the magnitude of the expected uncertainty reduction. Uncertainty reductions are measured in terms of reduced interval widths of predicted BMD values and increases in BMDL values that occur as a result of this reduced uncertainty. The methodology is illustrated using two existing data sets for TCDD carcinogenicity, fitted with two alternative dose-response models (logistic and quantal-linear). The example shows that an additional dose at a relatively high value would have been most effective for reducing the uncertainty in BMA BMD estimates, with predicted reductions in the widths of uncertainty intervals of approximately 30%, and expected increases in BMDL values of 5–10%. The results demonstrate that dose selection for studies that subsequently inform dose-response models can benefit from consideration of how these models will be fit, combined, and interpreted.
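The model-averaging step can be sketched with maximum-likelihood fits and BIC-based weights in place of full MCMC, an illustrative quantal bioassay data set, and a 10% extra-risk benchmark response; this is only a rough stand-in for the Bayesian MCMC/BMA machinery described in the article.

```python
# Minimal sketch of benchmark dose (BMD) estimation with two dose-response
# models combined by approximate Bayesian model averaging.
# Assumptions: an illustrative quantal bioassay data set, maximum-likelihood
# fits with BIC-based weights instead of MCMC/formal BMA, and a 10% extra-risk
# benchmark response.
import numpy as np
from scipy.optimize import minimize, brentq

dose = np.array([0.0, 1.0, 3.0, 10.0])   # illustrative dose groups
n    = np.array([50,  50,  50,  50])     # animals per group
k    = np.array([ 2,   5,  12,  30])     # responders

def p_quantal_linear(d, theta):
    g, b = theta
    return g + (1.0 - g) * (1.0 - np.exp(-b * d))

def p_logistic(d, theta):
    a, b = theta
    return 1.0 / (1.0 + np.exp(-(a + b * d)))

def neg_loglik(theta, p_model):
    p = np.clip(p_model(dose, theta), 1e-9, 1 - 1e-9)
    return -np.sum(k * np.log(p) + (n - k) * np.log(1.0 - p))

fits = {}
for name, p_model, x0 in [("quantal-linear", p_quantal_linear, (0.05, 0.1)),
                          ("logistic",       p_logistic,       (-3.0, 0.3))]:
    res = minimize(neg_loglik, x0, args=(p_model,), method="Nelder-Mead")
    bic = 2.0 * res.fun + len(x0) * np.log(n.sum())
    fits[name] = (p_model, res.x, bic)

def bmd(p_model, theta, bmr=0.10):
    # Dose giving 10% extra risk: (P(d) - P(0)) / (1 - P(0)) = bmr.
    p0 = p_model(0.0, theta)
    return brentq(lambda d: (p_model(d, theta) - p0) / (1.0 - p0) - bmr, 1e-6, 1e3)

bics = np.array([b for _, _, b in fits.values()])
weights = np.exp(-(bics - bics.min()) / 2.0)
weights /= weights.sum()
bmds = np.array([bmd(m, t) for m, t, _ in fits.values()])
for name, w, b in zip(fits, weights, bmds):
    print(f"{name:15s} weight = {w:.2f}   BMD = {b:.2f}")
print("model-averaged BMD:", float(np.dot(weights, bmds)))
```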

19.
Hybrid Processing of Stochastic and Subjective Uncertainty Data
Uncertainty analyses typically recognize separate stochastic and subjective sources of uncertainty, but do not systematically combine the two, although a large amount of data used in analyses is partly stochastic and partly subjective. We have developed methodology for mathematically combining stochastic and subjective sources of data uncertainty, based on new "hybrid number" approaches. The methodology can be utilized in conjunction with various traditional techniques, such as PRA (probabilistic risk assessment) and risk analysis decision support. Hybrid numbers have been previously examined as a potential method to represent combinations of stochastic and subjective information, but mathematical processing has been impeded by the requirements inherent in the structure of the numbers, e.g., there was no known way to multiply hybrids. In this paper, we will demonstrate methods for calculating with hybrid numbers that avoid the difficulties. By formulating a hybrid number as a probability distribution that is only fuzzily known, or alternatively as a random distribution of fuzzy numbers, methods are demonstrated for the full suite of arithmetic operations, permitting complex mathematical calculations. It will be shown how information about relative subjectivity (the ratio of subjective to stochastic knowledge about a particular datum) can be incorporated. Techniques are also developed for conveying uncertainty information visually, so that the stochastic and subjective components of the uncertainty, as well as the ratio of knowledge about the two, are readily apparent. The techniques demonstrated have the capability to process uncertainty information for independent, uncorrelated data, and for some types of dependent and correlated data. Example applications are suggested, illustrative problems are shown, and graphical results are given.

20.
Standard statistical methods understate the uncertainty one should attach to effect estimates obtained from observational data. Among the methods used to address this problem are sensitivity analysis, Monte Carlo risk analysis (MCRA), and Bayesian uncertainty assessment. Estimates from MCRAs have been presented as if they were valid frequentist or Bayesian results, but examples show that they need not be either in actual applications. It is concluded that both sensitivity analyses and MCRA should begin with the same type of prior specification effort as Bayesian analysis.
