Similar Articles
20 similar articles found.
1.
Risk Analysis, 2018, 38(8): 1576-1584
Fault trees are used in reliability modeling to create logical models of fault combinations that can lead to undesirable events. The output of a fault tree analysis (the top event probability) is expressed in terms of the failure probabilities of basic events that are input to the model. Typically, the basic event probabilities are not known exactly but are modeled as probability distributions; the top event probability is therefore also represented as an uncertainty distribution. Monte Carlo methods are generally used for evaluating the uncertainty distribution, but such calculations are computationally intensive and do not readily reveal the dominant contributors to the uncertainty. In this article, a closed-form approximation for the fault tree top event uncertainty distribution is developed, which is applicable when the uncertainties in the basic events of the model are lognormally distributed. The results of the approximate method are compared with results from two sampling-based methods: the Monte Carlo method and the Wilks method based on order statistics. It is shown that the closed-form expression can provide a reasonable approximation to results obtained by Monte Carlo sampling, without incurring the computational expense. The Wilks method is found to be a useful means of providing an upper bound for the percentiles of the uncertainty distribution while being computationally inexpensive compared with full Monte Carlo sampling. The lognormal approximation method and Wilks's method appear to be attractive, practical alternatives for the evaluation of uncertainty in the output of fault trees and similar multilinear models.
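As a minimal sketch of the two sampling-based baselines discussed above (full Monte Carlo and the Wilks order-statistic bound), the following example propagates lognormal basic-event uncertainty through a small invented fault tree; the gate structure, medians, and error factors are assumptions for illustration, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical fault tree: top event = A OR (B AND C). Basic-event
# probabilities are lognormal, parameterized by median and error factor
# (EF = exp(1.645 * sigma)), a common PSA convention. All values invented.
def sample_basic_events(n):
    medians = np.array([1e-3, 5e-3, 2e-2])   # assumed medians
    efs = np.array([3.0, 5.0, 10.0])         # assumed error factors
    sigmas = np.log(efs) / 1.645
    return np.exp(np.log(medians) + sigmas * rng.standard_normal((n, 3)))

def top_event(p):
    a, b, c = p[:, 0], p[:, 1], p[:, 2]
    return 1.0 - (1.0 - a) * (1.0 - b * c)   # A OR (B AND C)

# Full Monte Carlo estimate of the 95th percentile of the top-event distribution.
p_top = top_event(sample_basic_events(100_000))
print("MC 95th percentile:", np.quantile(p_top, 0.95))

# Wilks (one-sided, first order): with n = 59 runs the sample maximum is a
# 95%/95% upper tolerance bound on the 95th percentile, since 1 - 0.95**59 > 0.95.
print("Wilks 95/95 bound :", top_event(sample_basic_events(59)).max())
```

The 59-run requirement follows directly from the order-statistic argument, which is what makes the Wilks bound cheap relative to full sampling.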

2.
A general discussion of knowledge dependence in risk calculations shows that the assumption of independence underlying standard Monte Carlo simulation in uncertainty analysis is frequently violated. A model is presented for performing Monte Carlo simulation when the variabilities of the component failure probabilities are either negatively or positively coupled. The model is applied to examples in human reliability analysis, and the results are compared with those published by Sandia Laboratories in the Peer Review Study and with recalculations using more recent methods of uncertainty analysis.
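One standard way to induce the positive or negative coupling the abstract describes, while leaving each marginal distribution untouched, is a Gaussian copula. The sketch below is an illustration of that general technique, not the paper's specific model; the marginals and correlation values are placeholders.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 50_000

# Assumed lognormal marginals for two component failure probabilities.
marg1 = stats.lognorm(s=0.8, scale=1e-3)
marg2 = stats.lognorm(s=0.8, scale=2e-3)

def coupled_sample(rho):
    """Gaussian copula: correlate in normal space, keep the marginals."""
    cov = [[1.0, rho], [rho, 1.0]]
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    u = stats.norm.cdf(z)                      # dependent uniforms
    return marg1.ppf(u[:, 0]), marg2.ppf(u[:, 1])

for rho in (-0.8, 0.0, 0.8):                   # negative, independent, positive coupling
    p1, p2 = coupled_sample(rho)
    p_and = p1 * p2                            # e.g., two components in parallel (AND)
    print(f"rho={rho:+.1f}  mean={p_and.mean():.3e}  "
          f"95th pct={np.quantile(p_and, 0.95):.3e}")
```

Running this shows the usual effect: positive coupling fattens the upper tail of the combined failure probability relative to the independence assumption.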

3.
Traditionally, microbial risk assessors have used point estimates to evaluate the probability that an individual will become infected. We developed a quantitative approach that shifts the risk characterization perspective from point estimate to distributional estimate, and from individual to population. To this end, we first designed and implemented a dynamic model that tracks traditional epidemiological variables such as the numbers of susceptible, infected, diseased, and immune individuals, and environmental variables such as pathogen density. Second, we used a simulation methodology that explicitly acknowledges the uncertainty and variability associated with the data. Specifically, the approach consists of assigning probability distributions to each parameter, sampling from these distributions for Monte Carlo simulations, and using a binary classification to assess the output of each simulation. A case study is presented that explores the uncertainties in assessing the risk of giardiasis when swimming in a recreational impoundment using reclaimed water. Using literature-based information to assign parameter ranges, our analysis demonstrated that the parameter describing the shedding of pathogens by infected swimmers contributed most to the uncertainty in risk. The importance of other parameters depended on reducing the a priori range of this shedding parameter. When the shedding parameter was constrained to its lower subrange, treatment efficiency was the parameter most important in predicting whether a simulation resulted in prevalences above or below nonoutbreak levels, whereas parameters associated with human exposure were important when the shedding parameter was constrained to its higher subrange. This Monte Carlo simulation technique identified conditions in which outbreaks and/or nonoutbreaks are likely and identified the parameters that contributed most to the uncertainty associated with a risk prediction.
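The sample-then-classify loop described above can be sketched with a deliberately toy susceptible/infected model; the dynamics, parameter ranges, and the outbreak threshold below are all invented for illustration and bear no relation to the giardiasis case study's actual values.

```python
import numpy as np

rng = np.random.default_rng(1)

def run_once(shed, treat_eff, dose_resp, n_days=120):
    """Toy susceptible/infected loop with pathogen shedding into the water."""
    S, I = 1000.0, 5.0
    density, total_inf = 0.0, 0.0
    for _ in range(n_days):
        density = density * 0.9 + shed * I * (1.0 - treat_eff)  # decay + shedding
        p_inf = 1.0 - np.exp(-dose_resp * density)              # exponential dose-response
        new_inf = p_inf * S * 0.1            # assume 10% of susceptibles swim daily
        S, I = S - new_inf, I * 0.8 + new_inf                   # 0.2/day recovery
        total_inf += new_inf
    return total_inf

# Sample uncertain parameters from assumed ranges, then classify each run.
n_sim = 2_000
shed = rng.uniform(0.1, 10.0, n_sim)         # shedding rate (wide a priori range)
treat = rng.uniform(0.90, 0.999, n_sim)      # treatment efficiency
dr = rng.uniform(0.005, 0.02, n_sim)         # dose-response parameter
outbreak = np.array([run_once(s, t, d) > 100 for s, t, d in zip(shed, treat, dr)])
print("fraction of simulations classified as outbreaks:", outbreak.mean())
```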

4.
A risk assessment was performed to incorporate uncertainty in food processing conditions into a risk-based sterilization process design. The focus of this analysis was the uncertainty associated with heterogeneous food products. Quartered button mushrooms were chosen as the food product because they represent the most typical case. A model for sterilization of spherical particles was utilized, and each parameter's uncertainty was characterized for use in Monte Carlo simulation. Various particle distributions and fluid types were compared. The output of the model was the sterilization time required to achieve the target sterilization conditions with 95% probability. This value was then used to determine the mean fluid velocity for a given tube length. Finally, the output from the model was analyzed to determine the confidence in the output given the uncertainty in the input parameters. The model was more sensitive to variation in particle size distribution than to fluid type for power-law fluids. The 90% confidence interval spanned a holding-time range of 1 minute. At a 95% confidence level that only 8% of the data would fall below the target sterilization conditions, a maximum of 9% of the data were expected to reach double the target level. The results of such an analysis would be useful for management decisions concerning the design of aseptic food processing operations.
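The "required time with 95% probability" design logic is easy to mimic: sample the uncertain inputs, compute the per-draw required holding time, take the 95th percentile as the design value, and back out the mean fluid velocity from the tube length. The scaling law and every number in the sketch below are invented stand-ins for the paper's spherical-particle model.

```python
import numpy as np

rng = np.random.default_rng(31)
n = 50_000

# Toy stand-in for the particle sterilization model: required holding time
# scales with radius squared over thermal diffusivity, times a process factor.
radius = rng.normal(6e-3, 1e-3, n).clip(3e-3)   # particle radius, m (assumed)
alpha = rng.lognormal(np.log(1.5e-7), 0.1, n)   # thermal diffusivity, m^2/s (assumed)
lethality = rng.uniform(0.8, 1.2, n)            # dimensionless process factor (assumed)

t_required = lethality * radius**2 / alpha      # seconds (toy scaling law)
t_design = np.quantile(t_required, 0.95)        # meets the target with 95% probability
print(f"design holding time   : {t_design / 60:.1f} min")

tube_length = 30.0                               # m, assumed
print(f"max mean fluid velocity: {tube_length / t_design:.3f} m/s")
```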

5.
6.
This paper demonstrates a new methodology for probabilistic public health risk assessment using the first-order reliability method (FORM). The method provides the probability that incremental lifetime cancer risk exceeds a threshold level, together with a probabilistic sensitivity quantifying the relative impact of the uncertainty in each random variable on the exceedance probability. The approach is applied to a case study given by Thompson et al. (1) on cancer risk caused by ingestion of benzene-contaminated soil, and the results are compared with those of the Monte Carlo method. Parametric sensitivity analyses are conducted to assess the sensitivity of the probabilistic event with respect to the distribution parameters of the basic random variables, such as the mean and standard deviation. The technique is a novel approach to probabilistic risk assessment and can be used when Monte Carlo analysis is computationally expensive, such as when the simulated risk lies in the tail of the risk probability distribution.
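FORM in a nutshell: map the random variables to standard-normal space, find the design point (the closest point on the limit-state surface to the origin, at distance beta), and approximate the exceedance probability as Phi(-beta). The sketch below applies this to a made-up two-variable risk product; the lognormal parameters and threshold are assumptions, not the Thompson et al. case-study values.

```python
import numpy as np
from scipy import optimize, stats

# Toy limit state: risk R = C * IR with C, IR lognormal; the event of interest
# is R > r_thr. All parameter values are illustrative only.
mu = np.array([np.log(2e-6), np.log(1.0)])   # log-medians of C and IR
sigma = np.array([0.9, 0.4])
r_thr = 1e-5

def g(u):
    """Limit-state function in standard-normal space; g < 0 means exceedance."""
    x = np.exp(mu + sigma * u)               # map u back to physical variables
    return r_thr - x[0] * x[1]

# FORM: the design point is the point on g(u) = 0 closest to the origin.
res = optimize.minimize(lambda u: u @ u, x0=np.ones(2),
                        constraints={"type": "eq", "fun": g})
beta = np.sqrt(res.fun)                      # reliability index
print("FORM exceedance probability:", stats.norm.cdf(-beta))

# Crude Monte Carlo check of the same probability.
u = np.random.default_rng(3).standard_normal((200_000, 2))
r = np.exp(mu + sigma * u).prod(axis=1)
print("MC exceedance probability:  ", (r > r_thr).mean())
```

Because this toy limit state is linear in log space, FORM is exact here and the two numbers should agree closely; in general FORM is a first-order approximation.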

7.
Richard Genovesi. Risk Analysis, 2012, 32(12): 2182-2197
Drinking water supplies are at risk of contamination from a variety of physical, chemical, and biological sources. Ranked among these threats are hazardous material releases from leaking or improperly managed underground storage tanks located at municipal, commercial, and industrial facilities. To reduce the human health and environmental risks associated with the subsurface storage of hazardous materials, government agencies have taken a variety of legislative and regulatory actions that date back more than 25 years and include the establishment of rigorous equipment, technology, and operational requirements as well as facility-by-facility inspection and enforcement programs. Given a history of more than 470,000 underground storage tank releases nationwide, the U.S. Environmental Protection Agency continues to report that 7,300 new leaks were found in federal fiscal year 2008, while nearly 103,000 old leaks remain to be cleaned up. In this article, we report on an alternate evidence-based intervention approach for reducing potential releases from the storage of petroleum products (gasoline, diesel, kerosene, heating/fuel oil, and waste oil) in underground tanks at commercial facilities located in Rhode Island. The objective of this study was to evaluate whether a new regulatory model can be used as a cost-effective alternative to traditional facility-by-facility inspection and enforcement programs for underground storage tanks. We conclude that the alternative model, with its emphasis on technical assistance tools, can produce measurable improvements in compliance performance, is a cost-effective adjunct to traditional facility-by-facility inspection and enforcement programs, and has the potential to allow regulatory agencies to decrease their frequency of inspections among low-risk facilities without sacrificing compliance performance or increasing public health risks.

8.
Most public health risk assessments assume and combine a series of average, conservative, and worst-case values to derive a conservative point estimate of risk. This procedure has major limitations. This paper demonstrates a new methodology for extended uncertainty analyses in public health risk assessments using Monte Carlo techniques. The extended method begins, as some conventional methods do, with the preparation of a spreadsheet to estimate exposure and risk. This method, however, continues by modeling key inputs as random variables described by probability density functions (PDFs). Overall, the technique provides a quantitative way to estimate the probability distributions for exposure and health risks within the validity of the model used. As an example, this paper presents a simplified case study of children playing in soils contaminated with benzene and benzo(a)pyrene (BaP).
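The spreadsheet-with-PDFs idea reduces, in code, to sampling each input of the standard intake equation and propagating. The sketch below uses the usual form Risk = C x CF x IR x EF x ED x SF / (BW x AT); all distributions and the slope factor are chosen purely for illustration and are not the paper's values.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Intake-times-slope-factor spreadsheet form with key inputs as PDFs.
C  = rng.lognormal(np.log(1.0), 1.0, n)      # soil concentration, mg/kg (assumed)
IR = rng.triangular(50, 100, 200, n)         # soil ingestion rate, mg/day (assumed)
BW = rng.normal(16.0, 2.0, n)                # body weight, kg (assumed)
CF = 1e-6                                    # kg soil per mg soil
EF, ED, AT = 250.0, 6.0, 70.0 * 365.0        # days/yr, yr, averaging time in days
SF = 0.029                                   # hypothetical slope factor, (mg/kg-day)^-1

intake = C * CF * IR * EF * ED / (BW * AT)   # mg/(kg-day)
risk = SF * intake
print(f"median risk : {np.median(risk):.2e}")
print(f"95th pct    : {np.quantile(risk, 0.95):.2e}")
print(f"P(risk>1e-6): {(risk > 1e-6).mean():.3f}")
```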

9.
Discrete Probability Distributions for Probabilistic Fracture Mechanics
Recently, discrete probability distributions (DPDs) have been suggested for use in risk analysis calculations to simplify the numerical computations that must be performed to determine failure probabilities. Specifically, DPDs have been developed to investigate probabilistic functions, that is, functions whose exact form is uncertain. The analysis of defect growth in materials by probabilistic fracture mechanics (PFM) models provides an example in which probabilistic functions play an important role. This paper compares and contrasts Monte Carlo simulation and DPDs as tools for calculating material failure due to fatigue crack growth. For the problem studied, the DPD method takes approximately one-third the computation time of the Monte Carlo approach for comparable accuracy. It is concluded that the DPD method has considerable promise in the low-failure-probability calculations of importance in risk assessment. In contrast to Monte Carlo, the computation time for the DPD approach is relatively insensitive to the magnitude of the probability being estimated.
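The core DPD operation is simple: represent each uncertain quantity as a short list of (value, probability) pairs, combine two DPDs by enumerating all pairs under the arithmetic operation, and condense the result back to a manageable number of points. The sketch below shows that cycle on an invented fatigue-style quantity; the distributions and the cube-law relation are placeholders, not the paper's PFM model.

```python
import numpy as np

def dpd_from_samples(samples, n_bins=20):
    """Condense samples into a discrete probability distribution (DPD)."""
    counts, edges = np.histogram(samples, bins=n_bins)
    vals = 0.5 * (edges[:-1] + edges[1:])
    return vals, counts / counts.sum()

def dpd_combine(v1, p1, v2, p2, op, n_bins=20):
    """Combine two DPDs under a binary operation, then re-condense."""
    vals = op(v1[:, None], v2[None, :]).ravel()     # enumerate all value pairs
    probs = (p1[:, None] * p2[None, :]).ravel()
    hist, edges = np.histogram(vals, bins=n_bins, weights=probs)
    return 0.5 * (edges[:-1] + edges[1:]), hist / hist.sum()

# Toy fatigue-style quantity: cycles to failure N = A / stress**3, with both
# inputs uncertain. The distributions and the cube law are placeholders only.
rng = np.random.default_rng(5)
vA, pA = dpd_from_samples(rng.lognormal(np.log(1e12), 0.5, 10_000))
vS, pS = dpd_from_samples(rng.normal(100.0, 10.0, 10_000))

vN, pN = dpd_combine(vA, pA, vS, pS, lambda a, s: a / s**3)
print("P(N < 5e5 cycles) ~", pN[vN < 5e5].sum())
```

The combination step touches only 20 x 20 = 400 pairs regardless of how small the probability of interest is, which is the insensitivity to probability magnitude the abstract notes.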

10.
In the general framework of quantitative methods for natural-technological (NaTech) risk analysis, a specific methodology was developed for assessing risks caused by hazardous substances released due to earthquakes. The contribution of accidental scenarios initiated by seismic events to the overall industrial risk was assessed in three case studies derived from the actual plant layout of existing oil refineries. Several specific vulnerability models for different equipment classes were compared and assessed. The effect of differing structural resistances of process equipment on the final risk results was also investigated. The main factors influencing the final risk values were the models for equipment vulnerability and the assumptions regarding the reference damage states of the process equipment. The analysis of the case studies showed that in seismic zones the additional risk deriving from earthquake damage may be more than one order of magnitude higher than that associated with internal failure causes. Critical equipment was determined to be mainly pressurized tanks, even though atmospheric tanks were more vulnerable to loss of containment. Failure of minor process equipment with a limited hold-up of hazardous substances (such as pumps) was shown to have limited influence on the final values of the risk increase caused by earthquakes.

11.
Risks from exposure to contaminated land are often assessed with the aid of mathematical models. The current probabilistic approach is a considerable improvement on previous deterministic risk assessment practices, in that it attempts to characterize uncertainty and variability. However, some inputs continue to be assigned as precise numbers, while others are characterized as precise probability distributions. Such precision is hard to justify, and we show in this article how rounding errors and distribution assumptions can affect an exposure assessment. The outcomes of traditional deterministic point estimates and Monte Carlo simulations were compared with probability bounds analyses. Assigning all scalars as imprecise numbers (intervals prescribed by significant digits) added about one order of magnitude of uncertainty to the deterministic point estimate. Similarly, representing probability distributions as probability boxes added several orders of magnitude to the uncertainty of the probabilistic estimate. This indicates that the uncertainty in such assessments is actually much greater than currently reported. The article suggests that full disclosure of the uncertainty may facilitate decision making by opening up a negotiation window. In the risk analysis process, it is also an ethical obligation to clarify the boundary between the scientific and social domains.
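The "intervals prescribed by significant digits" idea can be made concrete with a few lines of interval arithmetic; the exposure equation and the reported values below are invented for illustration.

```python
import numpy as np

# A value reported to 2 significant digits as "1.2" could be anywhere in
# [1.15, 1.25). Propagating such intervals shows how much uncertainty the
# reported precision hides.
def interval_from_sigfigs(x, sigfigs):
    step = 10.0 ** (np.floor(np.log10(abs(x))) - sigfigs + 1)
    return (x - step / 2, x + step / 2)

def imul(a, b):   # interval multiplication (all-positive intervals assumed)
    return (a[0] * b[0], a[1] * b[1])

def idiv(a, b):   # interval division (all-positive intervals assumed)
    return (a[0] / b[1], a[1] / b[0])

# Toy exposure = C * IR / BW, each scalar reported to 2 significant digits.
C, IR, BW = 1.2, 0.15, 16.0        # illustrative point values
point = C * IR / BW
lo_hi = idiv(imul(interval_from_sigfigs(C, 2), interval_from_sigfigs(IR, 2)),
             interval_from_sigfigs(BW, 2))
print("point estimate:", point)
print("interval      :", lo_hi)
print("relative width:", (lo_hi[1] - lo_hi[0]) / point)
```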

12.
Dose-response models are the essential link between exposure assessment and computed risk values in quantitative microbial risk assessment, yet the uncertainty that computed risks inherit from estimating dose-response model parameters with limited epidemiological data is rarely quantified. Second-order risk characterization approaches that incorporate uncertainty in dose-response model parameters can provide more complete information to decision makers by separating variability and uncertainty to quantify the uncertainty in computed risks. Therefore, the objective of this work is to develop procedures for sampling from posterior distributions describing uncertainty in the parameters of exponential and beta-Poisson dose-response models using Bayes's theorem and Markov chain Monte Carlo (in OpenBUGS). The theoretical origins of the beta-Poisson dose-response model are used to identify a decomposed version of the model that enables Bayesian analysis without the need to evaluate Kummer confluent hypergeometric functions. Herein, it is also established that the beta distribution in the beta-Poisson dose-response model cannot address variation among individual pathogens, criteria to validate use of the conventional approximation to the beta-Poisson model are proposed, and simple algorithms to evaluate actual beta-Poisson probabilities of infection are investigated. The developed MCMC procedures are applied to the analysis of a case study data set, and it is demonstrated that an important region of the posterior distribution of the beta-Poisson dose-response model parameters is attributable to the absence of low-dose data. This region includes beta-Poisson models for which the conventional approximation is especially invalid and in which many beta distributions have an extreme shape of questionable plausibility.
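For reference, the exact beta-Poisson probability of infection is 1 - 1F1(alpha, alpha + beta, -dose), with the familiar 1 - (1 + dose/beta)^(-alpha) as the conventional approximation. The snippet below compares the two numerically; the parameter pairs are illustrative, chosen so that one roughly satisfies and one violates the usual validity condition (beta much larger than 1 and than alpha).

```python
import numpy as np
from scipy.special import hyp1f1

def beta_poisson_exact(dose, alpha, beta):
    """Exact beta-Poisson: P(inf) = 1 - 1F1(alpha, alpha + beta, -dose)."""
    return 1.0 - hyp1f1(alpha, alpha + beta, -dose)

def beta_poisson_approx(dose, alpha, beta):
    """Conventional approximation, valid roughly when beta >> 1 and beta >> alpha."""
    return 1.0 - (1.0 + dose / beta) ** (-alpha)

doses = np.logspace(-1, 3, 5)
# One parameter pair satisfying the validity condition, one violating it.
for alpha, beta in [(0.2, 500.0), (2.0, 0.5)]:
    print(f"alpha={alpha}, beta={beta}")
    for d in doses:
        print(f"  dose={d:8.1f}  exact={beta_poisson_exact(d, alpha, beta):.4f}"
              f"  approx={beta_poisson_approx(d, alpha, beta):.4f}")
```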

13.
A screening approach is developed for volatile organic compounds (VOCs) to estimate exposures that correspond to levels measured in fluids and/or tissues in human biomonitoring studies. The approach makes use of a generic physiologically-based pharmacokinetic (PBPK) model coupled with exposure pattern characterization, Monte Carlo analysis, and quantitative structure property relationships (QSPRs). QSPRs are used for VOCs with minimal data to develop chemical-specific parameters needed for the PBPK model. The PBPK model is capable of simulating VOC kinetics following multiple routes of exposure, such as oral exposure via water ingestion and inhalation exposure during shower events. Using published human biomonitoring data of trichloroethylene (TCE), the generic model is evaluated to determine how well it estimates TCE concentrations in blood based on the known drinking water concentrations. In addition, Monte Carlo analysis is conducted to characterize the impact of the following factors: (1) uncertainties in the QSPR-estimated chemical-specific parameters; (2) variability in physiological parameters; and (3) variability in exposure patterns. The results indicate that uncertainty in chemical-specific parameters makes only a minor contribution to the overall variability and uncertainty in the predicted TCE concentrations in blood. The model is used in a reverse dosimetry approach to derive estimates of TCE concentrations in drinking water based on given measurements of TCE in blood, for comparison to the U.S. EPA's Maximum Contaminant Level in drinking water. This example demonstrates how a reverse dosimetry approach can be used to facilitate interpretation of human biomonitoring data in a health risk context by deriving external exposures that are consistent with a biomonitoring data set, thereby permitting comparison with health-based exposure guidelines.
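Reverse dosimetry in miniature: given any forward model from water concentration to blood concentration, Monte Carlo draws of the physiological inputs turn one measured blood level into a distribution of consistent water concentrations. The one-compartment steady-state relation below is a toy stand-in for the generic PBPK model, and every parameter value (including the measured level and the MCL-like limit) is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(23)
n = 50_000

# Toy steady state: C_blood = C_water * intake * f_abs / clearance, so
# reverse dosimetry solves for C_water given C_blood and sampled physiology.
intake = rng.normal(2.0, 0.4, n).clip(0.5)          # L water/day (variability)
f_abs = rng.uniform(0.7, 1.0, n)                    # absorbed fraction (uncertainty)
clearance = rng.lognormal(np.log(300.0), 0.3, n)    # L blood cleared/day (assumed)

c_blood_measured = 0.05                             # ug/L, hypothetical biomonitoring value
c_water = c_blood_measured * clearance / (intake * f_abs)   # ug/L

print("median inferred water conc:", np.median(c_water))
print("95% interval:", np.quantile(c_water, [0.025, 0.975]))
print("P(exceeds 5 ug/L limit):", (c_water > 5.0).mean())
```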

14.
This paper presents an approach for characterizing the probability of adverse effects occurring in a population exposed to dose rates in excess of the Reference Dose (RfD). The approach uses a linear threshold ("hockey stick") model of response and is based on the current system of uncertainty factors used in setting RfDs. The approach requires generally available toxicological estimates such as No-Observed-Adverse-Effect Levels (NOAELs) or benchmark doses, and doses at which adverse effects are observed in 50% of the test animals (ED50s). In this approach, Monte Carlo analysis is used to characterize the uncertainty in the dose-response slope based on the range and magnitude of the key sources of uncertainty in setting protective doses. The method does not require information on the shape of the dose-response curve for specific chemicals, but is amenable to the inclusion of such data. The approach is applied to four compounds to produce estimates of response rates for dose rates greater than the RfD.
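A hockey-stick model has a particularly compact Monte Carlo form: sample an uncertain population threshold and ED50, which together fix the linear slope, then read off the response at any multiple of the RfD. The distributions and the RfD below are invented placeholders, not the paper's compound-specific values.

```python
import numpy as np

rng = np.random.default_rng(11)
n = 100_000

# Hockey-stick model: zero response below an uncertain population threshold,
# then linear so that response reaches 50% at the uncertain ED50.
rfd = 0.005                                           # mg/kg-day, assumed
threshold = rng.lognormal(np.log(0.05), 0.8, n)       # uncertain threshold dose
ed50 = threshold * (1.0 + rng.lognormal(np.log(9.0), 0.5, n))  # always > threshold

def response(dose):
    slope = 0.5 / (ed50 - threshold)                  # rises to 50% response at ED50
    return np.clip((dose - threshold) * slope, 0.0, 1.0)

for multiple in (1, 10, 100):                         # dose rates above the RfD
    r = response(multiple * rfd)
    print(f"dose = {multiple:>3} x RfD: mean response = {r.mean():.4f}, "
          f"P(response > 1%) = {(r > 0.01).mean():.3f}")
```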

15.
A method is proposed for integrated probabilistic risk assessment in which exposure assessment and hazard characterization are both treated probabilistically. The aim is to specify the probability that a random individual from a defined (sub)population will have an exposure high enough to cause a particular health effect of a predefined magnitude, the critical effect size (CES). The exposure level that results in exactly that CES in a particular person is that person's individual critical effect dose (ICED). Individuals in a population typically show variation, both in their individual exposure (IEXP) and in their ICED. Both the variation in IEXP and the variation in ICED are quantified in the form of probability distributions. Assuming independence between the two distributions, they are combined (by Monte Carlo) into a distribution of the individual margin of exposure (IMoE). The proportion of the IMoE distribution below unity is the probability of critical exposure (PoCE) in the particular (sub)population. Uncertainties involved in the overall risk assessment (i.e., regarding both exposure and effect assessment) are quantified using Monte Carlo and bootstrap methods. This results in an uncertainty distribution for any statistic of interest, such as the PoCE. The method is illustrated with data for the case of dietary exposure to the organophosphate acephate. We present plots that concisely summarize the probabilistic results, retaining the distinction between variability and uncertainty. We show how the relative contributions from the various sources of uncertainty involved may be quantified.
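The IMoE/PoCE calculation itself is a one-liner once the variability distributions are in hand; the sketch below also wraps it in a crude parametric uncertainty loop as a stand-in for the paper's bootstrap. All distributions and parameter-uncertainty magnitudes are invented, not the acephate values.

```python
import numpy as np

rng = np.random.default_rng(13)
n = 200_000

# Variability distributions (illustrative): individual exposure IEXP and
# individual critical effect dose ICED, lognormal and independent.
iexp = rng.lognormal(mean=np.log(0.01), sigma=1.0, size=n)   # mg/kg bw/day
iced = rng.lognormal(mean=np.log(0.50), sigma=0.6, size=n)   # mg/kg bw/day

imoe = iced / iexp                        # individual margin of exposure
print("PoCE (point estimate):", (imoe < 1.0).mean())

# Uncertainty loop: perturb the distribution parameters and recompute PoCE,
# a simple parametric stand-in for the paper's bootstrap of fitted data.
boot = []
for _ in range(200):
    mu_e = np.log(0.01) + rng.normal(0, 0.1)   # assumed parameter uncertainty
    mu_d = np.log(0.50) + rng.normal(0, 0.1)
    ie = rng.lognormal(mu_e, 1.0, 20_000)
    ic = rng.lognormal(mu_d, 0.6, 20_000)
    boot.append((ic / ie < 1.0).mean())
print("PoCE 90% uncertainty interval:", np.quantile(boot, [0.05, 0.95]))
```

Keeping the outer (uncertainty) and inner (variability) loops separate is what preserves the variability/uncertainty distinction the abstract emphasizes.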

16.
A wide range of uncertainties is inevitably introduced during a safety assessment of engineering systems. The impact of all these uncertainties must be addressed if the analysis is to serve as a tool in the decision-making process. Uncertainties present in the components of a model (input parameters or basic events) are propagated to quantify their impact on the final results. Several methods are available in the literature, namely, the method of moments, discrete probability analysis, Monte Carlo simulation, fuzzy arithmetic, and Dempster-Shafer theory. The methods differ both in how uncertainty is characterized at the component level and in how it is propagated to the system level, and each has desirable and undesirable features that make it more or less useful in different situations. In the probabilistic framework, which is the most widely used, probability distributions are used to characterize uncertainty. However, in situations in which one cannot specify (1) parameter values for input distributions, (2) precise probability distributions (shape), and (3) dependencies between input parameters, these methods have limitations and are not effective. To address some of these limitations, the article presents uncertainty analysis in the context of level-1 probabilistic safety assessment (PSA) based on a probability bounds (PB) approach. PB analysis combines probability theory and interval arithmetic to produce probability boxes (p-boxes), structures that allow comprehensive and rigorous propagation through calculations. A practical case study is also carried out with the code developed for the PB approach, and the results are compared with those of a two-phase Monte Carlo simulation.
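A p-box arises naturally when a distribution parameter is only known to an interval: the box is the envelope of every CDF in the resulting family, and percentile bounds fall directly out of the envelope. The sketch below builds such a box for a lognormal failure probability with an interval-valued median; all numbers are illustrative.

```python
import numpy as np
from scipy import stats

# P-box for a lognormal failure probability whose median is only known to
# lie in an interval: envelope of all CDFs in the family.
medians = np.linspace(1e-4, 5e-4, 25)     # interval of candidate medians (assumed)
sigma = 0.7                               # assumed known log-standard deviation
x = np.logspace(-6, -2, 200)

cdfs = np.array([stats.lognorm(s=sigma, scale=m).cdf(x) for m in medians])
lower_cdf, upper_cdf = cdfs.min(axis=0), cdfs.max(axis=0)

# Percentile bounds follow from the envelope: the 95th percentile lies
# between the first x-values where the upper and lower CDFs cross 0.95.
lo = x[np.searchsorted(upper_cdf, 0.95)]
hi = x[np.searchsorted(lower_cdf, 0.95)]
print(f"95th percentile bounded by [{lo:.2e}, {hi:.2e}]")
```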

17.
Mitchell J. Small. Risk Analysis, 2011, 31(10): 1561-1575
A methodology is presented for assessing the information value of an additional dosage experiment in existing bioassay studies. The analysis demonstrates the potential reduction in the uncertainty of toxicity metrics derived from expanded studies, providing insights for future studies. Bayesian methods are used to fit alternative dose-response models, with Markov chain Monte Carlo (MCMC) simulation for parameter estimation, and Bayesian model averaging (BMA) is used to compare and combine the alternative models. BMA predictions for the benchmark dose (BMD) are developed, and the uncertainty in these predictions is used to derive the lower bound, the BMDL. The MCMC and BMA results provide a basis for a subsequent Monte Carlo analysis that backcasts the dosage at which an additional test group would have been most beneficial in reducing the uncertainty in the BMD prediction, along with the magnitude of the expected uncertainty reduction. Uncertainty reductions are measured in terms of reduced interval widths of predicted BMD values and increases in BMDL values that result from this reduced uncertainty. The methodology is illustrated using two existing data sets for TCDD carcinogenicity, fitted with two alternative dose-response models (logistic and quantal-linear). The example shows that an additional dose at a relatively high value would have been most effective for reducing the uncertainty in BMA BMD estimates, with predicted reductions in the widths of uncertainty intervals of approximately 30% and expected increases in BMDL values of 5-10%. The results demonstrate that dose selection for studies that subsequently inform dose-response models can benefit from consideration of how these models will be fit, combined, and interpreted.
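The BMD-to-BMDL step is easy to see in code for the quantal-linear model the abstract mentions: for extra risk BMR, BMD = -ln(1 - BMR)/b, so posterior draws of b map directly to a BMD distribution whose lower percentile is the BMDL. The lognormal "posterior" below is a stand-in for real MCMC output, with invented values.

```python
import numpy as np

rng = np.random.default_rng(17)

# Quantal-linear model: P(d) = g + (1 - g) * (1 - exp(-b * d)).
# For extra risk BMR, the benchmark dose is BMD = -ln(1 - BMR) / b.
bmr = 0.10

# Stand-in for an MCMC posterior on b (as would come from fitting bioassay
# data); the lognormal shape and values are assumptions for illustration.
b_draws = rng.lognormal(mean=np.log(0.002), sigma=0.35, size=20_000)

bmd_draws = -np.log(1.0 - bmr) / b_draws
bmdl = np.quantile(bmd_draws, 0.05)          # lower 5th-percentile bound
print(f"BMD  (median) : {np.median(bmd_draws):8.2f}")
print(f"BMDL (5th pct): {bmdl:8.2f}")
print("uncertainty interval width:", np.quantile(bmd_draws, 0.95) - bmdl)
```

Narrowing the posterior on b (what an extra well-placed dose group would do) shrinks the BMD interval and pushes the BMDL upward, which is exactly the value-of-information effect the paper quantifies.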

18.
A. E. Ades, G. Lu. Risk Analysis, 2003, 23(6): 1165-1172
Monte Carlo simulation has become the accepted method for propagating parameter uncertainty through risk models. It is widely appreciated, however, that correlations between input variables must be taken into account if models are to deliver correct assessments of uncertainty in risk. Various two-stage methods have been proposed that first estimate a correlation structure and then generate Monte Carlo simulations, which incorporate this structure while leaving marginal distributions of parameters unchanged. Here we propose a one-stage alternative, in which the correlation structure is estimated from the data directly by Bayesian Markov chain Monte Carlo methods. Samples from the posterior distribution of the outputs then correctly reflect the correlation between parameters, given the data and the model. Besides its computational simplicity, this approach utilizes the available evidence from a wide variety of structures, including incomplete data and correlated and uncorrelated repeat observations. The major advantage of a Bayesian approach is that, rather than assuming the correlation structure is fixed and known, it captures the joint uncertainty induced by the data in all parameters, including variances and covariances, and correctly propagates this through the decision or risk model. These features are illustrated with examples on emissions of dioxin congeners from solid waste incinerators.
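A miniature of the one-stage idea, under strong simplifying assumptions: with a conjugate inverse-Wishart prior, the posterior of the covariance of two jointly observed parameters is available in closed form, and each posterior draw (covariance plus mean) can be pushed straight through a toy risk model so that output uncertainty reflects the estimated correlation. The data, prior settings, and the product-form "risk model" are all invented; the paper's general MCMC scheme handles far richer data structures.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(29)

# Hypothetical paired observations of two model parameters (e.g., two
# congener emission factors measured together). Values are invented.
data = rng.multivariate_normal([1.0, 2.0], [[0.04, 0.03], [0.03, 0.09]], size=30)
n, xbar = len(data), data.mean(axis=0)
S = (data - xbar).T @ (data - xbar)

# Conjugate inverse-Wishart posterior for the covariance; the weakly
# informative prior settings are guesses for the sketch.
post = stats.invwishart(df=4 + n, scale=np.eye(2) * 0.01 + S)

out = []
for _ in range(5_000):
    cov = post.rvs(random_state=rng)                 # joint draw: variances + covariance
    theta = rng.multivariate_normal(xbar, cov / n)   # posterior draw of the mean
    out.append(theta[0] * theta[1])                  # toy risk model: product of params
print("output mean        :", np.mean(out))
print("output 95% interval:", np.quantile(out, [0.025, 0.975]))
```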

19.
Jan F. Van Impe. Risk Analysis, 2011, 31(8): 1295-1307
The aim of quantitative microbiological risk assessment is to estimate the risk of illness caused by the presence of a pathogen in a food type, and to study the impact of interventions. Because of inherent variability and uncertainty, risk assessments are generally conducted stochastically, and it is advisable, where possible, to characterize variability separately from uncertainty. Sensitivity analysis makes it possible to identify the input variables to which the outcome of a quantitative microbiological risk assessment is most sensitive. Although a number of methods exist for applying sensitivity analysis to a risk assessment with probabilistic input variables (such as contamination, storage temperature, and storage duration), it is challenging to perform sensitivity analysis when a risk assessment includes a separate characterization of variability and uncertainty of input variables. A procedure is proposed that focuses on the relation between risk estimates obtained by Monte Carlo simulation and the location of pseudo-randomly sampled input variables within the uncertainty and variability distributions. Within this procedure, two methods, an ANOVA-like model and Sobol sensitivity indices, are used to obtain and compare the impact of the variability and the uncertainty of all input variables, as well as of model uncertainty and scenario uncertainty. As a case study, this methodology is applied to a risk assessment estimating the risk of contracting listeriosis from the consumption of deli meats.
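First-order Sobol indices have a brute-force estimator that makes the definition transparent: S_i = Var(E[Y|X_i]) / Var(Y), estimated with an outer loop over X_i and an inner loop over the remaining inputs. The toy risk model and distributions below are invented stand-ins for the contamination/temperature/duration inputs the abstract mentions; more efficient estimators (e.g., Saltelli's scheme) would be used in practice.

```python
import numpy as np

rng = np.random.default_rng(19)

# Toy risk model: log10(risk) depends on contamination, storage temperature,
# and storage duration, with a temperature-duration interaction (all invented).
def model(c, t, d):
    return c + 0.08 * t * d

def sample(n):
    return (rng.normal(2.0, 0.5, n),             # log10 contamination
            rng.normal(7.0, 2.0, n),             # storage temperature (C)
            rng.uniform(1.0, 10.0, n))           # storage duration (days)

# Brute-force first-order Sobol index: S_i = Var(E[Y|X_i]) / Var(Y).
def first_order_index(i, n_outer=500, n_inner=500):
    cond_means = np.empty(n_outer)
    xi_vals = sample(n_outer)[i]
    for k in range(n_outer):
        x = list(sample(n_inner))
        x[i] = np.full(n_inner, xi_vals[k])      # fix X_i, vary the others
        cond_means[k] = model(*x).mean()
    y = model(*sample(200_000))
    return cond_means.var() / y.var()

for i, name in enumerate(["contamination", "temperature", "duration"]):
    print(f"S_{name}: {first_order_index(i):.2f}")
```

Because the toy model has a temperature-duration interaction, the three first-order indices sum to less than one; the remainder is the interaction contribution a total-order index would pick up.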

20.
Hurricane Katrina struck an area dense with industry, causing numerous releases of petroleum and hazardous materials. This study integrates information from a number of sources to describe the frequency, causes, and effects of these releases in order to inform analysis of risk from future hurricanes. Over 200 onshore releases of hazardous chemicals, petroleum, or natural gas were reported. Storm surge was responsible for the majority of petroleum releases, and failure of storage tanks was the most common mechanism of release. Of the smaller number of hazardous chemical releases reported, many were associated with flaring during plant startup, shutdown, or process upset. In areas impacted by storm surge, 10% of the facilities in the Risk Management Plan (RMP) and Toxic Release Inventory (TRI) databases and 28% of SIC 1311 facilities experienced accidental releases. In areas subject only to hurricane-strength winds, a lower fraction (1% of RMP and TRI facilities and 10% of SIC 1311 facilities) experienced a release, while 1% of all facility types reported a release in areas that experienced tropical-storm-strength winds. Of the industrial facilities surveyed, more experienced indirect disruptions (55%), such as displacement of workers, loss of electricity and communication systems, and difficulty acquiring supplies and contractors for operations or reconstruction, than experienced releases. To reduce the risk of hazardous material releases and speed the return to normal operations under these difficult conditions, greater attention should be devoted to risk-based facility design and improved prevention and response planning.
