Similar Articles
 20 similar articles found (search time: 31 ms)
1.
Jan F. Van Impe, Risk Analysis, 2011, 31(8): 1295-1307
The aim of quantitative microbiological risk assessment is to estimate the risk of illness caused by the presence of a pathogen in a food type, and to study the impact of interventions. Because of inherent variability and uncertainty, risk assessments are generally conducted stochastically, and it is advised, where possible, to characterize variability separately from uncertainty. Sensitivity analysis indicates which input variables most influence the outcome of a quantitative microbiological risk assessment. Although a number of methods exist to apply sensitivity analysis to a risk assessment with probabilistic input variables (such as contamination, storage temperature, storage duration, etc.), it is challenging to perform sensitivity analysis when a risk assessment includes a separate characterization of variability and uncertainty of input variables. A procedure is proposed that focuses on the relation between risk estimates obtained by Monte Carlo simulation and the location of pseudo‐randomly sampled input variables within the uncertainty and variability distributions. Within this procedure, two methods are used—that is, an ANOVA‐like model and Sobol sensitivity indices—to obtain and compare the impact of variability and of uncertainty of all input variables, and of model uncertainty and scenario uncertainty. As a case study, this methodology is applied to a risk assessment estimating the risk of contracting listeriosis due to consumption of deli meats.
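First-order Sobol indices like those used above can be estimated with a simple pick-freeze scheme. The sketch below is generic, not the authors' listeriosis model: the toy model and all parameter values are assumptions chosen so the analytic indices (0.2 and 0.8) are known for checking.

```python
import numpy as np

def sobol_first_order(model, n, d, rng):
    """Pick-freeze estimator of first-order Sobol indices for a model
    with d independent Uniform(0, 1) inputs."""
    a = rng.random((n, d))
    b = rng.random((n, d))
    ya = model(a)
    var_y = ya.var()
    indices = []
    for i in range(d):
        ab = b.copy()
        ab[:, i] = a[:, i]  # freeze input i, resample all other inputs
        yab = model(ab)
        # Cov(Y_A, Y_AB) estimates Var(E[Y | X_i]); divide by Var(Y)
        indices.append(np.cov(ya, yab)[0, 1] / var_y)
    return np.array(indices)

rng = np.random.default_rng(1)
model = lambda x: x[:, 0] + 2.0 * x[:, 1]  # toy risk model (assumption)
s = sobol_first_order(model, 100_000, 2, rng)
# analytic first-order indices for this model: S1 = 0.2, S2 = 0.8
```

In a two-dimensional assessment the same estimator can be run separately on the uncertainty and the variability dimensions, which is the spirit of the comparison described in the abstract.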

2.
Currently, there is a trend away from the use of single (often conservative) estimates of risk to summarize the results of risk analyses in favor of stochastic methods, which provide a more complete characterization of risk. The use of such stochastic methods leads to a distribution of possible values of risk, taking into account both uncertainty and variability in all of the factors affecting risk. In this article, we propose a general framework for the analysis of uncertainty and variability for use in the commonly encountered case of multiplicative risk models, in which risk may be expressed as a product of two or more risk factors. Our analytical methods facilitate the evaluation of overall uncertainty and variability in risk assessment, as well as the contributions of individual risk factors to both uncertainty and variability, an evaluation that is cumbersome using Monte Carlo methods. The use of these methods is illustrated in the analysis of potential cancer risks due to the ingestion of radon in drinking water.
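One common analytic route for multiplicative models (an assumption here, not necessarily the authors' exact method) treats each factor as lognormal: the product is then exactly lognormal, with log-space mean and variance equal to the sums of the factor parameters. The sketch below uses illustrative parameters and verifies the analytic result by Monte Carlo.

```python
import numpy as np

# Illustrative three-factor model: risk = F1 * F2 * F3, each factor
# lognormal; the log-space parameters below are assumptions.
mus = np.array([-1.0, 0.5, -2.0])     # log-space means
sigmas = np.array([0.3, 0.5, 0.4])    # log-space standard deviations

# Analytic propagation: a product of lognormals is lognormal, so the
# log-risk mean and variance are just sums over the factors.
mu_total = mus.sum()
var_total = (sigmas ** 2).sum()

# Monte Carlo check of the analytic result
rng = np.random.default_rng(0)
log_risk = rng.normal(mus, sigmas, size=(200_000, 3)).sum(axis=1)
```

The analytic sums also expose each factor's share of the total log-variance directly, which is the "contribution of individual risk factors" the abstract refers to.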

3.
The uncertainty associated with estimates should be taken into account in quantitative risk assessment. Each input's uncertainty can be characterized through a probabilistic distribution for use in Monte Carlo simulations. In this study, the sampling uncertainty associated with estimating a low proportion on the basis of a small sample size was considered. A common application in microbial risk assessment is the estimation of prevalence, i.e., the proportion of contaminated food products, on the basis of few tested units. Three Bayesian approaches (based on beta(0, 0), beta(1/2, 1/2), and beta(1, 1) priors) and one frequentist approach (based on the frequentist confidence distribution) were compared and evaluated on the basis of simulations. For small samples, we demonstrated some differences between the four tested methods. We concluded that which method performs best depends on the true proportion of contaminated products, which is by definition unknown in common practice. When no prior information is available, we recommend the beta(1/2, 1/2) prior or the confidence distribution. To illustrate the importance of these differences, the four methods were used in an applied example. We performed two-dimensional Monte Carlo simulations to estimate the proportion of cold smoked salmon packs contaminated by Listeria monocytogenes, with one dimension representing within-factory uncertainty, modeled by each of the four studied methods, and the other dimension representing variability between companies.
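The three Bayesian options are easy to compare directly, since a Beta(a, b) prior with y positives out of n tested units gives a Beta(a + y, b + n - y) posterior. The sketch below uses illustrative counts (not the paper's data) and omits the frequentist confidence distribution.

```python
from scipy import stats

# Illustrative small sample: y contaminated units out of n tested.
y, n = 2, 25

# Conjugate update: Beta(a, b) prior -> Beta(a + y, b + n - y) posterior.
priors = {"Haldane beta(0,0)": (0.0, 0.0),
          "Jeffreys beta(1/2,1/2)": (0.5, 0.5),
          "uniform beta(1,1)": (1.0, 1.0)}
posteriors = {name: stats.beta(a + y, b + n - y)
              for name, (a, b) in priors.items()}

for name, post in posteriors.items():
    lo, hi = post.interval(0.95)
    print(f"{name}: mean={post.mean():.3f}, 95% CrI=({lo:.3f}, {hi:.3f})")
```

Sampling from these posteriors in the outer (uncertainty) loop of a two-dimensional Monte Carlo simulation is one way to propagate the prevalence uncertainty the abstract describes.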

4.
Standard statistical methods understate the uncertainty one should attach to effect estimates obtained from observational data. Among the methods used to address this problem are sensitivity analysis, Monte Carlo risk analysis (MCRA), and Bayesian uncertainty assessment. Estimates from MCRAs have been presented as if they were valid frequentist or Bayesian results, but examples show that they need not be either in actual applications. It is concluded that both sensitivity analyses and MCRA should begin with the same type of prior specification effort as Bayesian analysis.

5.
Monte Carlo simulations are commonplace in quantitative risk assessments (QRAs). Designed to propagate the variability and uncertainty associated with each individual exposure input parameter in a quantitative risk assessment, Monte Carlo methods statistically combine the individual parameter distributions to yield a single, overall distribution. Critical to such an assessment is the representativeness of each individual input distribution. The authors performed a literature review to collect and compare the distributions used in published QRAs for the parameters of body weight, food consumption, soil ingestion rates, breathing rates, and fluid intake. To provide a basis for comparison, all estimated exposure parameter distributions were evaluated with respect to four properties: consistency, accuracy, precision, and specificity. The results varied depending on the exposure parameter. Even where extensive, well-collected data exist, investigators used a variety of different distributional shapes to approximate these data. Where such data do not exist, investigators have collected their own data, often leading to substantial disparity in parameter estimates and subsequent choice of distribution. The present findings indicate that more attention must be paid to the data underlying these distributional choices. More emphasis should be placed on sensitivity analyses, quantifying the impact of assumptions, and on discussion of sources of variation as part of the presentation of any risk assessment results. If such practices and disclosures are followed, it is believed that Monte Carlo simulations can greatly enhance the accuracy and appropriateness of specific risk assessments. Without such disclosures, researchers will be increasing the size of the risk assessment "black box," a concern already raised by many critics of more traditional risk assessments.

6.
The appearance of measurement error in exposure and risk factor data potentially affects any inferences regarding variability and uncertainty because the distribution representing the observed data set deviates from the distribution that represents an error-free data set. A methodology for improving the characterization of variability and uncertainty with known measurement errors in data is demonstrated in this article based on an observed data set, known measurement error, and a measurement-error model. A practical method for constructing an error-free data set is presented, and a numerical method based upon bootstrap pairs, incorporating two-dimensional Monte Carlo simulation, is introduced to address uncertainty arising from measurement error in selected statistics. When measurement error is a large source of uncertainty, substantial differences between the distribution representing variability of the observed data set and the distribution representing variability of the error-free data set will occur. Furthermore, the shape and range of the probability bands for uncertainty differ between the observed and error-free data set. Failure to separately characterize contributions from random sampling error and measurement error will lead to bias in the variability and uncertainty estimates. However, a key finding is that total uncertainty in the mean can be properly quantified even if measurement and random sampling errors cannot be separated. An empirical case study is used to illustrate the application of the methodology.

7.
A Monte Carlo procedure for the construction of complementary cumulative distribution functions (CCDFs) for comparison with the U.S. Environmental Protection Agency (EPA) release limits for radioactive waste disposal (40 CFR 191, Subpart B) is described and illustrated with results from a recent performance assessment (PA) for the Waste Isolation Pilot Plant (WIPP). The Monte Carlo procedure produces CCDF estimates similar to those obtained with importance sampling in several recent PAs for the WIPP. The advantages of the Monte Carlo procedure over importance sampling include increased resolution in the calculation of probabilities for complex scenarios involving drilling intrusions and better use of the necessarily limited number of mechanistic calculations that underlie CCDF construction.

8.
Quantitative risk assessment (QRA) models are used to estimate the risks of transporting dangerous goods and to assess the merits of introducing alternative risk reduction measures for different transportation scenarios and assumptions. A comprehensive QRA model recently was developed in Europe for application to road tunnels. This model can assess the merits of a limited number of "native safety measures." In this article, we introduce a procedure for extending its scope to include the treatment of a number of important "nonnative safety measures" of interest to tunnel operators and decisionmakers. Nonnative safety measures were not included in the original model specification. The suggested procedure makes use of expert judgment and Monte Carlo simulation methods to model uncertainty in the revised risk estimates. The results of a case study application are presented that involve the risks of transporting a given volume of flammable liquid through a 10-km road tunnel.

9.
A Monte Carlo simulation is incorporated into a risk assessment for trichloroethylene (TCE) using physiologically-based pharmacokinetic (PBPK) modeling coupled with the linearized multistage model to derive human carcinogenic risk extrapolations. The Monte Carlo technique incorporates physiological parameter variability to produce a statistically derived range of risk estimates that quantifies specific uncertainties associated with PBPK risk assessment approaches. Both inhalation and ingestion exposure routes are addressed. Simulated exposure scenarios were consistent with those used by the Environmental Protection Agency (EPA) in their TCE risk assessment. Mean values of physiological parameters were gathered from the literature for both mice (carcinogenic bioassay subjects) and for humans. Realistic physiological value distributions were assumed using existing data on variability. Mouse cancer bioassay data were correlated to total TCE metabolized and area-under-the-curve (blood concentration) trichloroacetic acid (TCA) as determined by a mouse PBPK model. These internal dose metrics were used in a linearized multistage model analysis to determine dose metric values corresponding to 10^-6 lifetime excess cancer risk. Using a human PBPK model, these metabolized doses were then extrapolated to equivalent human exposures (inhalation and ingestion). The Monte Carlo iterations with varying mouse and human physiological parameters produced a range of human exposure concentrations producing a 10^-6 risk.

10.
Traditionally, microbial risk assessors have used point estimates to evaluate the probability that an individual will become infected. We developed a quantitative approach that shifts the risk characterization perspective from point estimate to distributional estimate, and from individual to population. To this end, we first designed and implemented a dynamic model that tracks traditional epidemiological variables such as the number of susceptible, infected, diseased, and immune individuals, and environmental variables such as pathogen density. Second, we used a simulation methodology that explicitly acknowledges the uncertainty and variability associated with the data. Specifically, the approach consists of assigning probability distributions to each parameter, sampling from these distributions for Monte Carlo simulations, and using a binary classification to assess the output of each simulation. A case study is presented that explores the uncertainties in assessing the risk of giardiasis when swimming in a recreational impoundment using reclaimed water. Using literature-based information to assign parameter ranges, our analysis demonstrated that the parameter describing the shedding of pathogens by infected swimmers was the factor that contributed most to the uncertainty in risk. The importance of other parameters was dependent on reducing the a priori range of this shedding parameter. By constraining the shedding parameter to its lower subrange, treatment efficiency was the parameter most important in predicting whether a simulation resulted in prevalences above or below nonoutbreak levels, whereas parameters associated with human exposure were important when the shedding parameter was constrained to a higher subrange. This Monte Carlo simulation technique identified conditions in which outbreaks and/or nonoutbreaks are likely and identified the parameters that most contributed to the uncertainty associated with a risk prediction.

11.
Risks from exposure to contaminated land are often assessed with the aid of mathematical models. The current probabilistic approach is a considerable improvement on previous deterministic risk assessment practices, in that it attempts to characterize uncertainty and variability. However, some inputs continue to be assigned as precise numbers, while others are characterized as precise probability distributions. Such precision is hard to justify, and we show in this article how rounding errors and distribution assumptions can affect an exposure assessment. The outcomes of traditional deterministic point estimates and Monte Carlo simulations were compared to probability bounds analyses. Assigning all scalars as imprecise numbers (intervals prescribed by significant digits) added uncertainty to the deterministic point estimate of about one order of magnitude. Similarly, representing probability distributions as probability boxes added several orders of magnitude to the uncertainty of the probabilistic estimate. This indicates that the size of the uncertainty in such assessments is actually much greater than currently reported. The article suggests that full disclosure of the uncertainty may facilitate decision making in opening up a negotiation window. In the risk analysis process, it is also an ethical obligation to clarify the boundary between the scientific and social domains.

12.
A method is proposed for integrated probabilistic risk assessment where exposure assessment and hazard characterization are both included in a probabilistic way. The aim is to specify the probability that a random individual from a defined (sub)population will have an exposure high enough to cause a particular health effect of a predefined magnitude, the critical effect size (CES). The exposure level that results in exactly that CES in a particular person is that person's individual critical effect dose (ICED). Individuals in a population typically show variation, both in their individual exposure (IEXP) and in their ICED. Both the variation in IEXP and the variation in ICED are quantified in the form of probability distributions. Assuming independence between both distributions, they are combined (by Monte Carlo) into a distribution of the individual margin of exposure (IMoE). The proportion of the IMoE distribution below unity is the probability of critical exposure (PoCE) in the particular (sub)population. Uncertainties involved in the overall risk assessment (i.e., both regarding exposure and effect assessment) are quantified using Monte Carlo and bootstrap methods. This results in an uncertainty distribution for any statistic of interest, such as the probability of critical exposure (PoCE). The method is illustrated based on data for the case of dietary exposure to the organophosphate acephate. We present plots that concisely summarize the probabilistic results, retaining the distinction between variability and uncertainty. We show how the relative contributions from the various sources of uncertainty involved may be quantified.
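The variability layer of this calculation is compact enough to sketch. Below, IEXP and ICED are drawn from illustrative lognormal distributions (the parameters are assumptions, not the acephate data), combined into IMoE, and reduced to PoCE; the paper's uncertainty layer would wrap this in an outer bootstrap/Monte Carlo loop.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000  # simulated individuals (variability dimension)

# Illustrative variability distributions (parameters are assumptions):
iexp = rng.lognormal(mean=np.log(0.01), sigma=1.0, size=n)  # IEXP, dose units
iced = rng.lognormal(mean=np.log(0.5), sigma=0.5, size=n)   # ICED, same units

# Individual margin of exposure; critical exposure when IMoE < 1.
imoe = iced / iexp
poce = (imoe < 1.0).mean()  # probability of critical exposure
```

Repeating this with resampled (bootstrapped) exposure and dose-response inputs yields the uncertainty distribution of PoCE that the abstract describes.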

13.
A general discussion of knowledge dependence in risk calculations shows that the assumption of independence underlying standard Monte Carlo simulation in uncertainty analysis is frequently violated. A model is presented for performing Monte Carlo simulation when the variabilities of the component failure probabilities are either negatively or positively coupled. The model is applied to examples in human reliability analysis and the results are compared to the results of Sandia Laboratories as published in the Peer Review Study and to recalculations using more recent methods of uncertainty analysis.
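One simple way to realize such coupling (a generic sketch, not the paper's specific model) is to correlate the log-values of the component failure probabilities and observe how the sign of the coupling shifts the joint failure probability of a pair of components.

```python
import numpy as np

def coupled_failure_samples(mu, sigma, rho, n, rng):
    """Sample two lognormal component failure probabilities whose
    log-values are correlated with coefficient rho (negative or
    positive coupling)."""
    cov = sigma ** 2 * np.array([[1.0, rho], [rho, 1.0]])
    z = rng.multivariate_normal([mu, mu], cov, size=n)
    return np.exp(z)

rng = np.random.default_rng(7)
mu, sigma = np.log(1e-3), 0.7  # illustrative parameters (assumptions)

means = []
for rho in (-0.8, 0.0, 0.8):
    p = coupled_failure_samples(mu, sigma, rho, 100_000, rng)
    means.append(p.prod(axis=1).mean())  # joint failure of an AND pair
# positive coupling inflates the mean joint failure probability,
# which is exactly what the independence assumption misses
```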

14.
Indirect exposures to 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD) and other toxic materials released in incinerator emissions have been identified as a significant concern for human health. As a result, regulatory agencies and researchers have developed specific approaches for evaluating exposures from indirect pathways. This paper presents a quantitative assessment of the effect of uncertainty and variation in exposure parameters on the resulting estimates of TCDD dose rates received by individuals indirectly exposed to incinerator emissions through the consumption of home-grown beef. The assessment uses a nested Monte Carlo model that separately characterizes uncertainty and variation in dose rate estimates. Uncertainty resulting from limited data on the fate and transport of TCDD are evaluated, and variations in estimated dose rates in the exposed population that result from location-specific parameters and individuals' behaviors are characterized. The analysis indicates that lifetime average daily dose rates for individuals living within 10 km of a hypothetical incinerator range over three orders of magnitude. In contrast, the uncertainty in the dose rate distribution appears to vary by less than one order of magnitude, based on the sources of uncertainty included in this analysis. Current guidance for predicting exposures from indirect exposure pathways was found to overestimate the intakes for typical and high-end individuals.

15.
A quantitative assessment of the exposure to Listeria monocytogenes from cold-smoked salmon (CSS) consumption in France is developed. The general framework is a second-order (or two-dimensional) Monte Carlo simulation, which characterizes the uncertainty and variability of the exposure estimate. The model takes into account the competitive bacterial growth between L. monocytogenes and the background competitive flora from the end of the production line to the consumer phase. An original algorithm is proposed to integrate this growth in conditions of varying temperature. As part of a more general project led by the French Food Safety Agency (Afssa), specific data were acquired and modeled for this quantitative exposure assessment model, particularly time-temperature profiles, prevalence data, and contamination-level data. The sensitivity analysis points out the main influence of the mean temperature in household refrigerators and the prevalence of contaminated CSS on the exposure level. The outputs of this model can be used as inputs for further risk assessment.
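The second-order (two-dimensional) Monte Carlo structure used here is worth making concrete. The sketch below is a minimal generic version, not the Afssa model: an outer loop samples an uncertain parameter (the mean log10 concentration at consumption), an inner loop samples pack-to-pack variability given that parameter, and the result is an uncertainty band around a variability statistic. All numbers are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)
n_unc, n_var = 500, 2000  # outer (uncertainty) and inner (variability) sizes

# Outer loop: epistemic uncertainty about the mean log10 concentration
# at consumption (illustrative parameters).
mu_draws = rng.normal(loc=-1.0, scale=0.3, size=n_unc)

exceedance = np.empty(n_unc)
for i, mu in enumerate(mu_draws):
    # Inner loop: pack-to-pack variability given the uncertain mean.
    log10_conc = rng.normal(loc=mu, scale=0.8, size=n_var)
    exceedance[i] = (log10_conc > 0.5).mean()  # fraction above ~3 CFU/g

# Uncertainty band around the variability statistic:
lo, med, hi = np.percentile(exceedance, [2.5, 50.0, 97.5])
```

Keeping the two loops separate is what lets the output report "the exceedance fraction is X, with 95% uncertainty bounds (lo, hi)" rather than a single blended distribution.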

16.
This article presents a general model for estimating population heterogeneity and "lack of knowledge" uncertainty in methylmercury (MeHg) exposure assessments using two-dimensional Monte Carlo analysis. Using data from fish-consuming populations in Bangladesh, Brazil, Sweden, and the United Kingdom, predictive model estimates of dietary MeHg exposures were compared against those derived from biomarkers (i.e., [Hg]hair and [Hg]blood). By disaggregating parameter uncertainty into components (i.e., population heterogeneity, measurement error, recall error, and sampling error), estimates were obtained of the contribution of each component to the overall uncertainty. Steady-state diet:hair and diet:blood MeHg exposure ratios were estimated for each population and were used to develop distributions useful for conducting biomarker-based probabilistic assessments of MeHg exposure. The 5th and 95th percentile modeled MeHg exposure estimates around mean population exposure from each of the four study populations are presented to demonstrate lack of knowledge uncertainty about a best estimate for a true mean. Results from a U.K. study population showed that a predictive dietary model resulted in a 74% lower lack of knowledge uncertainty around a central mean estimate relative to a hair biomarker model, and also in a 31% lower lack of knowledge uncertainty around the central mean estimate relative to a blood biomarker model. Similar results were obtained for the Brazil and Bangladesh populations. Such analyses, used here to evaluate alternative models of dietary MeHg exposure, can be used to refine exposure instruments, improve information used in site management and remediation decision making, and identify sources of uncertainty in risk estimates.

17.
A novel approach to the quantitative assessment of food-borne risks is proposed. The basic idea is to use Bayesian techniques in two distinct steps: first by constructing a stochastic core model via a Bayesian network based on expert knowledge, and second, using the data available to improve this knowledge. Unlike the Monte Carlo simulation approach as commonly used in quantitative assessment of food-borne risks, where data sets are used independently in each module, our consistent procedure incorporates information conveyed by data throughout the chain. It allows "back-calculation" in the food chain model, together with the use of data obtained "downstream" in the food chain. Moreover, the expert knowledge is introduced more simply and consistently than with classical statistical methods. Other advantages of this approach include the clear framework of an iterative learning process, considerable flexibility enabling the use of heterogeneous data, and a justified method to explore the effects of variability and uncertainty. As an illustration, we present an estimation of the probability of contracting campylobacteriosis as a result of broiler contamination, from the standpoint of quantitative risk assessment. Although the model thus constructed is oversimplified, it clarifies the principles and properties of the method proposed, which demonstrates its ability to deal with quite complex situations and provides a useful basis for further discussions with different experts in the food chain.

18.
Methods for Uncertainty Analysis: A Comparative Survey
This paper presents a survey and comparative evaluation of methods which have been developed for the determination of uncertainties in accident consequences and probabilities, for use in probabilistic risk assessment. The methods considered are: analytic techniques, Monte Carlo simulation, response surface approaches, differential sensitivity techniques, and evaluation of classical statistical confidence bounds. It is concluded that only the response surface and differential sensitivity approaches are sufficiently general and flexible for use as overall methods of uncertainty analysis in probabilistic risk assessment. The other methods considered, however, are very useful in particular problems.

19.
Risk Analysis, 2018, 38(8): 1576-1584
Fault trees are used in reliability modeling to create logical models of fault combinations that can lead to undesirable events. The output of a fault tree analysis (the top event probability) is expressed in terms of the failure probabilities of basic events that are input to the model. Typically, the basic event probabilities are not known exactly, but are modeled as probability distributions; therefore, the top event probability is also represented as an uncertainty distribution. Monte Carlo methods are generally used for evaluating the uncertainty distribution, but such calculations are computationally intensive and do not readily reveal the dominant contributors to the uncertainty. In this article, a closed‐form approximation for the fault tree top event uncertainty distribution is developed, which is applicable when the uncertainties in the basic events of the model are lognormally distributed. The results of the approximate method are compared with results from two sampling‐based methods: namely, the Monte Carlo method and the Wilks method based on order statistics. It is shown that the closed‐form expression can provide a reasonable approximation to results obtained by Monte Carlo sampling, without incurring the computational expense. The Wilks method is found to be a useful means of providing an upper bound for the percentiles of the uncertainty distribution while being computationally inexpensive compared with full Monte Carlo sampling. The lognormal approximation method and Wilks's method appear attractive, practical alternatives for the evaluation of uncertainty in the output of fault trees and similar multilinear models.
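The building block of such a closed-form treatment is easy to demonstrate: for a single AND gate the top event is the product of the basic events, and a product of lognormals is exactly lognormal, so percentiles are available without sampling (the article's contribution is extending this idea to full trees, where it becomes an approximation). The medians and error factors below are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

# Two basic events with lognormal uncertainty, given as a median and
# an error factor EF = 95th percentile / median (illustrative values).
medians = np.array([1e-3, 5e-4])
efs = np.array([3.0, 10.0])
sigmas = np.log(efs) / norm.ppf(0.95)  # lognormal sigma from EF
mus = np.log(medians)

# AND gate: top = p1 * p2, so log(top) is Normal with summed parameters.
mu_t = mus.sum()
sigma_t = np.sqrt((sigmas ** 2).sum())
p95_closed = np.exp(mu_t + norm.ppf(0.95) * sigma_t)

# Monte Carlo check of the closed-form 95th percentile
rng = np.random.default_rng(0)
top = np.exp(rng.normal(mus, sigmas, size=(200_000, 2)).sum(axis=1))
p95_mc = np.percentile(top, 95.0)
```

The closed-form percentile needs no sampling at all, which is the computational advantage the abstract claims over full Monte Carlo.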

20.
This paper demonstrates a new methodology for probabilistic public health risk assessment using the first-order reliability method. The method provides the probability that incremental lifetime cancer risk exceeds a threshold level, and the probabilistic sensitivity quantifying the relative impact of considering the uncertainty of each random variable on the exceedance probability. The approach is applied to a case study given by Thompson et al. (1) on cancer risk caused by ingestion of benzene-contaminated soil, and the results are compared to that of the Monte Carlo method. Parametric sensitivity analyses are conducted to assess the sensitivity of the probabilistic event with respect to the distribution parameters of the basic random variables, such as the mean and standard deviation. The technique is a novel approach to probabilistic risk assessment, and can be used in situations when Monte Carlo analysis is computationally expensive, such as when the simulated risk is at the tail of the risk probability distribution.
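The first-order reliability method (FORM) is simplest to see in the linear-normal case, where it is exact: the reliability index beta gives the exceedance probability as Phi(-beta), and the direction cosines give the probabilistic sensitivities. The sketch below is a generic linear example with illustrative numbers, not the benzene case study.

```python
import numpy as np
from scipy.stats import norm

# Limit state g(x) = threshold - a.x with independent normal inputs;
# "failure" (risk exceedance) occurs when g < 0. Values are illustrative.
means = np.array([2.0, 1.0])
stds = np.array([0.5, 0.3])
a = np.array([1.0, 1.0])
threshold = 4.0

# FORM: reliability index and exceedance probability (exact for a
# linear limit state with normal inputs, an approximation otherwise).
std_g = np.sqrt(((a * stds) ** 2).sum())
beta = (threshold - a @ means) / std_g
p_exceed = norm.cdf(-beta)

# Direction cosines at the design point: each variable's share of the
# uncertainty driving the exceedance probability (they sum in squares to 1).
alpha = (a * stds) / std_g
```

Because FORM works from the limit-state geometry rather than from samples, it remains cheap even when the exceedance probability sits far in the tail, which is the regime where plain Monte Carlo becomes expensive.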
