Similar Literature
A total of 20 similar documents were found.
1.
Jan F. Van Impe, Risk Analysis, 2011, 31(8): 1295-1307
The aim of quantitative microbiological risk assessment is to estimate the risk of illness caused by the presence of a pathogen in a food type, and to study the impact of interventions. Because of inherent variability and uncertainty, risk assessments are generally conducted stochastically, and if possible it is advised to characterize variability separately from uncertainty. Sensitivity analysis makes it possible to identify the input variables to which the outcome of a quantitative microbiological risk assessment is most sensitive. Although a number of methods exist to apply sensitivity analysis to a risk assessment with probabilistic input variables (such as contamination, storage temperature, storage duration, etc.), it is challenging to perform sensitivity analysis when a risk assessment includes a separate characterization of variability and uncertainty of input variables. A procedure is proposed that focuses on the relation between risk estimates obtained by Monte Carlo simulation and the location of pseudo-randomly sampled input variables within the uncertainty and variability distributions. Within this procedure, two methods, an ANOVA-like model and Sobol sensitivity indices, are used to obtain and compare the impact of variability and of uncertainty of all input variables, and of model uncertainty and scenario uncertainty. As a case study, this methodology is applied to a risk assessment to estimate the risk of contracting listeriosis due to consumption of deli meats.
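A minimal sketch of the kind of two-dimensional Monte Carlo simulation described above, with uncertainty in the outer loop and variability in the inner loop, followed by an ANOVA-like decomposition of the risk variance. All distributions, parameter values, and the exponential dose-response form are hypothetical illustrations, not the article's model.

```python
# Two-dimensional Monte Carlo: uncertainty (outer loop) kept separate from
# variability (inner loop). Hypothetical contamination/growth/dose-response values.
import numpy as np

rng = np.random.default_rng(0)
n_unc, n_var = 200, 1000          # outer (uncertainty) and inner (variability) samples

risk = np.empty((n_unc, n_var))
for i in range(n_unc):
    # Uncertain parameters: mean log-contamination and dose-response slope
    mu_conc = rng.normal(loc=1.0, scale=0.3)
    r_slope = rng.lognormal(mean=np.log(1e-6), sigma=0.5)
    # Variable quantities across servings: contamination and storage growth
    log_conc = rng.normal(loc=mu_conc, scale=0.8, size=n_var)
    growth = rng.normal(loc=1.5, scale=0.5, size=n_var)
    dose = 10 ** (log_conc + growth)
    risk[i, :] = 1.0 - np.exp(-r_slope * dose)    # exponential dose-response

# ANOVA-like decomposition: variance of row means reflects uncertainty,
# mean of within-row variances reflects variability.
var_between = risk.mean(axis=1).var()
var_within = risk.var(axis=1).mean()
total = var_between + var_within
print(f"share from uncertainty : {var_between / total:.2f}")
print(f"share from variability : {var_within / total:.2f}")
```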

2.
Currently, there is a trend away from the use of single (often conservative) estimates of risk to summarize the results of risk analyses in favor of stochastic methods which provide a more complete characterization of risk. The use of such stochastic methods leads to a distribution of possible values of risk, taking into account both uncertainty and variability in all of the factors affecting risk. In this article, we propose a general framework for the analysis of uncertainty and variability for use in the commonly encountered case of multiplicative risk models, in which risk may be expressed as a product of two or more risk factors. Our analytical methods facilitate the evaluation of overall uncertainty and variability in risk assessment, as well as the contributions of individual risk factors to both uncertainty and variability, which is cumbersome to evaluate using Monte Carlo methods. The use of these methods is illustrated in the analysis of potential cancer risks due to the ingestion of radon in drinking water.
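A minimal sketch of analytical propagation for a multiplicative risk model with independent lognormal factors: because log R is a sum of normals, the overall geometric mean and geometric standard deviation follow directly, as does each factor's share of the variance. The factor names and values below are hypothetical, not the radon analysis of the article.

```python
# Analytical propagation for R = X1 * X2 * ... * Xn with lognormal factors.
import numpy as np

factors = {                      # (geometric mean, geometric standard deviation)
    "radon_concentration": (0.8, 2.5),
    "water_intake":        (1.0, 1.6),
    "risk_per_unit_dose":  (2e-4, 3.0),
}

mu = sum(np.log(gm) for gm, gsd in factors.values())             # mean of log R
sigma2 = sum(np.log(gsd) ** 2 for gm, gsd in factors.values())   # variance of log R

gm_R, gsd_R = np.exp(mu), np.exp(np.sqrt(sigma2))
print(f"geometric mean of risk: {gm_R:.2e}, geometric SD: {gsd_R:.2f}")

# Contribution of each factor to the total variance of log R
for name, (gm, gsd) in factors.items():
    print(f"{name:22s} contributes {np.log(gsd)**2 / sigma2:.0%} of the log-variance")
```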

3.
This article presents a process for an integrated policy analysis that combines risk assessment and benefit-cost analysis. This concept, which explicitly combines the two types of related analyses, seems to contradict the long-accepted risk analysis paradigm of separating risk assessment and risk management, since benefit-cost analysis is generally considered to be a part of risk management. Yet that separation has become a problem because benefit-cost analysis uses risk assessment results as a starting point, and considerable debate over the last several years has focused on the incompatibility of the use of upper bounds or "safe" point estimates in many risk assessments with benefit-cost analysis. The problem with these risk assessments is that they ignore probabilistic information. As advanced probabilistic techniques for risk assessment emerge and economic analysts receive distributions of risks instead of point estimates, the artificial separation between risk analysts and the economic/decision analysts complicates the overall analysis. In addition, recent developments in countervailing risk theory suggest that combining the risk and benefit-cost analyses is required to fully understand the complexity of choices and tradeoffs faced by the decisionmaker. This article also argues that the separation of analysis and management is important, but that benefit-cost analysis has been wrongly classified into the risk management category and that the analytical effort associated with understanding the economic impacts of risk reduction actions needs to be part of a broader risk assessment process.

4.
Probabilistic seismic risk analysis is a well-established method in the insurance industry for modeling portfolio losses from earthquake events. In this context, precise exposure locations are often unknown. However, location uncertainty has so far received relatively little research attention. In this article, we propose a novel framework for the treatment of location uncertainty. As a case study, a large number of synthetic portfolios resembling typical real-world cases were created. We investigate the effect of portfolio characteristics such as value distribution, portfolio size, or proportion of risk items with unknown coordinates on the variability of loss frequency estimations. The results indicate that due to loss aggregation effects and spatial hazard variability, location uncertainty in isolation and in conjunction with ground motion uncertainty can induce significant variability in probabilistic loss results, especially for portfolios with a small number of risks. After quantifying its effect, we conclude that location uncertainty should not be neglected when assessing probabilistic seismic risk, but should be treated stochastically, and the resulting variability should be visualized and interpreted carefully.
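A minimal sketch of why unknown exposure locations matter when hazard varies in space: for risks with unknown coordinates, the location is resampled within the region on each iteration and the aggregate loss estimate picks up additional variability. The hazard field, portfolio values, and region below are hypothetical, not the authors' model.

```python
# Location uncertainty in a small synthetic portfolio with a spatially varying hazard.
import numpy as np

rng = np.random.default_rng(1)

def hazard(x, y):
    """Hypothetical mean damage ratio that decays with distance from the origin."""
    return 0.10 * np.exp(-np.sqrt(x**2 + y**2) / 20.0)

n_risks = 50
values = rng.lognormal(mean=0.0, sigma=1.0, size=n_risks)   # small portfolio
known = rng.random(n_risks) < 0.5                           # half the coordinates are known
x_known = rng.uniform(-50, 50, n_risks)
y_known = rng.uniform(-50, 50, n_risks)

losses = []
for _ in range(2000):
    # For risks with unknown coordinates, resample a location within the region
    x = np.where(known, x_known, rng.uniform(-50, 50, n_risks))
    y = np.where(known, y_known, rng.uniform(-50, 50, n_risks))
    losses.append(np.sum(values * hazard(x, y)))

losses = np.asarray(losses)
print(f"mean portfolio loss: {losses.mean():.2f}, "
      f"CV from location uncertainty: {losses.std() / losses.mean():.2%}")
```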

5.
Human variability is a very important factor considered in human health risk assessment for protecting sensitive populations from chemical exposure. Traditionally, to account for this variability, an interhuman uncertainty factor is applied to lower the exposure limit. However, a fixed uncertainty factor, rather than a probabilistic account of human variability, can hardly support the probabilistic risk assessment advocated by a number of researchers; new methods are needed to probabilistically quantify human population variability. We propose a Bayesian hierarchical model to quantify variability among different populations. This approach jointly characterizes the distribution of risk at background exposure and the sensitivity of response to exposure, which are commonly represented by model parameters. We demonstrate, through both an application to real data and a simulation study, that the proposed hierarchical structure adequately characterizes variability across different populations.
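A minimal sketch of the hierarchical structure such a model encodes, shown here as a forward simulation rather than the Bayesian inference performed in the article: population-level hyperparameters generate subpopulation-specific background risk and dose sensitivity, which in turn generate individual responses. All parameter values are hypothetical.

```python
# Forward simulation of a hierarchical (population -> subpopulation -> individual) structure.
import numpy as np

rng = np.random.default_rng(2)

# Hyperparameters (describe how subpopulations vary)
mu_bg, tau_bg = -4.0, 0.5        # background log-odds of response
mu_slope, tau_slope = 0.8, 0.3   # log of the dose-response slope

n_pops, n_per_pop = 5, 200
dose = rng.uniform(0, 10, size=(n_pops, n_per_pop))

# Subpopulation-level parameters (variability among populations)
bg = rng.normal(mu_bg, tau_bg, size=(n_pops, 1))
slope = np.exp(rng.normal(mu_slope, tau_slope, size=(n_pops, 1)))

# Individual-level responses (variability within populations)
logit = bg + slope * np.log1p(dose)
p = 1.0 / (1.0 + np.exp(-logit))
y = rng.random((n_pops, n_per_pop)) < p

for k in range(n_pops):
    print(f"subpopulation {k}: observed response rate {y[k].mean():.2f}")
```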

6.
Risks from exposure to contaminated land are often assessed with the aid of mathematical models. The current probabilistic approach is a considerable improvement on previous deterministic risk assessment practices, in that it attempts to characterize uncertainty and variability. However, some inputs continue to be assigned as precise numbers, while others are characterized as precise probability distributions. Such precision is hard to justify, and we show in this article how rounding errors and distribution assumptions can affect an exposure assessment. The outcomes of traditional deterministic point estimates and Monte Carlo simulations were compared to probability bounds analyses. Assigning all scalars as imprecise numbers (intervals prescribed by significant digits) added uncertainty to the deterministic point estimate of about one order of magnitude. Similarly, representing probability distributions as probability boxes added several orders of magnitude to the uncertainty of the probabilistic estimate. This indicates that the size of the uncertainty in such assessments is actually much greater than currently reported. The article suggests that full disclosure of the uncertainty may facilitate decision making by opening up a negotiation window. In the risk analysis process, it is also an ethical obligation to clarify the boundary between the scientific and social domains.
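A minimal sketch of the "intervals prescribed by significant digits" idea: each reported input is replaced by the interval of values that would round to it, and the interval is propagated through a simple exposure quotient. The numbers below are hypothetical illustrations.

```python
# Interval propagation from rounding in exposure = concentration * intake / body_weight.
conc = (0.25, 0.35)        # reported "0.3 mg/kg" -> interval implied by rounding
intake = (0.045, 0.055)    # reported "0.05 kg/day"
bw = (65.0, 75.0)          # reported "70 kg"

point = 0.3 * 0.05 / 70.0
low = conc[0] * intake[0] / bw[1]      # smallest numerator over largest denominator
high = conc[1] * intake[1] / bw[0]     # largest numerator over smallest denominator

print(f"point estimate : {point:.2e} mg/(kg*day)")
print(f"interval bound : [{low:.2e}, {high:.2e}]  (spans a factor of {high / low:.1f})")
```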

7.
A method is proposed for integrated probabilistic risk assessment where exposure assessment and hazard characterization are both included in a probabilistic way. The aim is to specify the probability that a random individual from a defined (sub)population will have an exposure high enough to cause a particular health effect of a predefined magnitude, the critical effect size (CES). The exposure level that results in exactly that CES in a particular person is that person's individual critical effect dose (ICED). Individuals in a population typically show variation, both in their individual exposure (IEXP) and in their ICED. Both the variation in IEXP and the variation in ICED are quantified in the form of probability distributions. Assuming independence between both distributions, they are combined (by Monte Carlo) into a distribution of the individual margin of exposure (IMoE). The proportion of the IMoE distribution below unity is the probability of critical exposure (PoCE) in the particular (sub)population. Uncertainties involved in the overall risk assessment (i.e., both regarding exposure and effect assessment) are quantified using Monte Carlo and bootstrap methods. This results in an uncertainty distribution for any statistic of interest, such as the probability of critical exposure (PoCE). The method is illustrated based on data for the case of dietary exposure to the organophosphate acephate. We present plots that concisely summarize the probabilistic results, retaining the distinction between variability and uncertainty. We show how the relative contributions from the various sources of uncertainty involved may be quantified.
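A minimal sketch of the core calculation, with IMoE = ICED / IEXP and PoCE = P(IMoE < 1). The distributions below are hypothetical placeholders, not the acephate data, and the bootstrap layer that quantifies uncertainty around PoCE is omitted.

```python
# Individual margin of exposure (IMoE) and probability of critical exposure (PoCE).
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

iexp = rng.lognormal(mean=np.log(0.5), sigma=1.0, size=n)   # individual exposure (variability)
iced = rng.lognormal(mean=np.log(50.0), sigma=0.6, size=n)  # individual critical effect dose

imoe = iced / iexp
poce = np.mean(imoe < 1.0)
print(f"PoCE (probability of critical exposure): {poce:.2e}")
```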

8.
The crashes of four hijacked commercial planes on September 11, 2001, and the repeated televised images of the consequent collapse of the World Trade Center and one side of the Pentagon will inevitably change people's perceptions of the mortality risks to people on the ground from crashing airplanes. Goldstein and colleagues were the first to quantify the risk for Americans of being killed on the ground by a crashing airplane in unintentional events, providing average point estimates of 6 in a hundred million for annual risk and 4.2 in a million for lifetime risk. They noted that the lifetime risk result exceeded the commonly used risk management threshold of 1 in a million, and suggested that the risk to "groundlings" could be a useful risk communication tool because (a) it is a man-made risk, (b) arising from economic activities, (c) from which the victims derive no benefit, and (d) exposure to which the victims cannot control. Their results have been used in risk communication. This analysis provides updated estimates of groundling fatality risks from unintentional crashes using more recent data and a geographical information system approach to modeling the population around airports. The results suggest that the average annual risk is now 1.2 in a hundred million and the lifetime risk is now 9 in ten million (below the risk management threshold). Analysis of the variability and uncertainty of this estimate, however, suggests that exposure to groundling fatality risk varies by a factor of approximately 100 with distance to an airport, with the risk declining rapidly outside the first 2 miles around an airport. We believe that the risk to groundlings from crashing airplanes is more useful in the context of risk communication when information about variability and uncertainty in the risk estimates is characterized, but we suspect that recent events will alter its utility in risk communication.
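A quick arithmetic check, under the assumption (not stated in the abstract) that the lifetime figures are roughly the annual figures scaled by an approximately 70-75 year lifetime:

```python
# Ratio of the quoted lifetime to annual groundling risks.
annual_old, lifetime_old = 6e-8, 4.2e-6
annual_new, lifetime_new = 1.2e-8, 9e-7

print(lifetime_old / annual_old)   # 70.0 -> consistent with a ~70-year lifetime
print(lifetime_new / annual_new)   # 75.0 -> consistent with a ~75-year lifetime
```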

9.
A call for risk assessment approaches that better characterize and quantify uncertainty has been made by the scientific and regulatory community. This paper responds to that call by demonstrating a distributional approach that draws upon human data to derive potency estimates and to identify and quantify important sources of uncertainty. The approach is rooted in the science of decision analysis and employs an influence diagram, a decision tree, probabilistic weights, and a distribution of point estimates of carcinogenic potency. Its results estimate the likelihood of different carcinogenic risks (potencies) for a chemical under a specific scenario. For this exercise, human data on formaldehyde were employed to demonstrate the approach. Sensitivity analyses were performed to determine the relative impact of specific levels and alternatives on the potency distribution. The resulting potency estimates are compared with the results of an exercise using animal data on formaldehyde. The paper demonstrates that distributional risk assessment is readily adapted to situations in which epidemiologic data serve as the basis for potency estimates. Strengths and weaknesses of the distributional approach are discussed. Areas for further application and research are recommended.
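A minimal sketch of a decision-tree style potency distribution: each path through the analytic choices yields a point potency estimate, and the product of the branch weights gives that estimate's probability. The branch names, weights, and potency values below are hypothetical, not the formaldehyde analysis.

```python
# Probability-weighted potency distribution from a small decision tree of analytic choices.
from itertools import product

exposure_model = {"job-exposure matrix": 0.6, "area monitoring": 0.4}   # branch: weight
dose_metric = {"cumulative": 0.5, "peak": 0.5}
potency = {  # hypothetical potency (per ppm) for each combination of choices
    ("job-exposure matrix", "cumulative"): 1.0e-3,
    ("job-exposure matrix", "peak"):       4.0e-4,
    ("area monitoring", "cumulative"):     2.5e-3,
    ("area monitoring", "peak"):           8.0e-4,
}

distribution = []
for (em, we), (dm, wd) in product(exposure_model.items(), dose_metric.items()):
    distribution.append((potency[(em, dm)], we * wd))

for value, weight in sorted(distribution):
    print(f"potency {value:.1e} per ppm with probability {weight:.2f}")
```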

10.
Recent work in the assessment of risk in maritime transportation systems has used simulation-based probabilistic risk assessment techniques. In the Prince William Sound and Washington State Ferries risk assessments, the studies' recommendations were backed up by estimates of their impact made using such techniques, and all recommendations were implemented. However, the level of uncertainty about these estimates was not available, leaving the decisionmakers unsure whether the evidence was sufficient to assess specific risks and benefits. The first step toward assessing the impact of uncertainty in maritime risk assessments is to model the uncertainty in the simulation models used. In this article, a study of the impact of proposed ferry service expansions in San Francisco Bay is used as a case study to demonstrate the use of Bayesian simulation techniques to propagate uncertainty throughout the analysis. The conclusions drawn in the original study are shown, in this case, to be robust to the inherent uncertainties. The main intellectual merit of this work is the development of Bayesian simulation technique to model uncertainty in the assessment of maritime risk. However, Bayesian simulations have been implemented only as theoretical demonstrations. Their use in a large, complex system may be considered state of the art in the field of computational sciences.

11.
Reliability and higher levels of safety are thought to be achieved by using systematic approaches to managing risks. The assessment of risks has produced a range of different approaches for addressing the associated uncertainties, presenting models for how risks affect individuals or organizations. Contemporary risk assessment tools based on this approach have proven difficult for practitioners to use as tools for tactical and operational decision making. This article presents an alternative to these assessments by utilizing a resilience perspective, arguing that complex systems are inclined to variety and uncertainty regarding the results they produce and are therefore prone to systemic failures. A continuous improvement approach is a source of reliability when managing complex systems and is necessary to manage such variety and uncertainty. For an organization to understand how risk events occur, it is necessary to define what is believed to be the equilibrium of the system in time and space. By applying a resilience engineering (RE) perspective to risk assessment, it is possible to manage this complexity by assessing the ability to respond, monitor, learn, and anticipate risks, and in so doing to move away from the flawed frequency-and-consequences approach. Using a research station network in the Arctic as an example illustrates how an RE approach qualifies assessments by bridging risk assessments with value-creation processes. The article concludes by arguing that a resilience-based risk assessment can improve on current practice, including for organizations located outside the Arctic region.

12.
At the request of the U.S. Environmental Protection Agency (EPA), the National Research Council (NRC) recently completed a major report, Science and Decisions: Advancing Risk Assessment, that is intended to strengthen the scientific basis, credibility, and effectiveness of risk assessment practices and subsequent risk management decisions. The report describes the challenges faced by risk assessment and the need to consider improvements in both the technical analyses of risk assessments (i.e., the development and use of scientific information to improve risk characterization) and the utility of risk assessments (i.e., making assessments more relevant and useful for risk management decisions). The report tackles a number of topics relating to improvements in the process, including the design and framing of risk assessments, uncertainty and variability characterization, selection and use of defaults, unification of cancer and noncancer dose-response assessment, cumulative risk assessment, and the need to increase EPA's capacity to address these improvements. This article describes and summarizes the NRC report, with an eye toward its implications for risk assessment practices at EPA.

13.
Dose-response models are the essential link between exposure assessment and computed risk values in quantitative microbial risk assessment, yet the uncertainty that is inherent to computed risks because the dose-response model parameters are estimated using limited epidemiological data is rarely quantified. Second-order risk characterization approaches incorporating uncertainty in dose-response model parameters can provide more complete information to decisionmakers by separating variability and uncertainty to quantify the uncertainty in computed risks. Therefore, the objective of this work is to develop procedures to sample from posterior distributions describing uncertainty in the parameters of exponential and beta-Poisson dose-response models using Bayes's theorem and Markov Chain Monte Carlo (in OpenBUGS). The theoretical origins of the beta-Poisson dose-response model are used to identify a decomposed version of the model that enables Bayesian analysis without the need to evaluate Kummer confluent hypergeometric functions. Herein, it is also established that the beta distribution in the beta-Poisson dose-response model cannot address variation among individual pathogens, criteria to validate use of the conventional approximation to the beta-Poisson model are proposed, and simple algorithms to evaluate actual beta-Poisson probabilities of infection are investigated. The developed MCMC procedures are applied to analysis of a case study data set, and it is demonstrated that an important region of the posterior distribution of the beta-Poisson dose-response model parameters is attributable to the absence of low-dose data. This region includes beta-Poisson models for which the conventional approximation is especially invalid and in which many beta distributions have an extreme shape with questionable plausibility.
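A minimal sketch of the two dose-response forms discussed: the exponential model and the conventional approximation to the beta-Poisson model, P = 1 - (1 + dose/beta)^(-alpha). The parameter values are hypothetical, and the MCMC fitting performed in OpenBUGS in the article is not reproduced here.

```python
# Exponential and (approximate) beta-Poisson dose-response models.
import numpy as np

def p_exponential(dose, r):
    """Exponential dose-response: each organism independently infects with probability r."""
    return 1.0 - np.exp(-r * dose)

def p_beta_poisson_approx(dose, alpha, beta):
    """Conventional approximation to the beta-Poisson dose-response model."""
    return 1.0 - (1.0 + dose / beta) ** (-alpha)

doses = np.logspace(-1, 4, 6)
print(p_exponential(doses, r=5e-3))
print(p_beta_poisson_approx(doses, alpha=0.25, beta=40.0))
```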

14.
Terje Aven, Enrico Zio, Risk Analysis, 2014, 34(7): 1164-1172
This is a perspective article on foundational issues in risk assessment and management. The aim is to discuss the needs, obstacles, and challenges for the establishment of a renewed, strong scientific foundation for risk assessment and risk management suited for the current and future technological challenges. The focus is on (i) reviewing and discussing the present situation and (ii) identifying how to best proceed in the future, to develop the risk discipline in the directions needed. The article provides some reflections on the interpretation and understanding of the concept of "foundations of risk assessment and risk management" and the challenges therein. One main recommendation is that different arenas and moments for discussion are needed to specifically address foundational issues in a way that embraces the many disciplinary communities involved (from social scientists to engineers, from behavioral scientists to statisticians, from health physicists to lawyers, etc.). One such opportunity is sought in the constitution of a novel specialty group of the Society for Risk Analysis.

15.
Biomarkers such as DNA adducts have significant potential to improve quantitative risk assessment by characterizing individual differences in metabolism of genotoxins and DNA repair and accounting for some of the factors that could affect interindividual variation in cancer risk. Inherent uncertainty in laboratory measurements and within-person variability of DNA adduct levels over time are putatively unrelated to cancer risk and should be subtracted from observed variation to better estimate interindividual variability of response to carcinogen exposure. A total of 41 volunteers, both smokers and nonsmokers, were asked to provide a peripheral blood sample every 3 weeks for several months in order to specifically assess intraindividual variability of polycyclic aromatic hydrocarbon (PAH)-DNA adduct levels. The intraindividual variance in PAH-DNA adduct levels, together with measurement uncertainty (laboratory variability and unaccounted for differences in exposure), constituted roughly 30% of the overall variance. An estimated 70% of the total variance was contributed by interindividual variability and is probably representative of the true biologic variability of response to carcinogenic exposure in lymphocytes. The estimated interindividual variability in DNA damage after subtracting intraindividual variability and measurement uncertainty was 24-fold. Interindividual variance was higher (52-fold) in persons who constitutively lack the glutathione S-transferase M1 (GSTM1) gene, which is important in the detoxification pathway of PAH. Risk assessment models that do not consider the variability of susceptibility to DNA damage following carcinogen exposure may underestimate risks to the general population, especially for those people who are most vulnerable.
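A minimal sketch of the variance partitioning idea: with repeated measurements per person, within-person (intraindividual plus measurement) variance can be separated from between-person (interindividual) variance using a one-way ANOVA-style decomposition. The simulated data below are hypothetical, not the study's adduct measurements.

```python
# Separating between-person from within-person variance with repeated measurements.
import numpy as np

rng = np.random.default_rng(4)
n_people, n_repeats = 41, 6

true_person_level = rng.normal(0.0, 1.0, size=(n_people, 1))                     # between-person
measurements = true_person_level + rng.normal(0.0, 0.65, size=(n_people, n_repeats))  # within-person

within = measurements.var(axis=1, ddof=1).mean()
between = measurements.mean(axis=1).var(ddof=1) - within / n_repeats   # method-of-moments estimate

total = between + within
print(f"between-person share: {between / total:.0%}")   # analogous to the ~70% in the study
print(f"within-person share : {within / total:.0%}")    # analogous to the ~30%
```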

16.
Concern about the degree of uncertainty and potential conservatism in deterministic point estimates of risk has prompted researchers to turn increasingly to probabilistic methods for risk assessment. With Monte Carlo simulation techniques, distributions of risk reflecting uncertainty and/or variability are generated as an alternative. In this paper the compounding of conservatism between the level associated with point estimate inputs selected from probability distributions and the level associated with the deterministic value of risk calculated using these inputs is explored. Two measures of compounded conservatism are compared and contrasted. The first measure considered, F, is defined as the ratio of the risk value Rd, calculated deterministically as a function of n inputs each at the jth percentile of its probability distribution, to the risk value Rj that falls at the jth percentile of the simulated risk distribution (i.e., F = Rd/Rj). The percentile of the simulated risk distribution that corresponds to the deterministic value Rd serves as a second measure of compounded conservatism. Analytical results for simple products of lognormal distributions are presented. In addition, a numerical treatment of several complex cases is presented using five simulation analyses from the literature to illustrate. Overall, there are cases in which conservatism compounds dramatically for deterministic point estimates of risk constructed from upper percentiles of input parameters, as well as those for which the effect is less notable. The analytical and numerical techniques discussed are intended to help analysts explore the factors that influence the magnitude of compounding conservatism in specific cases.
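A minimal sketch of the first measure for a simple product of lognormal inputs: Rd is the deterministic risk computed with every input at its jth percentile, Rj is the jth percentile of the simulated risk distribution, and F = Rd/Rj. The geometric standard deviations below are hypothetical.

```python
# Compounded conservatism F = Rd/Rj for a product of lognormal inputs.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
j = 0.95
gsds = [2.0, 1.8, 2.5]                                   # geometric SDs (geometric means = 1)
inputs = [stats.lognorm(s=np.log(g)) for g in gsds]

# Deterministic risk: product of the j-th percentiles of each input
Rd = np.prod([d.ppf(j) for d in inputs])

# Simulated risk distribution: product of random draws
samples = np.prod([d.rvs(size=200_000, random_state=rng) for d in inputs], axis=0)
Rj = np.quantile(samples, j)

print(f"F = Rd/Rj = {Rd / Rj:.2f}")
print(f"Rd falls at the {stats.percentileofscore(samples, Rd):.1f}th percentile of the simulated risks")
```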

17.
18.
A Latin Hypercube probabilistic risk assessment methodology was employed in the assessment of health risks associated with exposures to contaminated sediment and biota in an estuary in the Tidewater region of Virginia. The primary contaminants were polychlorinated biphenyls (PCBs), polychlorinated terphenyls (PCTs), polynuclear aromatic hydrocarbons (PAHs), and metals released into the estuary from a storm sewer system. The exposure pathways associated with the highest contaminant intake and risks were dermal contact with contaminated sediment and ingestion of contaminated aquatic and terrestrial biota from the contaminated area. As expected, all of the output probability distributions of risk were highly skewed, and the ratios of the expected value (mean) to median risk estimates ranged from 1.4 to 14.8 for the various exposed populations. The 99th percentile risk estimates were as much as two orders of magnitude above the mean risk estimates. For the sediment exposure pathways, the stability of the median risk estimates was found to be much greater than the stability of the expected value risk estimates. The interrun variability in the median risk estimate was found to be ±1.9% at 3000 iterations. The interrun stability of the mean risk estimates was found to be approximately equal to that of the 95th percentile estimates at any number of iterations. Neither the variation in contaminant concentrations nor that in any other single input variable contributed disproportionately to the overall simulation variance. The inclusion or exclusion of spatial correlations among contaminant concentrations in the simulation model did not significantly affect either the magnitude or the variance of the simulation risk estimates for sediment exposures.
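A minimal sketch of a Latin Hypercube probabilistic risk calculation: each input's unit interval is divided into equal-probability strata, each sampled exactly once, and the stratified uniform samples are mapped through the inverse CDFs of the input distributions. The exposure factors and distributions below are hypothetical, not the Tidewater data.

```python
# Latin Hypercube sampling of a simple multiplicative risk model.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
n = 3000  # iterations, as in the stability analysis described above

def lhs_uniform(n, rng):
    """One column of Latin Hypercube samples on (0, 1): one draw per stratum, shuffled."""
    return (rng.permutation(n) + rng.random(n)) / n

# Hypothetical inputs: sediment concentration (mg/kg), contact rate, slope factor
conc = stats.lognorm(s=1.0, scale=5.0).ppf(lhs_uniform(n, rng))
contact = stats.triang(c=0.5, loc=0.01, scale=0.09).ppf(lhs_uniform(n, rng))
slope = stats.lognorm(s=0.5, scale=1e-3).ppf(lhs_uniform(n, rng))

risk = conc * contact * slope
print(f"mean risk   : {np.mean(risk):.2e}")
print(f"median risk : {np.median(risk):.2e}")
print(f"99th pct    : {np.quantile(risk, 0.99):.2e}")
```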

19.
If the point of view is adopted that in calculations of real-world phenomena we almost invariably have significant uncertainty in the numerical values of our parameters, then, in these calculations, numerical quantities should be replaced by probability distributions and mathematical operations between these quantities should be replaced by analogous operations between probability distributions. Also, practical calculations one way or another always require discretization or truncation. Combining these two thoughts leads to a numerical approach to probabilistic calculations having great simplicity, power, and elegance. The philosophy and technique of this approach is described, some pitfalls are pointed out, and an application to seismic risk assessment is outlined.
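A minimal sketch of the idea described above: numbers are replaced by discretized probability distributions, and arithmetic between numbers is replaced by the analogous operation between distributions (here, multiplication of two independent discrete quantities). The example values are hypothetical.

```python
# Arithmetic between discretized probability distributions.
from collections import defaultdict

# Each quantity represented as {value: probability}
ground_motion = {0.1: 0.5, 0.2: 0.3, 0.4: 0.2}
vulnerability = {0.05: 0.6, 0.10: 0.3, 0.20: 0.1}

def multiply(dist_a, dist_b):
    """Distribution of the product of two independent discrete quantities."""
    out = defaultdict(float)
    for a, pa in dist_a.items():
        for b, pb in dist_b.items():
            out[round(a * b, 6)] += pa * pb
    return dict(out)

loss_ratio = multiply(ground_motion, vulnerability)
for value, prob in sorted(loss_ratio.items()):
    print(f"loss ratio {value:.3f} with probability {prob:.3f}")
```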

20.
GM Foods and the Misperception of Risk Perception
Public opposition to genetically modified (GM) food and crops is widely interpreted as the result of the public's misperception of the risks. With scientific assessment pointing to no unique risks from GM crops and foods, a strategy of accurate risk communication from trusted sources has been advocated. This is based on the assumption that the benefits of GM crops and foods are self-evident. Informed by the interpretation of some qualitative interviews with lay people, we use data from the Eurobarometer survey on biotechnology to explore the hypothesis that it is not so much the perception of risks as the absence of benefits that is the basis of the widespread rejection of GM foods and crops by the European public. Some respondents perceive both risks and benefits, and may be trading off these attributes along the lines of a rational choice model. However, for others, one attribute, benefit, appears to dominate their judgments: the lexicographic heuristic. For these respondents, their perception of risk is of limited importance in the formation of attitudes toward GM food and crops. The implication is that the absence of perceived benefits from GM foods and crops calls into question the relevance of risk communication strategies for bringing about change in public opinion.
