Similar Literature
20 similar documents found.
1.
Management of invasive species depends on developing prevention and control strategies through comprehensive risk assessment frameworks, which in turn require a thorough analysis of exposure to invasive species. However, accurate exposure analysis of invasive species can be a daunting task because of the inherent uncertainty in invasion processes. Risk assessment of invasive species under uncertainty often requires integrating expert judgment with empirical information, which can be incomplete, imprecise, and fragmentary. The representation of knowledge in classical risk models depends on the formulation of a precise probabilistic value or a well-defined joint distribution of unknown parameters, whereas expert knowledge and judgments are often expressed in value-laden terms or preference-ordered criteria. We offer a novel approach to risk assessment that uses a dominance-based rough set approach to account for preference order in the domains of attributes and in the set of risk classes. The model is illustrated with an example showing how a knowledge-centric risk model can be integrated with the dominance principle of rough sets to derive minimal covering "if..., then..." decision rules for reasoning over a set of possible invasion scenarios. Inconsistency and ambiguity in the data set are modeled using the rough set concept of a boundary region adjoining the lower and upper approximations of the risk classes. Finally, we present an extension of rough sets to an evidence-theoretic interpretation of risk measures for invasive species in a spatial context. In this approach, multispecies interactions in invasion risk are approximated with imprecise probability measures by combining spatial neighborhood information, with risk expressed in terms of belief and plausibility.
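The dominance-based rough set machinery described in this abstract reduces to computing lower and upper approximations of preference-ordered risk classes. The sketch below (Python, with a hypothetical decision table whose objects, attribute values, and risk classes are invented for illustration, not taken from the paper) shows how the boundary region between the two approximations captures inconsistent assignments.

```python
"""Dominance-based rough set sketch on a toy invasion-risk decision table."""

# Each object: preference-ordered condition attributes (higher = worse)
# and an ordinal risk class: 1 = Low, 2 = Medium, 3 = High.
table = {
    "s1": ((1, 1), 1),
    "s2": ((2, 1), 1),
    "s3": ((2, 2), 2),
    "s4": ((3, 2), 2),
    "s5": ((2, 2), 3),   # same attributes as s3 but a higher class
    "s6": ((3, 3), 3),
}

def dominates(x, y):
    """x dominates y iff x is at least as high as y on every attribute."""
    return all(a >= b for a, b in zip(x, y))

def approximations(t):
    """Lower/upper approximation of the upward union 'class >= t'."""
    target = {o for o, (_, cls) in table.items() if cls >= t}
    lower, upper = set(), set()
    for o, (attrs, _) in table.items():
        dominating = {p for p, (a, _) in table.items() if dominates(a, attrs)}
        dominated = {p for p, (a, _) in table.items() if dominates(attrs, a)}
        if dominating <= target:      # everything dominating o is in the union
            lower.add(o)
        if dominated & target:        # o dominates something in the union
            upper.add(o)
    return lower, upper, upper - lower

for t, label in [(2, "at least Medium"), (3, "at least High")]:
    low, up, boundary = approximations(t)
    print(f"{label}: lower={sorted(low)} upper={sorted(up)} "
          f"boundary={sorted(boundary)}")
```

In this toy table, s3, s4, and s5 fall into the boundary of "at least High" because their dominance relations conflict with the assigned classes; certain decision rules would be induced only from the lower approximations.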

2.
A probabilistic and interdisciplinary risk–benefit assessment (RBA) model integrating microbiological, nutritional, and chemical components was developed for infant milk, with the objective of predicting the health impact of different consumption scenarios. Infant feeding is of particular interest in RBA because breast milk and powdered infant formula have both been associated with risks and benefits related to chemicals, bacteria, and nutrients; hence the model considers these three facets. Cronobacter sakazakii, dioxin-like polychlorinated biphenyls (dl-PCB), and docosahexaenoic acid (DHA) were the three risk/benefit factors selected as key issues in microbiology, chemistry, and nutrition, respectively. The model was probabilistic, with variability and uncertainty separated using a second-order Monte Carlo simulation. In this study, the advantages and limitations of undertaking probabilistic and interdisciplinary RBA are discussed. In particular, the probabilistic technique proved powerful in dealing with missing data and in translating assumptions into quantitative inputs while taking uncertainty into account. In addition, separating variability from uncertainty strengthened the interpretation of the model outputs by allowing natural heterogeneity to be better distinguished from lack of knowledge. Interdisciplinary RBA is necessary to produce more structured conclusions, avoid contradictory messages to policymakers and consumers, and lead to more decisive food recommendations. This assessment provides a conceptual development of the RBA methodology and a robust basis on which to build.
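As a concrete illustration of the second-order Monte Carlo technique mentioned above, the sketch below (Python/NumPy; the distributions, parameter values, and the simple exposure metric are invented assumptions, not the paper's model) separates an outer loop over uncertain parameters from an inner loop over individual variability, so that lack of knowledge and natural heterogeneity can be reported separately.

```python
import numpy as np

rng = np.random.default_rng(42)

N_UNCERTAINTY = 200   # outer loop: epistemic uncertainty about parameters
N_VARIABILITY = 5000  # inner loop: variability across infants/servings

outer_means = np.empty(N_UNCERTAINTY)
for i in range(N_UNCERTAINTY):
    # Uncertain parameters (illustrative): mean log-concentration of a
    # contaminant in formula and mean daily intake.
    mu_logC = rng.normal(loc=-2.0, scale=0.3)       # log10 units, uncertain
    mean_intake = rng.normal(loc=150.0, scale=10)   # mL/day, uncertain

    # Variability across individuals, conditional on the sampled parameters.
    conc = 10 ** rng.normal(loc=mu_logC, scale=0.5, size=N_VARIABILITY)
    intake = rng.lognormal(mean=np.log(mean_intake), sigma=0.2,
                           size=N_VARIABILITY)
    exposure = conc * intake                         # illustrative exposure metric

    outer_means[i] = exposure.mean()

# Uncertainty about the population-average exposure, reported separately from
# the person-to-person variability captured inside each inner loop.
print("population mean exposure, median:", np.median(outer_means))
print("95% uncertainty interval:", np.percentile(outer_means, [2.5, 97.5]))
```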

3.
Dose–response modeling of biological agents has traditionally focused on describing laboratory-derived experimental data, with limited consideration given to factors that are controlled in a laboratory but are likely to vary in real-world scenarios. In this study, a probabilistic framework is developed that extends Brookmeyer's competing-risks dose–response model to allow for variation in factors such as dose dispersion, dose deposition, and other within-host parameters. Using data sets drawn from dose–response experiments on inhalational anthrax, plague, and tularemia, we illustrate how, in certain cases, models that consider only the experimental data in isolation can overestimate infection numbers.
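The sketch below is not Brookmeyer's competing-risks model itself; it uses a generic exponential dose-response curve with arbitrary parameters to illustrate the abstract's point that allowing for variation in the delivered dose (here a lognormal spread around the same mean dose) changes, and here lowers, the predicted number of infections relative to treating the dose as fixed, which is one way a model fitted only to controlled experiments could overestimate.

```python
import numpy as np

rng = np.random.default_rng(0)

k = 1e-4            # illustrative exponential dose-response parameter
nominal_dose = 5e3  # organisms, illustrative
population = 10_000

# Deterministic treatment: every individual receives exactly the nominal dose.
p_fixed = 1.0 - np.exp(-k * nominal_dose)

# Probabilistic treatment: delivered dose varies (dispersion, deposition, ...),
# modeled as lognormal with the same mean as the nominal dose.
sigma = 1.0
doses = rng.lognormal(mean=np.log(nominal_dose) - 0.5 * sigma**2,
                      sigma=sigma, size=population)
p_varied = 1.0 - np.exp(-k * doses)

# Because 1 - exp(-k*d) is concave, spreading the dose around the same mean
# lowers the average infection probability (Jensen's inequality).
print(f"expected infections, fixed dose   : {population * p_fixed:8.1f}")
print(f"expected infections, variable dose: {p_varied.sum():8.1f}")
```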

4.
Accidents with automatic production systems are reported to occur on the order of one per hundred to one per thousand robot-years, while fatal accidents occur one to two orders of magnitude less frequently. Occupational safety traditions tend to set safety targets of zero severe accidents for automatic systems. Decision making requires a risk assessment that balances potential risk reduction measures and costs within the cultural environment of a production company. This paper presents a simplified procedure that acts as a decision tool. The procedure is based on a risk concept that approaches prevention in both a deterministic and a probabilistic manner. Eight accident scenarios are shown to represent the potential accident processes involving robot interactions with people, and seven prevention policies are shown to cover these scenarios in principle. An additional probabilistic approach can indicate which extra safety measures could be taken, and at what risk reduction and additional cost. The risk evaluation process aims at achieving a quantitative, acceptable risk level. For that purpose, three risk evaluation methods are discussed with respect to reaching broad consensus on the safety targets.

5.
The uncertainty associated with estimates should be taken into account in quantitative risk assessment. Each input's uncertainty can be characterized through a probability distribution for use in Monte Carlo simulations. In this study, the sampling uncertainty associated with estimating a low proportion on the basis of a small sample size was considered. A common application in microbial risk assessment is the estimation of a prevalence (the proportion of contaminated food products) on the basis of a few tested units. Three Bayesian approaches (based on beta(0, 0), beta(1/2, 1/2), and beta(1, 1) priors) and one frequentist approach (based on the frequentist confidence distribution) were compared and evaluated through simulations. For small samples, we demonstrated some differences between the four tested methods and concluded that the best-performing method depends on the true proportion of contaminated products, which is by definition unknown in practice. When no prior information is available, we recommend the beta(1/2, 1/2) prior or the confidence distribution. To illustrate the importance of these differences, the four methods were used in an applied example. We performed two-dimensional Monte Carlo simulations to estimate the proportion of cold-smoked salmon packs contaminated by Listeria monocytogenes, with one dimension representing within-factory uncertainty, modeled by each of the four studied methods, and the other dimension representing variability between companies.
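A minimal sketch of the Bayesian side of the comparison described above: with a conjugate beta prior, the posterior for the prevalence after observing x contaminated units out of n is beta(a + x, b + n - x). The sample numbers below are hypothetical, and the frequentist confidence-distribution approach is not shown.

```python
from scipy.stats import beta

# Hypothetical small sample: x contaminated packs out of n tested.
n, x = 25, 1

priors = {
    "beta(1/2, 1/2) (Jeffreys)": (0.5, 0.5),
    "beta(1, 1) (uniform)":      (1.0, 1.0),
    # beta(0, 0) is improper; its posterior beta(x, n - x) is proper
    # only when 0 < x < n, which holds here.
    "beta(0, 0)":                (0.0, 0.0),
}

for name, (a, b) in priors.items():
    post = beta(a + x, b + n - x)                  # conjugate update
    lo, med, hi = post.ppf([0.025, 0.5, 0.975])
    print(f"{name:27s} median={med:.3f}  95% CrI=({lo:.4f}, {hi:.3f})")
```

For a sample this small, the choice of prior visibly shifts both the median and the interval bounds, which is the sensitivity the abstract evaluates.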

6.
Human variability is an important factor considered in human health risk assessment for protecting sensitive populations from chemical exposure. Traditionally, an interhuman uncertainty factor is applied to lower the exposure limit to account for this variability. However, using a fixed uncertainty factor rather than accounting for human variability probabilistically cannot adequately support the probabilistic risk assessment advocated by a number of researchers; new methods are needed to quantify human population variability probabilistically. We propose a Bayesian hierarchical model to quantify variability among different populations. This approach jointly characterizes the distribution of risk at background exposure and the sensitivity of response to exposure, which are commonly represented by model parameters. We demonstrate, through both an application to real data and a simulation study, that the proposed hierarchical structure adequately characterizes variability across different populations.
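To make the hierarchical structure concrete, the sketch below simulates population-specific background risk (intercept) and dose sensitivity (slope) drawn from shared hyper-distributions and pushes them through a logistic dose-response. All hyperparameter values and the logistic form are illustrative assumptions; the actual Bayesian fitting of such a model to data is not shown.

```python
import numpy as np
from scipy.special import expit

rng = np.random.default_rng(4)

# Hierarchical structure sketch (hypothetical hyperparameters): each population
# gets its own intercept and slope drawn from shared population-level
# distributions, capturing variability across populations.
n_pop = 8
mu_alpha, sd_alpha = -3.0, 0.5      # hyper-distribution of intercepts
mu_beta, sd_beta = 0.8, 0.3         # hyper-distribution of slopes

alpha = rng.normal(mu_alpha, sd_alpha, n_pop)
beta = rng.normal(mu_beta, sd_beta, n_pop)

dose = 2.0
risk = expit(alpha + beta * dose)    # logistic dose-response per population

print("risk at background (dose = 0):", expit(alpha).round(3))
print(f"risk at dose {dose}:", risk.round(3))
print("fold-variation across populations:", round(float(risk.max() / risk.min()), 2))
```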

7.
8.
Traditional probabilistic risk assessment (PRA), of the type originally developed for engineered systems, is still proposed for terrorism risk analysis. We show that such PRA applications are unjustified in general. The capacity of terrorists to seek and use information and to actively research different attack options before deciding what to do raises unique features of terrorism risk assessment that are not adequately addressed by conventional PRA for natural and engineered systems, in part because decisions based on such PRA estimates do not adequately hedge against the different probabilities that attackers may eventually act upon. These probabilities may differ from the defender's (even if the defender's experts are thoroughly trained, well-calibrated, unbiased probability assessors) because they may be conditioned on different information. We illustrate the fundamental differences between PRA and terrorism risk analysis, and suggest the use of robust decision analysis for risk management when attackers may know more about some attack options than we do.

9.
Hickey GL, Craig PS. Risk Analysis, 2012, 32(7): 1232–1243.
A species sensitivity distribution (SSD) models data on the toxicity of a specific toxicant to species in a defined assemblage. SSDs are typically assumed to be parametric, despite noteworthy criticism, with the log-normal distribution being a standard proposal. Recently, and confusingly, different statistical methods have emerged in the ecotoxicological risk assessment literature, independent of the distributional assumption, for fitting SSDs to toxicity data, with the overall aim of estimating the concentration of the toxicant that is hazardous to a given (usually small) percentage of the biological assemblage. We analyze two such estimators derived from simple linear regression applied to the ordered log-transformed toxicity values and probit-transformed rank-based plotting positions. These are compared to the more intuitive and statistically defensible confidence-limit-based estimator. Based on a large-scale simulation study, we conclude that the latter estimator should be used in typical assessments where a pointwise value of the hazardous concentration is required.
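The sketch below contrasts the two kinds of hazardous-concentration estimators discussed above on hypothetical toxicity data: a plotting-position regression estimate of the concentration hazardous to 5% of species, and a lower confidence limit on the same quantile, here obtained with a simple parametric bootstrap as a stand-in for the confidence-limit-based estimator analyzed in the paper.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

# Hypothetical species toxicity data (e.g., EC50s in ug/L); log10-transform.
toxicity = np.array([12., 18., 25., 40., 55., 90., 130., 210., 350., 600.])
logx = np.sort(np.log10(toxicity))
n = len(logx)

# Estimator 1: simple linear regression of ordered log-toxicity values on
# probit-transformed rank-based plotting positions i/(n + 1).
pp = norm.ppf(np.arange(1, n + 1) / (n + 1))
slope, intercept = np.polyfit(pp, logx, 1)
hc5_regression = 10 ** (intercept + slope * norm.ppf(0.05))

# Estimator 2 (stand-in for the confidence-limit-based estimator): a
# parametric-bootstrap lower 95% confidence limit on the 5th percentile
# of a fitted log-normal SSD.
mu, sd = logx.mean(), logx.std(ddof=1)
hc5_mle = 10 ** (mu + norm.ppf(0.05) * sd)
boot = np.empty(5000)
for b in range(boot.size):
    sample = rng.normal(mu, sd, size=n)
    boot[b] = sample.mean() + norm.ppf(0.05) * sample.std(ddof=1)
hc5_lcl = 10 ** np.percentile(boot, 5)

print(f"HC5, plotting-position regression : {hc5_regression:.1f}")
print(f"HC5, log-normal MLE point estimate: {hc5_mle:.1f}")
print(f"HC5, bootstrap lower 95% limit    : {hc5_lcl:.1f}")
```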

10.
Current methods for cancer risk assessment result in single values, without any quantitative information on the uncertainties in these values. Therefore, single risk values could easily be overinterpreted. In this study, we discuss a full probabilistic cancer risk assessment approach in which all the generally recognized uncertainties in both exposure and hazard assessment are quantitatively characterized and probabilistically evaluated, resulting in a confidence interval for the final risk estimate. The methodology is applied to three example chemicals (aflatoxin, N-nitrosodimethylamine, and methyleugenol). These examples illustrate that the uncertainty in a cancer risk estimate may be huge, making single-value estimates of cancer risk meaningless. Further, a risk based on linear extrapolation tends to be lower than the upper 95% confidence limit of a probabilistic risk estimate, and in that sense it is not conservative. Our conceptual analysis showed that there are two possible basic approaches to cancer risk assessment, depending on the interpretation of the dose-incidence data measured in animals. However, it remains unclear which of the two interpretations is the more adequate one, adding further uncertainty to the already huge confidence intervals for cancer risk estimates.
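A minimal sketch of what "a confidence interval for the final risk estimate" means operationally: sample the uncertain exposure and the uncertain potency from their uncertainty distributions (both entirely illustrative lognormals here, not the paper's values), multiply under a linear low-dose model, and report percentiles instead of a single value.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 100_000

# Illustrative uncertainty distributions: daily intake (mg/kg bw/day)
# and cancer potency (per mg/kg bw/day).
intake = rng.lognormal(mean=np.log(1e-4), sigma=0.6, size=N)
potency = rng.lognormal(mean=np.log(5e-2), sigma=1.0, size=N)

risk = intake * potency   # linear low-dose risk model, for illustration only

lo, med, hi = np.percentile(risk, [2.5, 50, 97.5])
print(f"extra lifetime cancer risk: median {med:.1e}, 95% CI ({lo:.1e}, {hi:.1e})")
print("interval width (orders of magnitude):", round(float(np.log10(hi / lo)), 1))
```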

11.
The benchmark dose (BMD) approach has gained acceptance as a valuable risk assessment tool, but risk assessors still face significant challenges in selecting an appropriate BMD/BMDL estimate from the results of a set of acceptable dose-response models. Current approaches do not explicitly address model uncertainty, and there is a need to more fully inform health risk assessors in this regard. In this study, a Bayesian model averaging (BMA) BMD estimation method that takes model uncertainty into account is proposed as an alternative to current BMD estimation approaches for continuous data. Using the "hybrid" method proposed by Crump, two BMA strategies, one based on maximum likelihood estimation and one based on Markov chain Monte Carlo, are first applied as a demonstration to calculate model-averaged BMD estimates from real continuous dose-response data. The outcomes from the example data sets suggest that the BMA BMD estimates are more reliable than the estimates from the individual models with the highest posterior weight, in terms of higher BMDLs and narrower 90th percentile intervals. In addition, a simulation study is performed to evaluate the accuracy of the BMA BMD estimator. The simulation results indicate that the BMA BMD estimates have smaller bias than BMDs selected using other criteria. To further validate the BMA method, some technical issues, including the selection of models and the use of bootstrap methods for BMDL derivation, need further investigation over a more extensive, representative set of dose-response data.
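The averaging arithmetic itself is simple, as the sketch below shows for hypothetical fit results from three candidate models; the BIC-based weights and the direct averaging of BMDLs are rough stand-ins for the paper's MLE- and MCMC-based strategies and its bootstrap BMDL derivation.

```python
import numpy as np

# Hypothetical fit results for three models fitted to the same data set.
models = {          #   BIC    BMD   BMDL
    "Hill":         (152.1,  3.2,  1.9),
    "Exponential":  (153.4,  2.7,  1.5),
    "Power":        (156.0,  4.1,  2.3),
}

bic = np.array([v[0] for v in models.values()])
bmd = np.array([v[1] for v in models.values()])
bmdl = np.array([v[2] for v in models.values()])

# Approximate posterior model weights from BIC differences,
# assuming equal prior model probabilities.
delta = bic - bic.min()
w = np.exp(-0.5 * delta)
w /= w.sum()

print("posterior weights:", dict(zip(models, w.round(3))))
print("model-averaged BMD :", round(float(np.dot(w, bmd)), 2))
print("model-averaged BMDL:", round(float(np.dot(w, bmdl)), 2))
```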

12.
There is increasing concern over deep uncertainty in the risk analysis field, as probabilistic models of uncertainty cannot always be confidently determined or agreed upon for many of our most pressing contemporary risk challenges. This is particularly true in climate change adaptation, and has prompted the development of a number of frameworks aiming to characterize system vulnerabilities and identify robust alternatives. One such methodology is robust decision making (RDM), which uses simulation models to assess how strategies perform over many plausible conditions and then identifies and characterizes the conditions under which a strategy fails, in a process termed scenario discovery. While many of the problems to which RDM has been applied involve multiple objectives, research to date has provided little insight into how the treatment of multiple criteria affects the failure scenarios identified. In this research, we compare different methods for incorporating multiple objectives into the scenario discovery process to evaluate how they impact the resulting failure scenarios. We use the Lake Tana basin in Ethiopia as a case study, where climatic and environmental uncertainties could affect multiple planned water infrastructure projects, and find that failure scenarios may vary depending on the method used to aggregate multiple criteria. Common methods used to convert multiple attributes into a single utility score can obscure connections between failure scenarios and system performance, limiting the information available to support decision making. Applying scenario discovery to each performance metric separately provides more nuanced information about the relative sensitivity of the objectives to different uncertain parameters, leading to clearer insights on measures that could improve system robustness and areas where additional research might prove useful.
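The sketch below illustrates the comparison at the heart of the study with invented data: the set of "failure" futures identified from an aggregated utility score can differ from the set identified by applying a threshold to each performance metric separately. The metrics, thresholds, and weights are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 1000

# Hypothetical simulation output over sampled uncertain futures:
# two performance metrics for a fixed strategy (higher is better).
hydro_reliability = rng.uniform(0.5, 1.0, N)   # e.g., hydropower target met
irrig_reliability = rng.uniform(0.5, 1.0, N)   # e.g., irrigation target met

# (a) Aggregate first: equal-weight utility, fail if utility < 0.75.
utility = 0.5 * hydro_reliability + 0.5 * irrig_reliability
fail_aggregated = utility < 0.75

# (b) Keep metrics separate: fail if either objective misses its own threshold.
fail_per_metric = (hydro_reliability < 0.7) | (irrig_reliability < 0.7)

print("failures, aggregated utility :", int(fail_aggregated.sum()))
print("failures, per-metric criteria:", int(fail_per_metric.sum()))
print("cases classified differently :", int((fail_aggregated != fail_per_metric).sum()))
```

Futures that are classified differently by the two rules are exactly the cases where aggregation can hide which objective is actually failing.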

13.
The methods currently used to evaluate the risk of developmental defects in humans from exposure to potentially toxic agents do not reflect biological processes when extrapolating estimated risks to low doses and from test species to humans. We develop a mathematical model describing aspects of the dynamic process of organogenesis, based on branching process models of cell kinetics. The biological information that can be incorporated into the model includes the timing and rates of dynamic cell processes such as differentiation, migration, growth, and replication. The resulting dose-response models can explain patterns of malformation rates as a function of both dose and time of exposure, leading to improvements in risk assessment and in understanding of the underlying mechanistic processes. To illustrate the use of the model, we apply it to predicting the effects of methylmercury on brain development in rats.
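A toy branching-process sketch of the modeling idea (all rates, thresholds, and the dose-to-death-probability link are invented, not the paper's calibrated model): each precursor cell divides, differentiates, or dies each generation, and a malformation is scored when too few differentiated cells are produced.

```python
import numpy as np

rng = np.random.default_rng(11)

def simulate_organogenesis(dose, generations=8, start_cells=10,
                           threshold=30, n_rep=2000):
    """Toy branching process: per generation each precursor cell divides
    (2 offspring), differentiates (leaves the pool), or dies, with a death
    probability that increases with dose. Returns the estimated probability
    that fewer than `threshold` differentiated cells are produced."""
    p_death_base, slope = 0.05, 0.02        # illustrative parameters
    p_death = min(0.9, p_death_base + slope * dose)
    p_divide, p_diff = 0.6 * (1 - p_death), 0.4 * (1 - p_death)

    malformed = 0
    for _ in range(n_rep):
        precursors, differentiated = start_cells, 0
        for _ in range(generations):
            fates = rng.choice(3, size=precursors,
                               p=[p_divide, p_diff, p_death])
            n_div, n_diff = np.sum(fates == 0), np.sum(fates == 1)
            differentiated += n_diff
            precursors = 2 * n_div
            if precursors == 0:
                break
        malformed += differentiated < threshold
    return malformed / n_rep

for dose in (0, 5, 10, 20):
    print(f"dose {dose:2d}: P(malformation) ~ {simulate_organogenesis(dose):.3f}")
```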

14.
Ethylene oxide is a gas produced in large quantities in the United States that is used primarily as a chemical intermediate in the production of ethylene glycol, propylene glycol, non-ionic surfactants, ethanolamines, glycol ethers, and other chemicals. It is well established that ethylene oxide can induce cancer, genetic, reproductive and developmental, and acute health effects in animals. The U.S. Environmental Protection Agency is currently developing both a cancer potency factor and a reference concentration (RfC) for ethylene oxide. This study used the rich database on the reproductive and developmental effects of ethylene oxide to develop a probabilistic characterization of possible regulatory thresholds. The analysis was based on the standard regulatory approach for noncancer risk assessment, but involved several innovative elements: (1) the use of advanced statistical methods to account for correlations in developmental outcomes among littermates and to allow simultaneous control of covariates (such as litter size); (2) the application of a probabilistic approach for characterizing the uncertainty in extrapolating the animal results to humans; and (3) the use of a quantitative approach to account for heterogeneity in the human population. This article presents several classes of results: (1) probabilistic characterizations of ED10s for two quantal reproductive outcomes, resorption and fetal death; (2) probabilistic characterizations of one developmental outcome, the dose expected to yield a 5% reduction in fetal (or pup) weight; (3) estimates of the RfCs that would result from using these values in the standard regulatory approach for noncancer risk assessment; and (4) a probabilistic characterization of the level of ethylene oxide exposure that would be expected to yield a 1/1,000 increase in the risk of reproductive or developmental outcomes in exposed human populations.
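The sketch below shows the arithmetic of a probabilistic analogue of the standard RfC derivation: divide a sampled animal ED10 by sampled interspecies and intraspecies adjustment factors and read off a percentile, instead of dividing a point estimate by fixed 10x factors. The distributions are illustrative assumptions, not the values derived in the study.

```python
import numpy as np

rng = np.random.default_rng(5)
N = 50_000

# Illustrative uncertainty distributions (not values from the paper):
ed10_animal = rng.lognormal(mean=np.log(10.0), sigma=0.3, size=N)      # ppm
af_interspecies = rng.lognormal(mean=np.log(3.0), sigma=0.4, size=N)
af_intraspecies = rng.lognormal(mean=np.log(3.0), sigma=0.5, size=N)

human_threshold = ed10_animal / (af_interspecies * af_intraspecies)

# Deterministic analogue: point ED10 divided by fixed 10 x 10 factors.
rfc_deterministic = 10.0 / (10 * 10)

lo, med = np.percentile(human_threshold, [5, 50])
print(f"probabilistic threshold: median {med:.2f} ppm, 5th percentile {lo:.2f} ppm")
print(f"deterministic RfC-style value: {rfc_deterministic:.2f} ppm")
```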

15.
Methods for Uncertainty Analysis: A Comparative Survey
This paper presents a survey and comparative evaluation of methods which have been developed for the determination of uncertainties in accident consequences and probabilities, for use in probabilistic risk assessment. The methods considered are: analytic techniques, Monte Carlo simulation, response surface approaches, differential sensitivity techniques, and evaluation of classical statistical confidence bounds. It is concluded that only the response surface and differential sensitivity approaches are sufficiently general and flexible for use as overall methods of uncertainty analysis in probabilistic risk assessment. The other methods considered, however, are very useful in particular problems.
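As a pointer to how a response surface approach works in practice, the sketch below fits a quadratic surrogate to a handful of runs of a stand-in "expensive" consequence model and then propagates input uncertainty through the cheap surrogate; the model, design, and distributions are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

def expensive_model(x1, x2):
    """Stand-in for a costly accident-consequence model."""
    return 3.0 * x1**2 + 2.0 * x1 * x2 + x2 + 5.0

# 1. Small experimental design over two uncertain inputs.
design = np.array([(a, b) for a in (-1, 0, 1) for b in (-1, 0, 1)], float)
y = expensive_model(design[:, 0], design[:, 1])

# 2. Fit a quadratic response surface by least squares.
x1, x2 = design[:, 0], design[:, 1]
A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def surrogate(x1, x2):
    return (coef[0] + coef[1] * x1 + coef[2] * x2 +
            coef[3] * x1 * x2 + coef[4] * x1**2 + coef[5] * x2**2)

# 3. Cheap Monte Carlo uncertainty propagation through the surrogate.
s1 = rng.normal(0, 0.5, 100_000)
s2 = rng.normal(0, 0.5, 100_000)
out = surrogate(s1, s2)
print("mean:", round(float(out.mean()), 3),
      " 95% interval:", np.percentile(out, [2.5, 97.5]).round(3))
```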

16.
The increased frequency of extreme events in recent years highlights the need for methods that can mitigate the impact of such events on critical infrastructures and boost their resilience. This article proposes an online spatial risk analysis capable of indicating the evolving risk of power system regions subject to extreme events. A Severity Risk Index (SRI), supported by real-time monitoring, assesses the impact of extreme events on power system resilience, with application to the effect of windstorms on transmission networks. The index considers the spatial and temporal evolution of the extreme event, system operating conditions, and the degraded system performance during the event. The SRI is based on probabilistic risk, condensing the probability and impact of possible failure scenarios as the event moves spatially across the power system. Because of the large number of possible failures during an extreme event, a scenario generation and reduction algorithm is applied to reduce the computation time. The SRI provides the operator with a probabilistic assessment that can support effective resilience-based decisions for risk mitigation. The IEEE 24-bus Reliability Test System is used to demonstrate the proposed online risk analysis, which is embedded in a sequential Monte Carlo simulation to capture the spatiotemporal effects of extreme events and to evaluate the effectiveness of the proposed method.
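The core probabilistic-risk arithmetic behind an index of this kind is the scenario-weighted expected impact at each time step, as in the sketch below; the scenario probabilities, impact values, and storm trajectory are hypothetical, and the scenario generation/reduction and sequential Monte Carlo machinery of the actual method are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(8)

def severity_risk_index(failure_probs, impacts):
    """Condense the scenarios at one time step into a single risk value:
    sum over scenarios of (failure-scenario probability) x (impact,
    e.g., MW of load curtailed)."""
    return float(np.dot(failure_probs, impacts))

# Hypothetical windstorm passing over the network during five time steps:
# line-failure scenario probabilities rise and fall as the storm moves.
for t, exposure in enumerate([0.1, 0.4, 0.9, 0.5, 0.2]):
    n_scenarios = 50
    probs = rng.dirichlet(np.ones(n_scenarios)) * exposure       # scenario weights
    impacts = rng.gamma(shape=2.0, scale=40.0, size=n_scenarios)  # MW curtailed
    print(f"t={t}: SRI = {severity_risk_index(probs, impacts):7.1f} MW (expected)")
```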

17.
U.S. Environmental Protection Agency benchmark doses for dichotomous cancer responses are often estimated using a multistage model based on a monotonic dose-response assumption. To account for model uncertainty in the estimation process, several model averaging methods have been proposed for risk assessment. In this article, we extend the usual parameter space in the multistage model for monotonicity to allow for the possibility of a hormetic dose-response relationship. Bayesian model averaging is used to estimate the benchmark dose and to provide posterior probabilities for monotonicity versus hormesis. Simulation studies show that the newly proposed method provides robust point and interval estimation of a benchmark dose in the presence or absence of hormesis. We also apply the method to two data sets on the carcinogenic response of rats to 2,3,7,8-tetrachlorodibenzo-p-dioxin.

18.
The treatment of uncertainties associated with modeling and risk assessment has recently attracted significant attention. The methodology and guidance for dealing with parameter uncertainty have been fairly well developed, and quantitative tools such as Monte Carlo modeling are often recommended. However, the issue of model uncertainty is still rarely addressed in practical applications of risk assessment; the use of several alternative models to derive a range of model outputs or risks is one of a few available techniques. This article addresses the often-overlooked issue of what we call "modeler uncertainty," i.e., differences in problem formulation, model implementation, and parameter selection originating from subjective interpretation of the problem at hand. This study uses results from the Fruit Working Group, which was created under the International Atomic Energy Agency (IAEA) BIOMASS program (BIOsphere Modeling and ASSessment). Model-model and model-data intercomparisons reviewed in this study were conducted by the working group for a total of three different scenarios. The greatest uncertainty was found to result from modelers' interpretation of the scenarios and from approximations made by the modelers. In scenarios that were unclear to modelers, the initial differences in model predictions were as high as seven orders of magnitude. Only after several meetings and discussions about specific assumptions did the predictions of the various models converge. Our study shows that parameter uncertainty (as evaluated by a probabilistic Monte Carlo assessment) may have contributed over one order of magnitude to the overall modeling uncertainty. The final model predictions ranged over one to three orders of magnitude, depending on the specific scenario. This study illustrates the importance of problem formulation and of implementing an analytic-deliberative process in risk characterization.

19.
As part of the launch approval process, the Interagency Nuclear Safety Review Panel provides an independent safety assessment of space missions, such as the Cassini mission, that carry significant amounts of nuclear material. This survey article describes potential accident scenarios that might lead to a release of fuel from an accidental reentry during an Earth swingby maneuver, the probabilities of such scenarios, and their consequences. To illustrate the nature of the calculations used in this area, examples of probabilistic models are presented for obtaining both the probabilities of scenario events and the resultant source terms. Because of the large extrapolations from the current knowledge base, the analysis emphasizes the treatment of uncertainties.

20.
Quantifying Flood Risks in the Netherlands
The Flood Risk in the Netherlands project (Dutch acronym: VNK2) is a large-scale probabilistic risk assessment for all major levee systems in the Netherlands. This article provides an overview of the methods and techniques used in the VNK2 project. It also discusses two examples that illustrate the potential of quantitative flood risk assessments such as VNK2 to improve flood risk management processes: (i) informing political debates about the risks of flooding and the effectiveness of risk management actions, and (ii) (re)directing research efforts towards important sources of uncertainty.
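For orientation, the sketch below shows the simplest way per-section failure probabilities and breach damages can be condensed into a system-level flooding probability and an expected annual damage, assuming independence between sections purely for illustration; the numbers are hypothetical, and VNK2 itself models dependence between failure mechanisms and sections.

```python
import numpy as np

# Hypothetical annual failure probabilities for the sections of one levee
# system, and damages (million EUR) if the corresponding breach occurs.
p_fail = np.array([1e-4, 5e-5, 2e-4, 1e-4])
damage = np.array([800.0, 1500.0, 600.0, 1200.0])

# System failure = at least one section fails; independence assumed here
# purely for illustration.
p_system = 1.0 - np.prod(1.0 - p_fail)
expected_damage = float(np.dot(p_fail, damage))   # million EUR per year

print(f"annual probability of flooding: {p_system:.2e}")
print(f"expected annual damage: {expected_damage:.3f} million EUR")
```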
