Similar Literature
20 similar documents found.
1.
Risk Analysis, 2016, 36(2): 191-202
We live in an age that increasingly calls for national or regional management of global risks. This article discusses the contributions that expert elicitation can bring to efforts to manage global risks and identifies challenges faced in conducting expert elicitation at this scale. In doing so, it draws on lessons learned from conducting an expert elicitation as part of the World Health Organization's (WHO) initiative to estimate the global burden of foodborne disease, a study commissioned by the Foodborne Disease Epidemiology Reference Group (FERG). Expert elicitation is designed to fill gaps in data and research using structured, transparent methods. Such gaps are a significant challenge for global risk modeling. Experience with the WHO FERG expert elicitation shows that it is feasible to conduct an expert elicitation at a global scale, but that challenges do arise, including: defining an informative yet feasible geographical structure for the elicitation; defining what constitutes expertise in a global setting; structuring international, multidisciplinary expert panels; and managing demands on experts' time during the elicitation. This article was written as part of the workshop "Methods for Research Synthesis: A Cross-Disciplinary Approach," held at the Harvard Center for Risk Analysis on October 13, 2013.

2.
Reducing the Risk of Human Extinction
In this century a number of events could extinguish humanity. The probability of these events may be very low, but the expected value of preventing them could be high, as it represents the value of all future human lives. We review the challenges to studying human extinction risks and, by way of example, estimate the cost-effectiveness of preventing extinction-level asteroid impacts.
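The expected-value argument in this abstract can be made concrete with a back-of-the-envelope calculation. The sketch below is purely illustrative; every number (impact probability, lives at stake, program cost, risk reduction) is an assumption, not a figure from the paper.

```python
# Illustrative expected-value arithmetic for extinction-risk reduction.
# All numbers are hypothetical assumptions, not estimates from the paper.

p_impact_per_century = 1e-8   # assumed probability of an extinction-level asteroid impact
lives_at_stake = 1e16         # assumed count of current plus future human lives
program_cost = 1e9            # assumed cost of a detection/deflection program (USD)
risk_reduction = 0.5          # assumed fraction of the risk the program removes

expected_lives_saved = p_impact_per_century * risk_reduction * lives_at_stake
cost_per_life_saved = program_cost / expected_lives_saved
print(f"Expected lives saved: {expected_lives_saved:.2e}")
print(f"Cost per expected life saved: ${cost_per_life_saved:,.2f}")
```

Even with a tiny impact probability, the enormous number of lives at stake can drive the cost per expected life saved down to a few dollars, which is the paper's central point.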

3.
Full Bayesian estimation of correlated event rates can be computationally challenging, since the estimators are typically intractable. When estimation of event rates represents one activity within a larger modeling process, there is an incentive to develop more efficient inference than a full Bayesian model provides. We develop a new subjective inference method for correlated event rates based on a Bayes linear Bayes model, under the assumption that events are generated from a homogeneous Poisson process. To reduce the elicitation burden, we introduce homogenization factors to the model, and, as an alternative to a subjective prior, we develop an empirical method using the method of moments. Inference under the new method is compared against estimates obtained under a full Bayesian model with a multivariate gamma prior, for which the predictive and posterior distributions are derived in terms of well-known functions. The mathematical properties of both models are presented. A simulation study shows that the Bayes linear Bayes inference method and the full Bayesian model provide equally reliable estimates. An illustrative example, motivated by the problem of estimating correlated event rates across different users in a simple supply chain, shows how ignoring the correlation leads to biased estimation of the event rates.
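A minimal sketch of the Bayes linear adjustment that underlies this kind of model: prior beliefs about two correlated Poisson rates are updated from observed counts using only means, variances, and covariances, with no full distributional computation. All prior moments, exposures, and counts below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Bayes linear adjustment for two correlated Poisson event rates.
E_lam = np.array([2.0, 3.0])                 # prior means of the rates
C_lam = np.array([[0.5, 0.3],
                  [0.3, 0.8]])               # prior covariance of the rates
t = np.array([10.0, 10.0])                   # exposure times
x = np.array([35.0, 28.0])                   # observed event counts

# Induced moments of the counts X_i ~ Poisson(lam_i * t_i):
E_X = t * E_lam
C_X = np.diag(t * E_lam) + np.outer(t, t) * C_lam  # Var(X) = t*E[lam] + t^2*Var(lam)
C_lamX = C_lam * t                                 # Cov(lam_i, X_j) = t_j * Cov(lam_i, lam_j)

# Bayes linear update: E_x[lam] = E[lam] + Cov(lam, X) Var(X)^{-1} (x - E[X])
E_lam_adj = E_lam + C_lamX @ np.linalg.solve(C_X, x - E_X)
C_lam_adj = C_lam - C_lamX @ np.linalg.solve(C_X, C_lamX.T)
print("adjusted expectations:", E_lam_adj)
print("adjusted variances:", np.diag(C_lam_adj))
```

Because the prior covariance couples the two rates, a surprising count for one user also shifts the adjusted expectation for the other, which is exactly the effect the paper shows is lost when correlation is ignored.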

4.
This article proposes, develops, and illustrates the application of level-k game theory to adversarial risk analysis. Level-k reasoning, which assumes that players play strategically but with bounded rationality, is useful for operationalizing a Bayesian approach to adversarial risk analysis. It can be applied in a broad class of settings, including those with asynchronous play and partial but incomplete revelation of early moves, and its computational and elicitation requirements are modest. We illustrate the approach with an application to a simple defend-attack model in which the defender's countermeasures are revealed to the attacker with probability less than one before he decides whether or how to attack.
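The level-k recursion is easy to show on a toy simultaneous defend-attack game: a level-0 player acts nonstrategically (here, uniformly at random), and a level-k player best-responds to a level-(k-1) opponent. The payoff matrix and the uniform level-0 assumption below are illustrative, not the paper's model.

```python
import numpy as np

# Toy level-k reasoning for a simultaneous defend-attack game.
# Rows: defender actions; columns: attacker actions. Payoffs are assumed.
defender_loss = np.array([[0.0, 9.0, 4.0],
                          [2.0, 3.0, 6.0],
                          [3.0, 5.0, 1.0]])
attacker_gain = defender_loss           # assume zero-sum for simplicity

def best_defense(attacker_probs):
    """Level-(k) defender: minimize expected loss vs beliefs about the attacker."""
    return int(np.argmin(defender_loss @ attacker_probs))

def best_attack(defender_probs):
    """Level-(k) attacker: maximize expected gain vs beliefs about the defender."""
    return int(np.argmax(defender_probs @ attacker_gain))

# Level 0: nonstrategic, uniformly random play.
n_d, n_a = defender_loss.shape
att_beliefs = np.ones(n_a) / n_a
def_beliefs = np.ones(n_d) / n_d

for k in range(1, 4):
    d = best_defense(att_beliefs)   # level-k defender vs level-(k-1) attacker
    a = best_attack(def_beliefs)    # level-k attacker vs level-(k-1) defender
    print(f"level {k}: defender plays {d}, attacker plays {a}")
    # Level-k beliefs for the next round put all mass on the level-k choice.
    att_beliefs = np.eye(n_a)[a]
    def_beliefs = np.eye(n_d)[d]
```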

5.
Standard statistical methods understate the uncertainty one should attach to effect estimates obtained from observational data. Among the methods used to address this problem are sensitivity analysis, Monte Carlo risk analysis (MCRA), and Bayesian uncertainty assessment. Estimates from MCRAs have been presented as if they were valid frequentist or Bayesian results, but examples show that in actual applications they need be neither. It is concluded that both sensitivity analyses and MCRA should begin with the same type of prior specification effort as a Bayesian analysis.

6.
Probability elicitation protocols are used to assess and incorporate subjective probabilities in risk and decision analysis. While most of these protocols use methods that focus on the precision of the elicited probabilities, the speed of the elicitation process has often been neglected. However, speed is also important, particularly when experts need to examine a large number of events on a recurrent basis. Furthermore, most existing elicitation methods are numerical in nature, but there are various reasons why an expert might refuse to give such precise ratio-scale estimates, even if highly numerate. This may occur, for instance, when there is a lack of sufficient hard evidence, when assessing very uncertain events (such as emergent threats), or when dealing with politicized topics (such as terrorism or disease outbreaks). In this article, we adopt an ordinal ranking approach from multicriteria decision analysis to provide a fast and nonnumerical probability elicitation process. Probabilities are subsequently approximated from the ranking by an algorithm based on the principle of maximum entropy, a rule compatible with the ordinal information provided by the expert. The method can elicit probabilities for a wide range of event types, including new ways of eliciting probabilities for stochastically independent events and low-probability events. We use a Monte Carlo simulation to test the accuracy of the approximated probabilities and try the method in practice, applying it to a real-world risk analysis recently conducted for DEFRA (the U.K. Department for Environment, Food and Rural Affairs): the prioritization of animal health threats.
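The article's own algorithm is not reproduced here, but the general idea of maximum entropy subject to an ordinal ranking can be sketched. Note that the ranking alone yields the uniform distribution, so this sketch assumes the expert also supplies one quantitative anchor (a floor on the top-ranked event); both the anchor and the numbers are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import minimize

# Maximum-entropy probabilities consistent with an ordinal ranking
# p1 >= p2 >= ... >= pn, plus one assumed anchor p1 >= 0.4.
# Generic illustration, not the article's algorithm.
n = 4
floor_top = 0.4

def neg_entropy(p):
    p = np.clip(p, 1e-12, 1.0)
    return float(np.sum(p * np.log(p)))

cons = [{"type": "eq", "fun": lambda p: np.sum(p) - 1.0},
        {"type": "ineq", "fun": lambda p: p[0] - floor_top}]
for i in range(n - 1):  # ordering constraints p[i] >= p[i+1]
    cons.append({"type": "ineq", "fun": lambda p, i=i: p[i] - p[i + 1]})

res = minimize(neg_entropy, x0=np.array([0.5, 0.3, 0.15, 0.05]), method="SLSQP",
               bounds=[(0.0, 1.0)] * n, constraints=cons)
print("approximated probabilities:", np.round(res.x, 3))  # ~[0.4, 0.2, 0.2, 0.2]
```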

7.
Bayesian methods are presented for updating the uncertainty in the predictions of an integrated Environmental Health Risk Assessment (EHRA) model. The methods allow the estimation of posterior uncertainty distributions based on the observation of different model outputs along the chain of the linked assessment framework. Analytical equations are derived for the case of the multiplicative lognormal risk model, where the sequential log outputs (log ambient concentration, log applied dose, log delivered dose, and log risk) are each normally distributed. Given observations of a log output made with a normally distributed measurement error, the posterior distributions of the log outputs remain normal, but with modified means and variances, and with induced correlations between successive log outputs and log inputs. The analytical equations for forward and backward propagation of the updates are generally applicable to sums of normally distributed variables. The Bayesian Monte Carlo (BMC) procedure is presented as an approximate but more broadly applicable method for numerically updating uncertainty with concurrent backward and forward propagation. Illustrative examples, presented for the multiplicative lognormal model, demonstrate agreement between the analytical and BMC methods, and show how uncertainty updates can propagate through a linked EHRA. The Bayesian updating methods facilitate the pooling of knowledge encoded in predictive models with that transmitted by research outcomes (e.g., field measurements), and thereby support the practice of iterative risk assessment and value-of-information appraisals.
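A minimal sketch of the analytical case: on the log scale the multiplicative chain becomes a sum of normals, an observation updates the sum conjugately, and each input's mean shifts in proportion to its share of the prior variance (the standard backward-propagation result for sums of normals). The chain terms and all numbers are illustrative assumptions.

```python
import numpy as np

# Conjugate normal updating in a multiplicative lognormal risk chain:
# log_risk = log_conc + log_dose_factor + log_potency, each term normal.
terms = {"log_conc":        (np.log(10.0), 0.5 ** 2),
         "log_dose_factor": (np.log(0.02), 0.3 ** 2),
         "log_potency":     (np.log(1e-3), 0.7 ** 2)}

# Forward propagation: the sum of independent normals is normal.
mu_prior = sum(m for m, v in terms.values())
var_prior = sum(v for m, v in terms.values())

# A measurement of log risk with normally distributed error.
y_obs, tau2 = np.log(3e-4), 0.2 ** 2

# Precision-weighted posterior for log risk.
prec = 1.0 / var_prior + 1.0 / tau2
mu_post = (mu_prior / var_prior + y_obs / tau2) / prec
var_post = 1.0 / prec
print(f"log risk: N({mu_prior:.3f}, {var_prior:.3f}) -> N({mu_post:.3f}, {var_post:.3f})")

# Backward propagation: each input's mean shifts in proportion to its
# share of the prior variance.
shift = mu_post - mu_prior
for name, (m, v) in terms.items():
    print(f"{name}: prior mean {m:.3f} -> posterior mean {m + shift * v / var_prior:.3f}")
```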

8.
Regulatory agencies often perform microbial risk assessments to evaluate the change in the number of human illnesses as the result of a new policy that reduces the level of contamination in the food supply. These agencies generally have regulatory authority over the production and retail sectors of the farm-to-table continuum. Any predicted change in contamination that results from a new policy regulating production practices occurs many steps prior to consumption of the product. This study proposes a framework for conducting microbial food-safety risk assessments; this framework can be used to quantitatively assess the annual effects of national regulatory policies. Advantages of the framework are that estimates of human illnesses are consistent with national disease surveillance data (which are usually summarized on an annual basis) and that some of the modeling steps that occur between production and consumption can be collapsed or eliminated. The framework leads to probabilistic models that include uncertainty and variability in critical input parameters; these models can be solved using a number of different Bayesian methods. The Bayesian synthesis method performs well for this application and generates posterior distributions of parameters that are relevant to assessing the effect of implementing a new policy. An example, based on Campylobacter and chicken, estimates the annual number of illnesses avoided by a hypothetical policy; this output could be used to assess the economic benefits of a new policy. Empirical validation of the policy effect is also examined by estimating the annual change in the numbers of illnesses observed via disease surveillance systems.
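The anchoring idea can be sketched in a few lines of Monte Carlo: start from an annual surveillance count, inflate it by an uncertain underreporting factor, and apply an uncertain policy-induced reduction in contamination. Every distribution and number below is an illustrative assumption, not an FSIS or WHO value.

```python
import numpy as np

# Monte Carlo sketch of "annual illnesses avoided" anchored to surveillance
# data, with uncertainty on underreporting and on the policy effect.
rng = np.random.default_rng(1)
n = 100_000

reported_cases = 40_000                             # annual surveillance count
underreport = rng.lognormal(np.log(30), 0.3, n)     # true cases per reported case
baseline_illness = reported_cases * underreport

# Assumed low-dose linearity: illnesses scale with the fraction of
# contaminated product reaching consumers.
reduction = rng.beta(8, 12, n)                      # policy cuts contamination ~40%
avoided = baseline_illness * reduction

print(f"median illnesses avoided per year: {np.median(avoided):,.0f}")
lo, hi = np.percentile(avoided, [2.5, 97.5])
print(f"95% uncertainty interval: {lo:,.0f} - {hi:,.0f}")
```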

9.
Recent studies have shown that climate change and socioeconomic trends are expected to increase flood risks in many regions. However, in these studies human behavior is commonly assumed to be constant, which neglects the interactions and feedback loops between human and environmental systems. This neglect of human adaptation leads to a misrepresentation of flood risk. This article presents an agent-based model that incorporates human decision making in flood risk analysis. In particular, household investments in loss-reducing measures are examined under three economic decision models: (1) expected utility theory, the traditional economic model of rational agents; (2) prospect theory, which takes account of bounded rationality; and (3) a prospect theory model that accounts for changing risk perceptions and social interactions through a process of Bayesian updating. We show that neglecting human behavior in flood risk assessment studies can result in considerable misestimation of future flood risk, which in our case study amounts to an overestimation by a factor of two. Furthermore, we show how behavior models can support flood risk analysis under different behavioral assumptions, illustrating the need to include the dynamic adaptive behavior of, for instance, households, insurers, and governments. The method presented here provides a solid basis for exploring human behavior and the resulting flood risk with respect to low-probability/high-impact risks.
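The core behavioral contrast can be sketched for a single household agent deciding whether to buy a loss-reducing measure, evaluated under expected utility and under a prospect-theory rule with probability weighting (functional forms per Tversky and Kahneman, 1992). All parameter values are illustrative assumptions, not calibrated values from the article.

```python
# Household decision to buy a loss-reducing measure, under expected
# utility theory (EUT) and prospect theory (PT). Numbers are assumed.
p_flood = 0.01            # annual flood probability
loss = 50_000.0           # flood damage without protection
mitigated_loss = 10_000.0
measure_cost = 400.0      # annualized cost of the measure

def eut_decision(wealth=100_000.0, rho=1.5):
    """CRRA expected utility: invest if protected EU exceeds unprotected EU."""
    u = lambda w: w ** (1 - rho) / (1 - rho)
    eu_no = p_flood * u(wealth - loss) + (1 - p_flood) * u(wealth)
    eu_yes = (p_flood * u(wealth - mitigated_loss - measure_cost)
              + (1 - p_flood) * u(wealth - measure_cost))
    return eu_yes > eu_no

def pt_decision(alpha=0.88, lam=2.25, gamma=0.69):
    """PT over losses relative to the status quo, with weighted probabilities."""
    w = lambda p: p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)
    v = lambda x: -lam * (-x) ** alpha if x < 0 else x ** alpha
    pt_no = w(p_flood) * v(-loss)
    pt_yes = (w(p_flood) * v(-mitigated_loss - measure_cost)
              + (1 - w(p_flood)) * v(-measure_cost))
    return pt_yes > pt_no

print("EUT household invests:", eut_decision())
print("PT household invests: ", pt_decision())
```

In an agent-based setting, each household would run such a rule every time step, with perceived p_flood updated from experienced floods and neighbors' behavior.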

10.
Reassessing Benzene Cancer Risks Using Internal Doses
Human cancer risks from benzene exposure have previously been estimated by regulatory agencies based primarily on epidemiological data, with supporting evidence provided by animal bioassay data. This paper reexamines the animal-based risk assessments for benzene using physiologically-based pharmacokinetic (PBPK) models of benzene metabolism in animals and humans. It demonstrates that internal doses (interpreted as total benzene metabolites formed) from oral gavage experiments in mice are well predicted by a PBPK model developed by Travis et al. Both the data and the model outputs can also be accurately described by the simple nonlinear regression model total metabolites = 76.4x/(80.75 + x), where x = administered dose in mg/kg/day. Thus, PBPK modeling validates the use of such nonlinear regression models, previously used by Bailer and Hoel. An important finding is that refitting the linearized multistage (LMS) model family to internal doses and observed responses changes the maximum-likelihood estimate (MLE) dose-response curve for mice from linear-quadratic to cubic, leading to low-dose risk estimates smaller than in previous risk assessments. This is consistent with the conclusion for mice from the Bailer and Hoel analysis. An innovation in this paper is estimation of internal doses for humans based on a PBPK model (and the regression model approximating it) rather than on interspecies dose conversions. Estimates of human risks at low doses are reduced by the use of internal dose estimates when the estimates are obtained from a PBPK model, in contrast to Bailer and Hoel's findings based on interspecies dose conversion. Sensitivity analyses and comparisons with epidemiological data and risk models suggest that our finding of a nonlinear MLE dose-response curve at low doses is robust to changes in assumptions and more consistent with epidemiological data than earlier risk models.
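The saturable regression model quoted in the abstract, total metabolites = 76.4x/(80.75 + x), is a Michaelis-Menten-type curve and is easy to refit. The sketch below generates synthetic "data" from the reported fit plus noise, purely to illustrate the regression step; the dose grid and noise level are assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

# Refit the saturable internal-dose model from the abstract:
# total metabolites = a*x / (b + x), reported fit a=76.4, b=80.75.
def metabolites(x, a, b):
    return a * x / (b + x)

rng = np.random.default_rng(0)
dose = np.array([1, 5, 10, 25, 50, 100, 200, 400], dtype=float)  # mg/kg/day
obs = metabolites(dose, 76.4, 80.75) * rng.normal(1.0, 0.05, dose.size)  # synthetic

(a_hat, b_hat), cov = curve_fit(metabolites, dose, obs, p0=(50.0, 50.0))
print(f"a = {a_hat:.1f} (reported 76.4), b = {b_hat:.2f} (reported 80.75)")
print("internal dose at x = 10 mg/kg/day:", round(metabolites(10.0, a_hat, b_hat), 2))
```

The curve's saturation at high administered doses is what flattens the internal-dose scale and shifts the fitted LMS dose-response curve toward nonlinearity at low doses.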

11.
A novel approach to the quantitative assessment of food-borne risks is proposed. The basic idea is to use Bayesian techniques in two distinct steps: first, by constructing a stochastic core model via a Bayesian network based on expert knowledge, and second, by using the available data to improve this knowledge. Unlike the Monte Carlo simulation approach commonly used in quantitative assessment of food-borne risks, where data sets are used independently in each module, our consistent procedure incorporates the information conveyed by data throughout the chain. It allows "back-calculation" in the food chain model, together with the use of data obtained "downstream" in the food chain. Moreover, expert knowledge is introduced more simply and consistently than with classical statistical methods. Other advantages of this approach include the clear framework of an iterative learning process, considerable flexibility enabling the use of heterogeneous data, and a justified method to explore the effects of variability and uncertainty. As an illustration, we present an estimation of the probability of contracting campylobacteriosis as a result of broiler contamination, from the standpoint of quantitative risk assessment. Although the model thus constructed is oversimplified, it clarifies the principles and properties of the proposed method, demonstrates its ability to deal with quite complex situations, and provides a useful basis for further discussions with different experts in the food chain.
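The "back-calculation" idea reduces, in its smallest case, to Bayes' rule on a two-node network fragment: a forward pass from contamination to illness, and a backward pass from downstream illness data to the contamination node. The structure and all probabilities below are illustrative assumptions, not the paper's broiler model.

```python
# Minimal two-node Bayesian-network fragment of a food-chain model:
# broiler contamination C -> illness I, with back-calculation via Bayes' rule.
p_contaminated = 0.3                  # prior: P(C=1), from expert knowledge
p_ill_given = {1: 0.02, 0: 0.001}     # P(I=1 | C)

# Forward pass: marginal probability of illness per serving.
p_ill = (p_ill_given[1] * p_contaminated
         + p_ill_given[0] * (1 - p_contaminated))

# Backward pass: probability the serving was contaminated given illness.
p_contam_given_ill = p_ill_given[1] * p_contaminated / p_ill
print(f"P(illness) = {p_ill:.4f}")
print(f"P(contaminated | illness) = {p_contam_given_ill:.3f}")
```

In the full network the same conditioning is applied at every node, so downstream data (e.g., illness surveillance) coherently revise upstream quantities (e.g., flock prevalence).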

12.
This paper addresses the use of data for identifying and characterizing uncertainties in model parameters and predictions. The Bayesian Monte Carlo method is formally presented and elaborated, and applied to the analysis of the uncertainty in a predictive model for global mean sea level change. The method uses observations of output variables, made with an assumed error structure, to determine a posterior distribution of model outputs. This is used to derive a posterior distribution for the model parameters. Results demonstrate the resolution of the uncertainty that is obtained as a result of the Bayesian analysis and also indicate the key contributors to the uncertainty in the sea level rise model. While the technique is illustrated with a simple, preliminary model, the analysis provides an iterative framework for model refinement. The methodology developed in this paper provides a mechanism for the incorporation of ongoing data collection and research in decision-making for problems involving uncertain environmental change.
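Bayesian Monte Carlo reduces to a simple recipe: draw parameters from their priors, run the model, and weight each draw by the likelihood of the observed output under the assumed error structure. The toy model and all numbers below are illustrative assumptions, not the paper's sea-level model.

```python
import numpy as np

# Bayesian Monte Carlo: prior sampling followed by likelihood weighting.
rng = np.random.default_rng(42)
n = 50_000

# Toy model: sea-level rise (cm) = sensitivity * forcing
sensitivity = rng.normal(2.0, 0.8, n)   # prior on an uncertain parameter
forcing = rng.normal(3.0, 0.5, n)       # prior on an uncertain input
predicted = sensitivity * forcing

# Observation of the output with normal measurement error (sd = 1 cm).
obs, obs_sd = 5.0, 1.0
weights = np.exp(-0.5 * ((predicted - obs) / obs_sd) ** 2)
weights /= weights.sum()

post_mean = np.sum(weights * sensitivity)
post_sd = np.sqrt(np.sum(weights * (sensitivity - post_mean) ** 2))
print(f"sensitivity: prior 2.0 +/- 0.8 -> posterior {post_mean:.2f} +/- {post_sd:.2f}")
print(f"posterior mean prediction: {np.sum(weights * predicted):.2f} cm")
```

The posterior over the outputs induces the posterior over the parameters, which is exactly the output-to-parameter direction the abstract describes.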

13.
Yacov Y. Haimes, Risk Analysis, 2012, 32(11): 1834-1845
Natural and human-induced disasters affect organizations in myriad ways because of the inherent interconnectedness and interdependencies among human, cyber, and physical infrastructures, but more importantly, because organizations depend on the effectiveness of people and on the leadership they provide to the organizations they serve and represent. These human-organizational-cyber-physical infrastructure entities are termed systems of systems. Given the multiple perspectives that characterize them, they cannot be modeled effectively with a single model. The focus of this article is: (i) the centrality of the states of a system in modeling; (ii) the efficacious role of shared states in modeling systems of systems, in identification, and in the meta-modeling of systems of systems; and (iii) the contributions of the above to strategic preparedness for, response to, and recovery from catastrophic risk to such systems. Strategic preparedness connotes a decision-making process and its associated actions. These must be implemented in advance of a natural or human-induced disaster, aimed at reducing consequences (e.g., recovery time, community suffering, and cost), and/or at controlling their likelihood to a level considered acceptable (through the decision makers' implicit and explicit acceptance of various risks and tradeoffs). The inoperability input-output model (IIM), which is grounded in Leontief's input-output model, has enabled the modeling of interdependent subsystems. Two separate modeling structures are introduced: phantom system models (PSM), where shared states constitute the essence of modeling coupled systems; and the IIM, where interdependencies among sectors of the economy are manifested by the Leontief matrix of technological coefficients. This article demonstrates the potential contributions of these two models to each other, and thus to a more informative modeling of systems-of-systems schema. The contributions of shared states to this modeling and to systems identification are presented through case studies.
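The IIM's core computation is a Leontief-style fixed point: sector inoperability q satisfies q = A*q + c*, so the equilibrium is q = (I - A*)^(-1) c*. The interdependency matrix and initial perturbation below are illustrative assumptions for three sectors.

```python
import numpy as np

# Inoperability input-output model (IIM): q = A* q + c*,
# so the equilibrium inoperability is q = (I - A*)^{-1} c*.
A_star = np.array([[0.0, 0.2, 0.1],   # normalized interdependency matrix
                   [0.3, 0.0, 0.2],   # (assumed values; spectral radius < 1)
                   [0.1, 0.4, 0.0]])
c_star = np.array([0.15, 0.0, 0.0])   # direct perturbation hits sector 0 only

q = np.linalg.solve(np.eye(3) - A_star, c_star)
for i, qi in enumerate(q):
    print(f"sector {i}: equilibrium inoperability {qi:.3f}")
```

Even though the disruption strikes only sector 0, the coupling terms in A* propagate nonzero inoperability to the other sectors, which is the interdependency effect the model is built to capture.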

14.
Risk-related knowledge gained from past construction projects is regarded as potentially extremely useful in risk management. This article describes a proposed approach to capture and integrate risk-related knowledge to support decision making in construction projects. To ameliorate the scarcity of risk information often encountered in construction projects, Bayesian belief networks are used and expert judgment is elicited to augment the available information. In particular, the article provides an overview of the judgment-based biases that can appear in the elicitation of judgments for constructing Bayesian networks, and the provisos that can be made to minimize these types of bias. The proposed approach is successfully applied to develop six models for top risks in tunnel works. More than 30 tunneling experts in the Netherlands and Germany were involved in the investigation, providing information for identifying relevant scenarios that can lead to failure events associated with tunneling risks. The article illustrates the applicability of the developed approach for the case of "face instability in soft soils using slurry shields."

15.
Estimating the potential health risks associated with recycled (reused) water is highly complex, given the multiple factors affecting water quality. We take a conceptual model that represents the factors and pathways by which recycled water may pose a risk of contracting gastroenteritis, convert it to a Bayesian net, and quantify the model using one expert's opinion. This allows us to make various predictions as to the risks posed under various scenarios. Bayesian nets provide an additional way of modeling the determinants of recycled water quality and elucidating their relative influence on a given disease outcome. An important contribution to Bayesian net methodology is that all model predictions, whether risk or relative risk estimates, are expressed as credible intervals.

16.
Louis Anthony Cox, Jr., Risk Analysis, 2008, 28(6): 1749-1761
Several important risk analysis methods now used in setting priorities for protecting U.S. infrastructures against terrorist attacks are based on the formula Risk = Threat × Vulnerability × Consequence. This article identifies potential limitations in such methods that can undermine their ability to guide resource allocations toward effectively optimized risk reductions. After considering specific examples for the Risk Analysis and Management for Critical Asset Protection (RAMCAP) framework used by the Department of Homeland Security, we address more fundamental limitations of the product formula. These include its failure to adjust for correlations among its components; nonadditivity of risks estimated using the formula; inability to use risk-scoring results to optimally allocate defensive resources; and the intrinsic subjectivity and ambiguity of Threat, Vulnerability, and Consequence numbers. Trying to directly assess probabilities for the actions of intelligent antagonists, instead of modeling how they adaptively pursue their goals in light of available information and experience, can produce ambiguous or mistaken risk estimates. Recent work demonstrates that two-level (or few-level) hierarchical optimization models can provide a useful alternative to Risk = Threat × Vulnerability × Consequence scoring rules, and also to probabilistic risk assessment (PRA) techniques that ignore rational planning and adaptation. In such two-level optimization models, the defender predicts the attacker's best response to the defender's own actions, and then chooses his or her own actions taking these best responses into account. Such models appear valuable as practical approaches to antiterrorism risk analysis.
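The contrast between static scoring and two-level optimization shows up even in a toy three-target problem; below, the defender anticipates the attacker's best response to each defense option and minimizes the resulting maximum damage. The vulnerability matrix, consequences, and threat weights are all illustrative assumptions.

```python
import numpy as np

# Two-level defender-attacker optimization vs. static T x V x C scoring,
# on a toy three-target problem with a one-unit defense budget.
vuln = np.array([[0.9, 0.9, 0.9],   # vuln[d, t]: vulnerability of target t
                 [0.2, 0.9, 0.9],   # under defense option d
                 [0.9, 0.2, 0.9],   # (option d >= 1 hardens target d-1)
                 [0.9, 0.9, 0.2]])
consequence = np.array([100.0, 60.0, 40.0])

# Bi-level model: for each defense, an adaptive attacker picks the target
# with the highest expected damage; the defender minimizes that maximum.
attacker_payoff = vuln * consequence
best_attack = attacker_payoff.argmax(axis=1)
defender_risk = attacker_payoff.max(axis=1)
d_opt = defender_risk.argmin()
print(f"optimal defense: option {d_opt}, attacker then hits target "
      f"{best_attack[d_opt]}, residual risk {defender_risk[d_opt]:.1f}")

# Static T x V x C scoring with fixed threat probabilities can rank
# differently because it ignores the attacker's adaptation to the defense.
threat = np.array([0.5, 0.3, 0.2])
print("static TxVxC scores (no defense):", threat * vuln[0] * consequence)
```

Here static scoring prioritizes only the highest-consequence target, while the bi-level model reveals that hardening it merely shifts the attack, which is the adaptation effect the article emphasizes.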

17.
Fault diagnosis includes the main task of classification. Bayesian networks (BNs) present several advantages in the classification task, and previous works have suggested their use as classifiers. Because a classifier is often only one part of a larger decision process, this article proposes, for industrial process diagnosis, the use of a Bayesian method called the dynamic Markov blanket classifier, whose main goal is the induction of accurate Bayesian classifiers that have dependable probability estimates and reveal the actual relationships among the most relevant variables. In addition, a new method, named variable ordering multiple offspring sampling, capable of inducing a BN to be used as a classifier, is presented. The performance of these methods is assessed on data from a benchmark problem known as the Tennessee Eastman process. The obtained results are compared with naive Bayes and tree-augmented network classifiers, and confirm that both proposed algorithms can provide good classification accuracy as well as knowledge about the relevant variables.
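The proposed algorithms are not publicly reproduced here, but the naive Bayes baseline the paper compares against is a few lines of scikit-learn. The synthetic "fault" data below merely stand in for the Tennessee Eastman measurements (which are not bundled with this sketch); class offsets and sizes are assumptions.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

# Naive Bayes baseline for fault classification on synthetic sensor data.
rng = np.random.default_rng(7)
n_per_class, n_vars = 200, 5
means = np.array([0.0, 1.5, -1.5])    # per-fault-class sensor offsets (assumed)

X = np.vstack([rng.normal(m, 1.0, (n_per_class, n_vars)) for m in means])
y = np.repeat([0, 1, 2], n_per_class) # fault class labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = GaussianNB().fit(X_tr, y_tr)
print(f"naive Bayes fault-classification accuracy: {clf.score(X_te, y_te):.3f}")
```

A structure-learning classifier such as the paper's dynamic Markov blanket classifier additionally identifies which sensors are relevant, which naive Bayes cannot do.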

18.
Analysis of competing hypotheses, a method for evaluating explanations of observed evidence, is used in numerous fields, including counterterrorism, psychology, and intelligence analysis. We propose a Bayesian extension of the methodology, posing the problem in terms of a multinomial-Dirichlet hierarchical model. The yet-to-be-observed true hypothesis is regarded as a multinomial random variable, and the evaluation of the evidence is treated as a structured elicitation of a prior distribution on the probabilities of the hypotheses. This model provides the user with measures of uncertainty for the probabilities of the hypotheses. We discuss inference, such as point and interval estimates of hypothesis probabilities, ratios of hypothesis probabilities, and Bayes factors. A simple example involving the stadium relocation of the San Diego Chargers is used to illustrate the method. We also present several extensions of the model that enable it to handle special types of evidence, including evidence that is irrelevant to one or more hypotheses, evidence against hypotheses, and evidence that is subject to deception.
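The multinomial-Dirichlet machinery is compact: evidence evaluation becomes Dirichlet pseudo-counts over the hypotheses, and Monte Carlo draws from the resulting Dirichlet give credible intervals and probability ratios. The three hypotheses and pseudo-counts below are illustrative assumptions, not the paper's elicitation.

```python
import numpy as np

# Multinomial-Dirichlet treatment of analysis of competing hypotheses.
alpha = np.array([1.0, 1.0, 1.0])              # noninformative base measure
evidence_support = np.array([6.0, 3.0, 1.0])   # elicited pseudo-counts (assumed)
posterior = alpha + evidence_support

rng = np.random.default_rng(0)
draws = rng.dirichlet(posterior, size=100_000)

for h in range(3):
    lo, hi = np.percentile(draws[:, h], [2.5, 97.5])
    print(f"H{h+1}: mean {draws[:, h].mean():.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")

# Ratio of hypothesis probabilities, with uncertainty:
ratio = draws[:, 0] / draws[:, 1]
print(f"P(H1)/P(H2): median {np.median(ratio):.2f}")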

19.
Ali Mosleh, Risk Analysis, 2012, 32(11): 1888-1900
Credit risk is the potential exposure of a creditor to an obligor's failure or refusal to repay a debt, in principal or interest. This potential exposure is measured in terms of the probability of default. Many models have been developed to estimate credit risk, with rating agencies dating back to the 19th century; they provide their assessments of the probability of default and the transition probabilities of various firms in their annual reports. Regulatory capital requirements for credit risk outlined by the Basel Committee on Banking Supervision have made it essential for banks and financial institutions to develop sophisticated models in an attempt to measure credit risk with higher accuracy. The Bayesian framework proposed in this article uses techniques developed in the physical sciences and engineering for dealing with model uncertainty and expert accuracy to obtain improved estimates of credit risk and the associated uncertainties. The approach uses estimates from one or more rating agencies and incorporates their historical accuracy (past performance data) in estimating future default risk and transition probabilities. Several examples demonstrate that the proposed methodology can assess default probability with accuracy exceeding the estimates of any of the individual models. Moreover, the methodology accounts for potentially significant departures from "nominal predictions" due to "upsetting events" such as the 2008 global banking crisis.
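One simple way to realize "incorporating historical accuracy" is to treat each agency's estimate as a noisy measurement of the true default log-odds, with noise variance calibrated from its past performance, and combine them by precision weighting. This is a simplification of the article's framework, and every number below is an illustrative assumption.

```python
import numpy as np

# Accuracy-weighted pooling of default-probability (PD) estimates from two
# rating agencies, treating each log-odds estimate as a noisy measurement.
logit = lambda p: np.log(p / (1 - p))
inv_logit = lambda z: 1 / (1 + np.exp(-z))

estimates = np.array([0.020, 0.045])   # agencies' PD estimates (assumed)
hist_sd = np.array([0.40, 0.90])       # historical log-odds error sd per agency

# Precision-weighted combination (diffuse prior on the true log-odds).
w = 1 / hist_sd ** 2
z_post = np.sum(w * logit(estimates)) / np.sum(w)
sd_post = np.sqrt(1 / np.sum(w))
print(f"combined PD: {inv_logit(z_post):.4f}")
print(f"95% interval: {inv_logit(z_post - 1.96 * sd_post):.4f} - "
      f"{inv_logit(z_post + 1.96 * sd_post):.4f}")
```

The historically more accurate agency dominates the pooled estimate, and the posterior interval quantifies the residual uncertainty rather than reporting a single point PD.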

20.
To quantify the health benefits of environmental policies, economists generally require estimates of the reduced probability of illness or death. For policies that reduce exposure to carcinogenic substances, these estimates have traditionally been obtained through linear extrapolation of experimental dose-response data to low-exposure scenarios, as described in the U.S. Environmental Protection Agency's Guidelines for Carcinogen Risk Assessment (1986). In response to evolving scientific knowledge, EPA proposed revisions to the guidelines in 1996. Under the proposed revisions, dose-response relationships would not be estimated for carcinogens thought to exhibit nonlinear modes of action. Such a change in cancer-risk assessment methods and outputs will likely have serious consequences for how benefit-cost analyses of policies aimed at reducing cancer risks are conducted. Any tendency toward reduced quantification of effects in environmental risk assessments, such as that contemplated in the revisions to EPA's cancer-risk assessment guidelines, impedes the ability of economic analysts to respond to increasing calls for benefit-cost analysis. This article examines the implications of the proposed changes to the 1986 Guidelines for benefit-cost analysis of carcinogenic exposures, and proposes an approach for bounding dose-response relationships when no biologically based models are available. In spite of the more limited quantitative information provided in a carcinogen risk assessment under the proposed revisions, we argue that reasonable bounds on dose-response relationships can be estimated for low-level exposures to nonlinear carcinogens. This approach yields estimates of reduced illness for use in a benefit-cost analysis while incorporating evidence of nonlinearities in the dose-response relationship. As an illustration, the bounding approach is applied to the case of chloroform exposure.
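A bounding approach of this general kind can be sketched by bracketing the low-dose response between a linear no-threshold upper bound and a threshold-style lower bound. The slope and point of departure below are assumptions for illustration, not the article's chloroform values.

```python
# Bounding a nonlinear carcinogen's low-dose response between a linear
# upper bound and a threshold-style lower bound. Numbers are assumed.
slope = 1e-3   # upper-bound linear slope per mg/kg/day (assumed)
pod = 5.0      # assumed point of departure (mg/kg/day)

def risk_bounds(dose):
    upper = slope * dose                                  # linear no-threshold bound
    lower = 0.0 if dose < pod else slope * (dose - pod)   # threshold-style bound
    return lower, upper

for d in (0.5, 2.0, 10.0):
    lo, hi = risk_bounds(d)
    print(f"dose {d:5.1f} mg/kg/day: extra risk in [{lo:.2e}, {hi:.2e}]")
```

A benefit-cost analysis can then be run at both bounds, giving a range of avoided illnesses rather than no quantitative estimate at all.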
