Similar Articles
20 similar articles found
1.
《Risk analysis》2018,38(8):1585-1600
Historical data analysis shows that escalation accidents, so‐called domino effects, play an important role in disastrous accidents in the chemical and process industries. In this study, an agent‐based modeling and simulation approach is proposed to study the propagation of domino effects in the chemical and process industries. Unlike analytical or Monte Carlo simulation approaches, which normally study the domino effect at the probabilistic network level, the agent‐based modeling technique explains domino effects from a bottom‐up perspective. In this approach, the installations involved in a domino effect are modeled as agents, whereas the interactions among the installations (e.g., by means of heat radiation) are modeled via the basic rules of the agents. Application of the developed model to several case studies demonstrates the ability of the model not only to capture higher‐level domino effects and synergistic effects but also to account for temporal dependencies. The model can readily be applied to large‐scale complicated cases.
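The bottom‐up idea described in this abstract can be illustrated with a minimal sketch: installations are agents, a burning agent radiates heat, and an agent whose total received flux exceeds its escalation threshold fails in the next time step. The class names, the inverse‐square attenuation, and all numerical values below are assumptions for illustration, not the authors' actual model.

```python
class Installation:
    """One process installation modeled as an agent."""

    def __init__(self, name, x, y, source_flux, threshold):
        self.name = name
        self.x, self.y = x, y
        self.source_flux = source_flux   # kW/m^2 emitted when on fire (assumed)
        self.threshold = threshold       # kW/m^2 needed to escalate (assumed)
        self.on_fire = False

    def received_flux(self, other):
        # Simple inverse-square attenuation of heat radiation (assumption).
        d2 = (self.x - other.x) ** 2 + (self.y - other.y) ** 2
        return other.source_flux / max(d2, 1.0)


def propagate(installations, first_failure, max_steps=10):
    """Return the order in which installations fail (higher-level dominoes)."""
    first_failure.on_fire = True
    failed_order = [first_failure.name]
    for _ in range(max_steps):
        newly_failed = []
        for agent in installations:
            if agent.on_fire:
                continue
            # Synergistic effect: fluxes from all burning agents add up.
            total = sum(agent.received_flux(a) for a in installations if a.on_fire)
            if total >= agent.threshold:
                newly_failed.append(agent)
        if not newly_failed:
            break
        for agent in newly_failed:
            agent.on_fire = True
            failed_order.append(agent.name)
    return failed_order


tanks = [Installation("T1", 0, 0, 400.0, 15.0),
         Installation("T2", 4, 0, 400.0, 15.0),
         Installation("T3", 8, 0, 400.0, 15.0)]
order = propagate(tanks, tanks[0])
```

In this toy layout, T1's fire alone cannot ignite T3, but once T2 burns, the combined flux from T1 and T2 does: a second‐level domino driven by the synergistic effect the abstract mentions.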

2.
This article presents an asset‐level security risk management framework to assist stakeholders of critical assets with allocating limited budgets for enhancing their safety and security against terrorist attack. The proposed framework models the security system of an asset, considers various threat scenarios, and models the sequential decision framework of attackers during the attack. Its novel contributions are the introduction of the notion of partial neutralization of attackers by defenders, estimation of total loss from successful, partially successful, and unsuccessful actions of attackers at various stages of an attack, and inclusion of the effects of these losses on the choices made by terrorists at various stages of the attack. The application of the proposed method is demonstrated in an example dealing with security risk management of a U.S. commercial airport, in which a set of plausible threat scenarios and risk mitigation options are considered. It is found that a combination of providing blast‐resistant cargo containers and a video surveillance system on the airport perimeter fence is the best option based on minimum expected life‐cycle cost considering a 10‐year service period.

3.
The Monte Carlo (MC) simulation approach is traditionally used in food safety risk assessment to study quantitative microbial risk assessment (QMRA) models. When experimental data are available, performing Bayesian inference is a good alternative approach that allows backward calculation in a stochastic QMRA model to update the experts’ knowledge about the microbial dynamics of a given food‐borne pathogen. In this article, we propose a complex example where Bayesian inference is applied to a high‐dimensional second‐order QMRA model. The case study is a farm‐to‐fork QMRA model considering genetic diversity of Bacillus cereus in a cooked, pasteurized, and chilled courgette purée. Experimental data are Bacillus cereus concentrations measured in packages of courgette purées stored at different time‐temperature profiles after pasteurization. To perform a Bayesian inference, we first built an augmented Bayesian network by linking a second‐order QMRA model to the available contamination data. We then ran a Markov chain Monte Carlo (MCMC) algorithm to update all the unknown concentrations and unknown quantities of the augmented model. About 25% of the prior beliefs are strongly updated, leading to a reduction in uncertainty. Interestingly, some updates call aspects of the QMRA model into question.
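The updating step at the heart of this abstract can be sketched with a one‐parameter Metropolis sampler: an expert prior on a mean microbial level is updated with observed concentration data. The model, the synthetic data, and all numerical values are assumptions for illustration; the article's augmented Bayesian network is far higher‐dimensional.

```python
import math
import random

random.seed(1)

data = [3.1, 2.8, 3.4, 3.0, 2.9]   # synthetic log10 CFU/g after storage
prior_mean, prior_sd = 2.0, 1.0     # assumed expert prior on the mean level
obs_sd = 0.5                        # assumed measurement noise


def log_post(mu):
    """Unnormalized log-posterior: Gaussian prior times Gaussian likelihood."""
    lp = -0.5 * ((mu - prior_mean) / prior_sd) ** 2
    lp += sum(-0.5 * ((y - mu) / obs_sd) ** 2 for y in data)
    return lp


mu, chain = prior_mean, []
for i in range(20000):
    prop = mu + random.gauss(0, 0.3)           # random-walk proposal
    if math.log(random.random()) < log_post(prop) - log_post(mu):
        mu = prop                              # accept
    if i >= 5000:                              # discard burn-in
        chain.append(mu)

post_mean = sum(chain) / len(chain)
post_sd = (sum((m - post_mean) ** 2 for m in chain) / len(chain)) ** 0.5
```

The posterior mean moves from the prior guess of 2.0 toward the data's mean near 3.0, and the posterior standard deviation shrinks well below the prior's, mirroring the "reduction in uncertainty" the abstract reports.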

4.
We analyze the issue of agency costs in aviation security by combining results from a quantitative economic model with a qualitative study based on semi‐structured interviews. Our model extends previous principal‐agent models by combining the traditional fixed and varying monetary responses to physical and cognitive effort with nonmonetary welfare and potentially transferable value of employees' own human capital. To provide empirical evidence for the tradeoffs identified in the quantitative model, we have undertaken an extensive interview process with regulators, airport managers, security personnel, and those tasked with training security personnel from an airport operating in a relatively high‐risk state, Turkey. Our results indicate that the effectiveness of additional training depends on the mix of “transferable skills” and “emotional” buy‐in of the security agents. Principals need to identify on which side of a critical tipping point their agents are to ensure that additional training, with attached expectations of the burden of work, aligns the incentives of employees with the principals' own objectives.

5.
Risk Analysis for Critical Asset Protection (cited 2 times: 0 self-citations, 2 by others)
This article proposes a quantitative risk assessment and management framework that supports strategic asset-level resource allocation decision making for critical infrastructure and key resource protection. The proposed framework consists of five phases: scenario identification, consequence and criticality assessment, security vulnerability assessment, threat likelihood assessment, and benefit-cost analysis. Key innovations in this methodology include its initial focus on fundamental asset characteristics to generate an exhaustive set of plausible threat scenarios based on a target susceptibility matrix (which we refer to as asset-driven analysis) and an approach to threat likelihood assessment that captures adversary tendencies to shift their preferences in response to security investments based on the expected utilities of alternative attack profiles assessed from the adversary perspective. A notional example is provided to demonstrate an application of the proposed framework. Extensions of this model to support strategic portfolio-level analysis and tactical risk analysis are suggested.

6.
The treatment of uncertainties associated with modeling and risk assessment has recently attracted significant attention. The methodology and guidance for dealing with parameter uncertainty have been fairly well developed and quantitative tools such as Monte Carlo modeling are often recommended. However, the issue of model uncertainty is still rarely addressed in practical applications of risk assessment. The use of several alternative models to derive a range of model outputs or risks is one of a few available techniques. This article addresses the often-overlooked issue of what we call "modeler uncertainty," i.e., differences in problem formulation, model implementation, and parameter selection originating from subjective interpretation of the problem at hand. This study uses results from the Fruit Working Group, which was created under the International Atomic Energy Agency (IAEA) BIOMASS program (BIOsphere Modeling and ASSessment). Model-model and model-data intercomparisons reviewed in this study were conducted by the working group for a total of three different scenarios. The greatest uncertainty was found to result from modelers' interpretation of scenarios and approximations made by modelers. In scenarios that were unclear for modelers, the initial differences in model predictions were as high as seven orders of magnitude. Only after several meetings and discussions about specific assumptions did the predictions of the various models converge. Our study shows that parameter uncertainty (as evaluated by a probabilistic Monte Carlo assessment) may have contributed over one order of magnitude to the overall modeling uncertainty. The final model predictions ranged between one and three orders of magnitude, depending on the specific scenario. This study illustrates the importance of problem formulation and implementation of an analytic-deliberative process in risk characterization.

7.
In November 2001, the Monterey Institute of International Studies convened a workshop on bioterrorism threat assessment and risk management. Risk assessment practitioners from various disciplines, but without specialized knowledge of terrorism, were brought together with security and intelligence threat analysts to stimulate an exchange that could be useful to both communities. This article, prepared by a subset of the participants, comments on the workshop's findings and their implications and makes three recommendations, two short term (use of threat assessment methodologies and vulnerability analysis) and one long term (application of quantitative risk assessment and modeling), regarding the practical application of risk assessment methods to bioterrorism issues.

8.
Topics in Microbial Risk Assessment: Dynamic Flow Tree Process (cited 5 times: 0 self-citations, 5 by others)
Microbial risk assessment is emerging as a new discipline in risk assessment. A systematic approach to microbial risk assessment is presented that employs data analysis for developing parsimonious models and accounts formally for the variability and uncertainty of model inputs using analysis of variance and Monte Carlo simulation. The purpose of the paper is to raise and examine issues in conducting microbial risk assessments. The enteric pathogen Escherichia coli O157:H7 was selected as an example for this study due to its significance to public health. The framework for our work is consistent with the risk assessment components described by the National Research Council in 1983 (hazard identification; exposure assessment; dose-response assessment; and risk characterization). Exposure assessment focuses on hamburgers, cooked at a range of temperatures from rare to well done, the latter being typical of fast food restaurants. Features of the model include predictive microbiology components that account for stochastic growth and death of organisms in hamburger. For dose-response modeling, Shigella data from human feeding studies were used as a surrogate for E. coli O157:H7. Risks were calculated using a threshold model and an alternative nonthreshold model. The 95% probability intervals for risk of illness for product cooked to a given internal temperature spanned five orders of magnitude for these models. The existence of even a small threshold has a dramatic impact on the estimated risk.
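The threshold-versus-nonthreshold comparison in this abstract can be sketched with a small Monte Carlo run: the same exponential dose-response model is evaluated with and without a minimum infectious dose. The dose distribution and both parameters are assumptions for illustration, not the paper's fitted Shigella surrogate values.

```python
import math
import random

random.seed(0)
r = 1e-3            # assumed per-organism probability of causing illness
threshold = 10      # assumed minimum infectious dose (organisms)


def risk_nonthreshold(dose):
    """Exponential dose-response model."""
    return 1 - math.exp(-r * dose)


def risk_threshold(dose):
    """Same model, but zero risk below a minimum infectious dose."""
    return 0.0 if dose < threshold else 1 - math.exp(-r * dose)


# Monte Carlo over servings: assumed lognormal surviving dose after cooking.
doses = [random.lognormvariate(0.0, 2.0) for _ in range(50000)]
mean_nt = sum(risk_nonthreshold(d) for d in doses) / len(doses)
mean_t = sum(risk_threshold(d) for d in doses) / len(doses)
```

Because most simulated servings carry fewer organisms than the threshold, the threshold model's mean risk falls well below the nonthreshold model's, echoing the abstract's point that even a small threshold has a dramatic impact on estimated risk.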

9.

Supply chain design is a complex, relatively unstructured process that involves choosing many decision parameters and usually requires consideration of numerous sources of uncertainty. Many conventional processes of supply chain design take a deterministic approach, using point estimates for important measures of supply chain effectiveness such as cost, quality, delivery reliability, and service levels. Supply chain disruptions are often considered separately as risks, both in the research literature and in practice, meaning that a purely traditional risk management and risk minimization approach is taken. We have developed and applied an approach that combines the intellect and experience of the supply chain designer with the power of evaluation provided by a Monte Carlo simulation model, which uses decision analysis techniques to explicitly incorporate the full spectrum of uncertain quantities across the set of alternative supply chain designs being considered. After defining and setting out the general decision variables and uncertainty factors for 16 distinct supply chain design decision categories, we then apply that approach to combine the decision-makers’ heuristics with the probabilistic modeling approach, iteratively, to achieve the best of both elements of such an approach. This novel approach to fully integrating performance and risk elements of supply chain designs is then illustrated with a case study. Finally, we call for further developmental research and field work to refine this approach.

10.
According to Codex Alimentarius Commission recommendations, management options applied at the process production level should be based on good hygiene practices, the HACCP system, and new risk management metrics such as the food safety objective. To follow this last recommendation, quantitative microbiological risk assessment is an appealing approach for linking new risk‐based metrics to management options that may be applied by food operators. Through a specific case study, Listeria monocytogenes in soft cheese made from pasteurized milk, the objective of the present article is to show in practical terms how quantitative risk assessment could be used to direct potential intervention strategies at different food processing steps. The model, which rests on many assumptions, estimates the risk of listeriosis at the moment of consumption taking into account the entire manufacturing process and potential sources of contamination. From pasteurization to consumption, the amplification of an initial contamination event in the milk, the fresh cheese, or the process environment is simulated over time, space, and between products, accounting for the impact of management options such as hygienic operations and sampling plans. A sensitivity analysis of the model helps prioritize the data to be collected for improving and validating the model. What‐if scenarios were simulated and allowed for the identification of major parameters contributing to the risk of listeriosis and the optimization of preventive and corrective measures.

11.
Multicriteria decision analysis (MCDA) has been applied to various energy problems to incorporate a variety of qualitative and quantitative criteria, usually spanning environmental, social, engineering, and economic fields. MCDA and associated methods such as life‐cycle assessments and cost‐benefit analysis can also include risk analysis to address uncertainties in criteria estimates. One technology now being assessed to help mitigate climate change is carbon capture and storage (CCS). CCS is a new process that captures CO2 emissions from fossil‐fueled power plants and injects them into geological reservoirs for storage. It presents a unique challenge to decisionmakers (DMs) due to its technical complexity, range of environmental, social, and economic impacts, variety of stakeholders, and long time spans. The authors have developed a risk assessment model using a MCDA approach for CCS decisions such as selecting between CO2 storage locations and choosing among different mitigation actions for reducing risks. The model includes uncertainty measures for several factors, utility curve representations of all variables, Monte Carlo simulation, and sensitivity analysis. This article uses a CCS scenario example to demonstrate the development and application of the model based on data derived from published articles and publicly available sources. The model allows high‐level DMs to better understand project risks and the tradeoffs inherent in modern, complex energy decisions.
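The combination of weighted utilities and Monte Carlo simulation that this abstract describes can be sketched as follows: each criterion score for two candidate storage sites is uncertain, and repeated sampling yields the probability that one site outranks the other. The criteria, weights, and score ranges below are illustrative assumptions, not values from the article.

```python
import random

random.seed(3)

# Assumed criterion weights (sum to 1).
weights = {"cost": 0.4, "leakage_risk": 0.35, "capacity": 0.25}

# Assumed (low, high) utility ranges per criterion, on a 0-1 scale,
# expressing uncertainty in each site's scores.
sites = {
    "site_A": {"cost": (0.5, 0.8), "leakage_risk": (0.6, 0.9),
               "capacity": (0.4, 0.7)},
    "site_B": {"cost": (0.3, 0.9), "leakage_risk": (0.2, 0.8),
               "capacity": (0.6, 1.0)},
}


def sampled_score(site):
    """Draw one weighted-utility score with uncertain criterion values."""
    return sum(w * random.uniform(*site[c]) for c, w in weights.items())


N = 20000
wins_A = sum(sampled_score(sites["site_A"]) > sampled_score(sites["site_B"])
             for _ in range(N))
p_A_better = wins_A / N
```

Rather than a single deterministic ranking, the output is a probability that site A outranks site B, which is the kind of risk-aware tradeoff information the abstract says the model gives decisionmakers.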

12.
《Risk analysis》2018,38(6):1258-1278
Although individual behavior plays a major role in community flood risk, traditional flood risk models generally do not capture information on how community policies and individual decisions impact the evolution of flood risk over time. The purpose of this study is to improve the understanding of the temporal aspects of flood risk through a combined analysis of the behavioral, engineering, and physical hazard aspects of flood risk. Additionally, the study aims to develop a new modeling approach for integrating behavior, policy, flood hazards, and engineering interventions. An agent‐based model (ABM) is used to analyze the influence of flood protection measures, individual behavior, and the occurrence of floods and near‐miss flood events on community flood risk. The ABM focuses on the following decisions and behaviors: dissemination of flood management information, installation of community flood protection, elevation of household mechanical equipment, and elevation of homes. The approach is place based, with a case study area in Fargo, North Dakota, but is focused on generalizable insights. Generally, community mitigation results in reduced future damage, and individual action, including mitigation and movement into and out of high‐risk areas, can have a significant influence on community flood risk. The results of this study provide useful insights into the interplay between individual and community actions and how it affects the evolution of flood risk, and they point to priorities for future work, including the development of more in‐depth behavioral and decision rules at the individual and community level.

13.
This paper presents a method of estimating long-term exposures to point source emissions. The method consists of a Monte Carlo exposure model (PSEM or Point Source Exposure Model) that combines data on population mobility and mortality with information on daily activity patterns. The approach behind the model can be applied to a wide variety of exposure scenarios. In this paper, PSEM is used to characterize the range and distribution of lifetime equivalent doses received by inhalation of air contaminated by the emissions of a point source. The output of the model provides quantitative information on the dose, age, and gender of highly exposed individuals. The model is then used in an example risk assessment. Finally, future uses of the model's approach are discussed.

14.
The purpose of the paper is to demonstrate the usefulness of (1) system dynamics as a structural theory for operations management and (2) system dynamics models as content theories in operations management. The key findings are that, although feedback loops, accumulation processes, and delays exist and are widespread in operations management, often these phenomena are ignored completely or not considered appropriately. Hence, it is reasoned why system dynamics is well suited as an approach for many operations management studies, and it is shown how system dynamics theory can be used to explain, analyze, and understand such phenomena in operations management. The discussion is based on a literature review and on conceptual considerations, with examples of operations management studies based on system dynamics. Implications of using this theory include the necessary re‐framing of some operations management issues and the extension of empirical studies by dynamic modeling and simulation. The value of the paper lies in the conceptualization of the link between system dynamics and operations management, which is discussed on the level of theory.

15.
In the event of contamination of a water distribution system, decisions must be made to mitigate the impact of the contamination and to protect public health. Making threat management decisions while a contaminant spreads through the network is a dynamic and interactive process. Response actions taken by the utility managers and water consumption choices made by the consumers will affect the hydraulics, and thus the spread of the contaminant plume, in the network. A modeling framework that allows the simulation of a contamination event under the effects of actions taken by utility managers and consumers will be a useful tool for the analysis of alternative threat mitigation and management strategies. This article presents a multiagent modeling framework that combines agent‐based, mechanistic, and dynamic methods. Agents select actions based on a set of rules that represent an individual's autonomy, goal‐based desires, and reaction to the environment and the actions of other agents. Consumer behaviors including ingestion, mobility, reduction of water demands, and word‐of‐mouth communication are simulated. Management strategies are evaluated, including opening hydrants to flush the contaminant and broadcasting public notifications. As actions taken by consumer agents and utility operators affect demands and flows in the system, the mechanistic model is updated. Management strategies are evaluated based on the exposure of the population to the contaminant. The framework is designed to consider the typical issues involved in water distribution threat management and provides valuable analysis of threat containment strategies for water distribution system contamination events.

16.
This article presents ongoing research on the efficient allocation of defense resources to minimize the damage inflicted on a spatially distributed physical network, such as a pipeline, water system, or power distribution system, by an active adversary. Defending against an opponent who is aware of, and anticipating, the defender's mitigation efforts differs fundamentally from preparing for natural disasters such as hurricanes and earthquakes, or even for accidental system failures. Our approach is to utilize a combination of integer programming and agent‐based modeling to allocate the defensive resources. We conceptualize the problem as a Stackelberg “leader follower” game where the defender first places his assets to defend key areas of the network, and the attacker then seeks to inflict the maximum damage possible within the constraints of resources and network structure. The criticality of arcs in the network is estimated by a deterministic network interdiction formulation, which then informs an evolutionary agent‐based simulation. The evolutionary agent‐based simulation is used to determine the allocation of resources for attackers and defenders that results in evolutionary stable strategies, where actions by either side alone cannot increase its share of victories. We demonstrate these techniques on an example network, comparing the evolutionary agent‐based results to a more traditional, probabilistic risk analysis (PRA) approach. Our results show that the agent‐based approach results in a greater percentage of defender victories than does the PRA‐based approach.
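The leader-follower logic in this abstract can be sketched as a toy minimax enumeration: the defender first hardens a subset of arcs, then the attacker destroys the most damaging unprotected arcs, and the defender chooses the hardening that minimizes the attacker's best response. Arc damage values and budgets are illustrative assumptions; the article uses an interdiction formulation plus an evolutionary agent-based simulation rather than this brute-force enumeration.

```python
from itertools import combinations

# Assumed damage values for five network arcs.
arc_damage = {"a": 9, "b": 7, "c": 6, "d": 4, "e": 2}
DEFEND, ATTACK = 2, 2   # arcs each side can cover (assumed budgets)


def attacker_best(defended):
    """Follower: attack the most damaging arcs left unprotected."""
    losses = sorted((v for k, v in arc_damage.items() if k not in defended),
                    reverse=True)
    return sum(losses[:ATTACK])


# Leader: enumerate defenses, keep the one with the smallest worst-case loss.
best_defense, best_loss = None, float("inf")
for defended in combinations(arc_damage, DEFEND):
    loss = attacker_best(set(defended))
    if loss < best_loss:
        best_defense, best_loss = set(defended), loss
```

Here the defender hardens the two most critical arcs, conceding the third- and fourth-ranked ones; no unilateral change by either side improves its outcome, which is the flavor of the stable strategies the simulation searches for.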

17.
Estimation of benchmark doses (BMDs) in quantitative risk assessment traditionally is based upon parametric dose‐response modeling. It is a well‐known concern, however, that if the chosen parametric model is uncertain and/or misspecified, inaccurate and possibly unsafe low‐dose inferences can result. We describe a nonparametric approach for estimating BMDs with quantal‐response data based on an isotonic regression method, and also study use of corresponding, nonparametric, bootstrap‐based confidence limits for the BMD. We explore the confidence limits’ small‐sample properties via a simulation study, and illustrate the calculations with an example from cancer risk assessment. It is seen that this nonparametric approach can provide a useful alternative for BMD estimation when faced with the problem of parametric model uncertainty.
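The core of the nonparametric approach described here can be sketched with the pool-adjacent-violators algorithm (PAVA), the standard way to fit an isotonic (monotone nondecreasing) dose-response to quantal data, after which a BMD for a chosen benchmark response is read off by interpolation. The dose-group data and the 10% extra-risk level below are illustrative assumptions, not the article's example.

```python
def pava(props, weights):
    """Weighted pool-adjacent-violators: nondecreasing fit to `props`."""
    vals, wts = list(props), list(weights)
    blocks = [[i] for i in range(len(vals))]
    i = 0
    while i < len(vals) - 1:
        if vals[i] > vals[i + 1]:          # monotonicity violated: pool
            w = wts[i] + wts[i + 1]
            v = (vals[i] * wts[i] + vals[i + 1] * wts[i + 1]) / w
            vals[i:i + 2], wts[i:i + 2] = [v], [w]
            blocks[i:i + 2] = [blocks[i] + blocks[i + 1]]
            i = max(i - 1, 0)              # re-check the previous pair
        else:
            i += 1
    fit = [0.0] * sum(len(b) for b in blocks)
    for v, b in zip(vals, blocks):
        for j in b:
            fit[j] = v
    return fit


doses = [0.0, 10.0, 50.0, 150.0, 400.0]
n = [50, 50, 50, 50, 50]
affected = [2, 5, 4, 14, 30]               # note the non-monotone raw counts
props = [a / m for a, m in zip(affected, n)]
fit = pava(props, n)

# Extra risk over background p0: (p(d) - p0) / (1 - p0) = BMR.
p0, bmr = fit[0], 0.10
target = p0 + bmr * (1 - p0)
# Interpolate within the first dose interval where the fit crosses `target`.
for k in range(1, len(doses)):
    if fit[k] >= target:
        frac = (target - fit[k - 1]) / (fit[k] - fit[k - 1])
        bmd = doses[k - 1] + frac * (doses[k] - doses[k - 1])
        break
```

PAVA pools the two middle dose groups whose raw proportions dip out of order, and the BMD falls between the tested doses without any parametric curve being assumed. The article's bootstrap confidence limits would repeat this calculation on resampled data.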

18.
Methods to Approximate Joint Uncertainty and Variability in Risk (cited 3 times: 0 self-citations, 3 by others)
As interest in quantitative analysis of joint uncertainty and interindividual variability (JUV) in risk grows, so does the need for related computational shortcuts. To quantify JUV in risk, Monte Carlo methods typically require nested sampling of JUV in distributed inputs, which is cumbersome and time-consuming. Two approximation methods proposed here allow simpler and more rapid analysis. The first consists of new upper-bound JUV estimators that involve only uncertainty or variability, not both, and so never require nested sampling to calculate. The second is a discrete-probability-calculus procedure that uses only the mean and one upper-tail mean for each input in order to estimate mean and upper-bound risk, a procedure simpler and more intuitive than similar ones in use. Application of these methods is illustrated in an assessment of cancer risk from residential exposures to chloroform in the Kanawha Valley, West Virginia. Because each of the multiple exposure pathways considered in this assessment had separate modeled sources of uncertainty and variability, the assessment illustrates a realistic case where a standard Monte Carlo approach to JUV analysis requires nested sampling. In the illustration, the first proposed method quantified JUV in cancer risk much more efficiently than corresponding nested Monte Carlo calculations. The second proposed method also nearly duplicated JUV-related and other estimates of risk obtained using Monte Carlo methods. Both methods were thus found adequate to obtain basic risk estimates accounting for JUV in a realistically complex risk assessment. These methods make routine JUV analysis more convenient and practical.
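The nested ("two-dimensional") Monte Carlo scheme that these shortcuts are designed to avoid can be sketched as follows: an outer loop samples uncertain parameters, and an inner loop samples interindividual variability given those parameters. The exposure model, distributions, and all numbers below are illustrative assumptions, not the chloroform assessment's values.

```python
import random

random.seed(7)
N_UNC, N_VAR = 200, 500             # outer (uncertainty) and inner (variability) sizes
slope_mu, slope_sd = 1e-2, 3e-3     # assumed uncertain potency (per mg/kg-day)

pop_mean_risks = []
for _ in range(N_UNC):                            # uncertainty loop
    slope = max(random.gauss(slope_mu, slope_sd), 0.0)
    risks = []
    for _ in range(N_VAR):                        # variability loop
        # Dose varies from person to person (assumed lognormal, mg/kg-day).
        dose = random.lognormvariate(-1.0, 0.8)
        risks.append(min(slope * dose, 1.0))      # linear low-dose model, capped
    pop_mean_risks.append(sum(risks) / N_VAR)

pop_mean_risks.sort()
median_risk = pop_mean_risks[N_UNC // 2]          # central estimate
ub95_risk = pop_mean_risks[int(0.95 * N_UNC)]     # upper-bound (95th pct) estimate
```

Each outer iteration yields one population-mean risk, so the outer sample describes uncertainty about that mean; the N_UNC × N_VAR cost of this nesting is exactly what the article's upper-bound estimators and discrete-probability-calculus procedure sidestep.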

19.
As flood risks grow worldwide, a well‐designed insurance program engaging various stakeholders becomes a vital instrument in flood risk management. The main challenge concerns the applicability of standard approaches for calculating insurance premiums of rare catastrophic losses. This article focuses on the design of a flood‐loss‐sharing program involving private insurance based on location‐specific exposures. The analysis is guided by a developed integrated catastrophe risk management (ICRM) model consisting of a GIS‐based flood model and a stochastic optimization procedure with respect to location‐specific risk exposures. To achieve the stability and robustness of the program with respect to floods of various recurrence intervals, the ICRM uses a stochastic optimization procedure that relies on quantile‐related risk functions of a systemic insolvency involving overpayments and underpayments of the stakeholders. Two alternative ways of calculating insurance premiums are compared: the robust premiums derived with the ICRM and the traditional average annual loss approach. The applicability of the proposed model is illustrated in a case study of a Rotterdam area outside the main flood protection system in the Netherlands. Our numerical experiments demonstrate essential advantages of the robust premiums, namely, that they: (1) guarantee the program's solvency under all relevant flood scenarios rather than one average event; (2) establish a tradeoff between the security of the program and the welfare of locations; and (3) decrease the need for other risk transfer and risk reduction measures.
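The contrast between the two premium rules this abstract compares can be sketched numerically: an average-annual-loss (AAL) premium versus a quantile-based "robust" premium that also covers rare, large floods. The loss distribution and the 99% solvency level below are illustrative assumptions, not the ICRM's actual stochastic optimization.

```python
import random

random.seed(5)


def annual_loss():
    """One simulated year: rare catastrophic flood, else minor losses."""
    if random.random() < 0.02:                 # assumed 2% catastrophe chance
        return random.uniform(500.0, 2000.0)   # catastrophic loss (monetary units)
    return random.uniform(0.0, 10.0)           # ordinary year

losses = sorted(annual_loss() for _ in range(100000))

aal_premium = sum(losses) / len(losses)            # traditional AAL premium
robust_premium = losses[int(0.99 * len(losses))]   # 99th-percentile ("robust") premium

# Solvency check: fraction of years in which the premium covers the loss.
solvency_aal = sum(l <= aal_premium for l in losses) / len(losses)
solvency_robust = sum(l <= robust_premium for l in losses) / len(losses)
```

The AAL premium covers the average event but leaves the program insolvent in every catastrophic year, whereas the quantile-based premium covers all but the rarest scenarios, illustrating the solvency advantage the abstract attributes to the robust premiums.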

20.
Landfilling is a cost‐effective waste disposal method, which makes it a widely used practice around the world, especially in developing countries. However, because of improper landfill management, leachate leakage can have adverse impacts on soils, plants, groundwater, aquatic organisms, and, subsequently, human health. A comprehensive survey of the literature finds that the probabilistic quantification of uncertainty in estimates of the human health risks due to landfill leachate contamination has rarely been reported. Hence, in the present study, the uncertainty about the human health risks from municipal solid waste landfill leachate contamination to children and adults was quantified to investigate its long‐term risks by using a Monte Carlo simulation framework for selected heavy metals. The recently commissioned Turbhe sanitary landfill of Navi Mumbai, India, was selected to understand the fate and transport of heavy metals in leachate. A large residential area is located near the site, which makes the risk assessment problem both crucial and challenging. This article proposes an integrated framework to quantify the uncertainty intrinsic to human health risk estimation. A set of nonparametric cubic splines was fitted to identify the nonlinear seasonal trend in leachate quality parameters. LandSim 2.5, a landfill simulator, was used to simulate the landfill activities for various time slices, and further uncertainty in noncarcinogenic human health risk was estimated using a Monte Carlo simulation followed by univariate and multivariate sensitivity analyses.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司)  京ICP备09084417号