Similar Articles
20 similar articles found.
1.
Major industrial accidents occurring at so-called major hazard installations may cause domino accidents, which are among the most destructive industrial accidents occurring today. As an area may contain many hazard installations, a primary accident scenario may propagate from one installation to another, and correlations exist in the probability calculations of domino effects. In addition, during the propagation of a domino effect, accidents of diverse types may occur, some of them having a synergistic effect while others do not. These characteristics make the analytical formulation of domino accidents very complex. In this work, a simple matrix-based modeling approach for domino effect analysis is proposed. Matrices can be used to represent the mutual influences of different escalation vectors between installations. On this basis, an analysis approach for accident propagation, as well as a simulation-based algorithm for probability calculation of accidents and accident levels, is provided. The applicability and flexibility of this approach are discussed while applying it to estimate domino probabilities in a case study.
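As a rough illustration of the matrix idea, the following Python sketch propagates a primary accident through a hypothetical escalation-probability matrix by Monte Carlo simulation (all values are invented, not taken from the study):

```python
# Minimal sketch of matrix-based domino propagation (illustrative values only).
import numpy as np

rng = np.random.default_rng(0)

# esc[i, j]: assumed probability that an accident at installation i
# escalates to installation j (diagonal is zero; values are hypothetical).
esc = np.array([
    [0.00, 0.30, 0.10],
    [0.20, 0.00, 0.40],
    [0.05, 0.25, 0.00],
])

def simulate(primary, esc, n_trials=100_000):
    """Monte Carlo estimate of the probability each unit is eventually hit."""
    n = esc.shape[0]
    hits = np.zeros(n)
    for _ in range(n_trials):
        burning = {primary}
        frontier = {primary}
        while frontier:
            new = set()
            for i in frontier:
                for j in range(n):
                    if j not in burning and rng.random() < esc[i, j]:
                        new.add(j)
            burning |= new
            frontier = new
        for j in burning:
            hits[j] += 1
    return hits / n_trials

print(simulate(primary=0, esc=esc))
```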

2.
In the present study, we have introduced a methodology based on graph theory and multicriteria decision analysis for cost‐effective fire protection of chemical plants subject to fire‐induced domino effects. By modeling domino effects in chemical plants as a directed graph, the graph centrality measures such as out‐closeness and betweenness scores can be used to identify the installations playing a key role in initiating and propagating potential domino effects. It is demonstrated that active fire protection of installations with the highest out‐closeness score and passive fire protection of installations with the highest betweenness score are the most effective strategies for reducing the vulnerability of chemical plants to fire‐induced domino effects. We have employed a dynamic graph analysis to investigate the impact of both the availability and the degradation of fire protection measures over time on the vulnerability of chemical plants. The results obtained from the graph analysis can further be prioritized using multicriteria decision analysis techniques such as the method of reference point to find the most cost‐effective fire protection strategy.
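A minimal sketch of the two centrality measures with networkx, on a hypothetical plant graph (networkx's closeness_centrality uses incoming distances, so out-closeness is computed on the reversed graph):

```python
# Toy directed escalation graph; edge u -> v means "fire at u can escalate to v".
# Node names and edges are hypothetical.
import networkx as nx

G = nx.DiGraph([
    ("T1", "T2"), ("T1", "T3"), ("T2", "T3"),
    ("T3", "T4"), ("T2", "T4"), ("T4", "T5"),
])

# Out-closeness: how quickly a node reaches the rest of the plant.
out_closeness = nx.closeness_centrality(G.reverse())
# Betweenness: how often a node lies on escalation paths between others.
betweenness = nx.betweenness_centrality(G)

# Candidate for active protection: strongest initiator (highest out-closeness).
print(max(out_closeness, key=out_closeness.get))
# Candidate for passive protection: key propagation broker (highest betweenness).
print(max(betweenness, key=betweenness.get))
```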

3.
Domino effects are low‐probability high‐consequence accidents causing severe damage to humans, process plants, and the environment. Because domino effects affect large areas and are difficult to control, preventive safety measures have been given priority over mitigative measures. As a result, safety distances and safety inventories have been used as preventive safety measures to reduce the escalation probability of domino effects. However, these safety measures are usually designed considering static accident scenarios. In this study, we show that compared to a static worst‐case accident analysis, a dynamic consequence analysis provides a more rational approach for risk assessment and management of domino effects. This study also presents the application of Bayesian networks and conflict analysis to risk‐based allocation of chemical inventories to minimize the consequences and thus to reduce the escalation probability. It emphasizes the risk management of chemical inventories as an inherent safety measure, particularly in existing process plants where the applicability of other safety measures such as safety distances is limited.

4.
Domino Effect Analysis Using Bayesian Networks
A new methodology based on Bayesian networks is introduced, both to model domino effect propagation patterns and to estimate the domino effect probability at different levels. The flexible structure and the unique modeling techniques offered by Bayesian networks make it possible to analyze domino effects through a probabilistic framework, considering synergistic effects, noisy probabilities, and common cause failures. Further, the uncertainties and the complex interactions among the domino effect components are captured using Bayesian networks. The probabilities of events are updated in the light of new information, and the most probable path of the domino effect is determined on the basis of the new data gathered. This study shows how probability updating helps to update the domino effect model either qualitatively or quantitatively. The methodology is applied to a hypothetical example and also to an earlier‐studied case study. These examples accentuate the effectiveness of Bayesian networks in modeling domino effects in processing facilities.
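A hand-rolled sketch of the updating idea on a three-unit escalation chain (probabilities are hypothetical; a real analysis would use a BN library and include synergistic and common-cause terms):

```python
# Sketch of probability updating in a three-unit domino chain.
from itertools import product

P_primary = 0.02          # assumed probability of the primary fire at unit 1
P_esc12 = 0.30            # assumed P(unit 2 fire | unit 1 fire)
P_esc23 = 0.40            # assumed P(unit 3 fire | unit 2 fire)

def joint(u1, u2, u3):
    """Joint probability of a (unit1, unit2, unit3) fire/no-fire state."""
    p = P_primary if u1 else 1 - P_primary
    p *= (P_esc12 if u2 else 1 - P_esc12) if u1 else (0.0 if u2 else 1.0)
    p *= (P_esc23 if u3 else 1 - P_esc23) if u2 else (0.0 if u3 else 1.0)
    return p

states = list(product([0, 1], repeat=3))

# Prior probability that the domino effect reaches the third unit.
prior_u3 = sum(joint(*s) for s in states if s[2] == 1)

# Posterior after observing a fire at unit 2 (new evidence updates the model).
evid = [s for s in states if s[1] == 1]
post_u3 = sum(joint(*s) for s in evid if s[2] == 1) / sum(joint(*s) for s in evid)

print(f"P(unit3 fire) prior:   {prior_u3:.4f}")
print(f"P(unit3 fire | unit2): {post_u3:.4f}")
```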

5.
Security risk management is essential for ensuring effective airport operations. This article introduces AbSRiM, a novel agent‐based modeling and simulation approach to perform security risk management for airport operations that uses formal sociotechnical models that include temporal and spatial aspects. The approach contains four main steps: scope selection, agent‐based model definition, risk assessment, and risk mitigation. The approach is based on traditional security risk management methodologies, but uses agent‐based modeling and Monte Carlo simulation at its core. Agent‐based modeling is used to model threat scenarios, and Monte Carlo simulations are then performed with this model to estimate security risks. The use of the AbSRiM approach is demonstrated with an illustrative case study. This case study includes a threat scenario in which an adversary attacks an airport terminal with an improvised explosive device. The approach provides a promising way to include important elements, such as human aspects and spatiotemporal aspects, in the assessment of risk. More research is still needed to better identify the strengths and weaknesses of the AbSRiM approach in different case studies, but results demonstrate the feasibility of the approach and its potential.
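A toy Monte Carlo sketch in the spirit of the approach, assuming invented detection probabilities for three screening points (the actual AbSRiM agent models are far richer):

```python
# Adversary agent passes screening points, each with an assumed detection
# probability; risk is estimated as P(success) x consequence.
import random

random.seed(1)
DETECT = [0.6, 0.5, 0.3]        # hypothetical detection probabilities
CONSEQUENCE = 50.0              # hypothetical consequence measure

def one_run():
    for p_detect in DETECT:
        if random.random() < p_detect:
            return False        # adversary intercepted
    return True                 # attack reaches the terminal

n = 200_000
p_success = sum(one_run() for _ in range(n)) / n
print(f"P(success) ~ {p_success:.3f}, risk ~ {p_success * CONSEQUENCE:.2f}")
```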

6.
Data mining (DM) has been applied in many advanced science and technology fields, but it has not yet been used for domino effect risk management to explore minimum risk scenarios. This work investigates the feasibility of DM in minimizing the risk of fire-induced domino effects in chemical processing facilities. Based on DM, an evidential failure mode and effects analysis (E-FMEA), which could bridge chemical facilities' operational reliability and domino effect risk, is combined with fault tree analysis (FTA) for the occurrence risk modeling of the loss of containment (LOC) event of chemical facilities, which is often the triggering point of fire-induced domino effects. Industry-specific data such as reliability data, inspection records, and maintenance records are of great value for modeling the potential occurrence criticality of LOC. The data are used to characterize the LOC risk priority number (RPN) of chemical facilities through FTA and E-FMEA; search and statistics rules are proposed to mine inspection records to assess LOC risk factors. According to the RPN scores of facilities, inherent safety strategies to minimize risk via inventory control are proposed, and their effectiveness is tested using a well-known probit model. In this way, the approach proposes a unit-specific evidence-based risk minimization strategy for fire-induced domino effects. A case study demonstrates the capability of DM in the risk minimization of fire-induced domino effects.
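For orientation, a classical FMEA risk priority number is RPN = severity × occurrence × detection; the evidential variant used here refines the occurrence score from inspection and maintenance records. A sketch with hypothetical scores:

```python
# Sketch of FMEA-style risk priority numbers, RPN = S x O x D,
# with invented scores standing in for values mined from records.
facilities = {
    # name: (severity, occurrence, detection), each on a 1-10 scale
    "tank_A": (8, 6, 4),
    "tank_B": (7, 3, 5),
    "column_C": (9, 5, 7),
}

rpn = {name: s * o * d for name, (s, o, d) in facilities.items()}
for name in sorted(rpn, key=rpn.get, reverse=True):
    # Highest-RPN units are the first candidates for inventory control.
    print(f"{name}: RPN = {rpn[name]}")
```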

7.
In this article, an agent‐based framework to quantify the seismic resilience of an electric power supply system (EPSS) and the community it serves is presented. Within the framework, the loss and restoration of the EPSS power generation and delivery capacity and of the power demand from the served community are used to assess the electric power deficit during the damage absorption and recovery processes. Damage to the components of the EPSS and of the community's built environment is evaluated using seismic fragility functions. The restoration of the community electric power demand is evaluated using seismic recovery functions. The postearthquake EPSS recovery process is modeled using an agent‐based model with two agents, the EPSS Operator and the Community Administrator. The resilience of the EPSS–community system is quantified using direct, EPSS‐related, societal, and community‐related indicators. Parametric studies are carried out to quantify the influence of different seismic hazard scenarios, agent characteristics, and power dispatch strategies on the EPSS–community seismic resilience. The use of the agent‐based modeling framework enabled a rational formulation of the postearthquake recovery phase and highlighted the interaction between the EPSS and the community in the recovery process not quantified in resilience models developed to date. Furthermore, it shows that the resilience of different community sectors can be enhanced by different power dispatch strategies. The proposed agent‐based EPSS–community system resilience quantification framework can be used to develop better community and infrastructure system risk governance policies.
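A minimal sketch of the lognormal fragility-function form commonly used in such frameworks, P(damage | IM = x) = Φ(ln(x/θ)/β), with hypothetical parameters:

```python
# Lognormal seismic fragility curve; theta (median capacity) and beta
# (dispersion) below are invented, not taken from the article.
from math import log
from statistics import NormalDist

def fragility(im, theta=0.45, beta=0.5):
    """Probability of component damage at intensity measure `im` (e.g., PGA in g)."""
    return NormalDist().cdf(log(im / theta) / beta)

for pga in (0.1, 0.3, 0.5, 0.8):
    print(f"PGA = {pga:.1f} g -> P(damage) = {fragility(pga):.3f}")
```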

8.
Many chemicals interfere with the natural reproductive processes in mammals. The chemicals may prevent the fertilization of an egg or keep a zygote from implanting in the uterine wall. For this reason, toxicology studies with pre-implantation exposure often exhibit a dose-related trend in the number of observed implantations per litter. Standard methods for analyzing developmental toxicology studies are conditioned on the number of implantations in the litter and therefore cannot estimate this effect of the chemical on the reproductive process. This article presents a joint modeling approach to estimating risk in toxicology studies with pre-implantation exposure. In the joint modeling approach, both the number of implanted fetuses and the outcome of each implanted fetus are modeled. Using this approach, we show how to estimate the overall risk of a chemical that incorporates the risk of lost implantation due to pre-implantation exposure. Our approach has several distinct advantages over previous methods: (1) it is based on fitting a model for the observed data and, therefore, diagnostics of model fit and selection apply; (2) all assumptions are explicitly stated; and (3) it can be fit using standard software packages. We illustrate our approach by analyzing a dominant lethal assay data set (Luning et al., 1966, Mutation Research, 3, 444-451) and compare our results with those of Rai and Van Ryzin (1985, Biometrics, 41, 1-9) and Dunson (1998, Biometrics, 54, 558-569). In a simulation study, our approach has smaller bias and variance than the multiple imputation procedure of Dunson.
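A toy sketch of the joint-model logic, with invented dose-response parameters: implant counts shrink with dose, each surviving implant carries a dose-dependent adverse-outcome probability, and the overall risk combines both pathways:

```python
# Illustrative joint model: expected implants fall with dose, and each
# implant has a logistic per-fetus risk. All parameters are hypothetical.
from math import exp

def expected_implants(dose, mu0=10.0, beta=0.15):
    return mu0 * exp(-beta * dose)          # implants lost as dose rises

def p_adverse(dose, a=-2.0, b=0.4):
    return 1 / (1 + exp(-(a + b * dose)))   # per-implant adverse-outcome risk

def overall_risk(dose):
    """Fraction of control-level potential implants that are lost or affected."""
    mu0 = expected_implants(0.0)
    mu_d = expected_implants(dose)
    lost = (mu0 - mu_d) / mu0
    return lost + (mu_d / mu0) * p_adverse(dose)

for d in (0.0, 1.0, 2.0, 4.0):
    print(f"dose {d}: overall risk {overall_risk(d):.3f}")
```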

9.
Within risk analysis and, more broadly, the decision behind the choice of which modeling technique to use to study the spread of disease, epidemics, fires, technology, rumors, or, more generally, spatial dynamics is not well documented. While individual models are well defined and the modeling techniques are well understood by practitioners, there is little deliberate choice made as to the type of model to be used, with modelers using techniques that are well accepted in the field, sometimes with little thought as to whether alternative modeling techniques could or should be used. In this article, we divide modeling techniques for spatial transmission into four main categories: population‐level models, where a macro‐level estimate of the infected population is required; cellular models, where the transmission takes place between connected domains, but is restricted to a fixed topology of neighboring cells; network models, where host‐to‐host transmission routes are modeled, either as planar spatial graphs or where shortcuts can take place as in social networks; and, finally, agent‐based models that model the local transmission between agents, either as host‐to‐host geographical contacts, or by modeling the movement of the disease vector, with dynamic movement of hosts and vectors possible, on a Euclidian space or a more complex space deformed by the existence of information about the topology of the landscape. We summarize these techniques by introducing a taxonomy classifying these modeling approaches. Finally, we present a framework for choosing the most appropriate spatial modeling method, highlighting the links between seemingly disparate methodologies, bearing in mind that the choice of technique rests with the subject expert.

10.
Emergency response is directly related to the allocation of emergency rescue resources. Efficient emergency response can reduce loss of life and property, limit damage from the primary impact, and minimize damage from derivative impacts. An appropriate risk analysis approach in the event of accidents is one rational way to assist emergency response. In this article, a cellular automata‐based systematic approach for conducting risk analysis in emergency response is presented. Three general rules, i.e., diffusive effect, transporting effect, and dissipative effect, are developed to implement the cellular automata transition function. The approach takes multiple social factors such as population density and population sensitivity into consideration, and it also considers the risk of domino accidents, which is increasing due to growing congestion in urban industrial complexes and rising population density. In addition, two risk indices, i.e., individual risk and aggregated weighted risk, are proposed to assist decision making for emergency managers during emergency response. Individual risk can be useful to plan evacuation strategies, while aggregated weighted risk can help emergency managers to allocate rescue resources rationally according to the degree of danger in each vulnerable area and optimize emergency response programs.
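A toy grid update combining the three named rules, with hypothetical coefficients (a real model would calibrate these and handle boundaries properly):

```python
# One cellular-automata transition step: diffusion to neighbors,
# wind-driven transport, and dissipation over time. All values invented.
import numpy as np

def step(c, diff=0.15, wind=0.2, dissip=0.05):
    """One transition of a hazard-concentration grid c (2-D numpy array)."""
    up, down = np.roll(c, -1, 0), np.roll(c, 1, 0)
    left, right = np.roll(c, -1, 1), np.roll(c, 1, 1)
    diffusive = diff * (up + down + left + right - 4 * c)   # spread to neighbors
    transport = wind * (left - c)                           # advection along +x
    return np.clip(c + diffusive + transport - dissip * c, 0.0, None)

grid = np.zeros((20, 20))
grid[10, 10] = 1.0                     # accident source cell
for _ in range(10):
    grid = step(grid)

pop_density = np.ones_like(grid)       # stand-in for social factors
individual_risk = grid                 # per-cell risk proxy
aggregated_weighted_risk = float((grid * pop_density).sum())
print(f"aggregated weighted risk after 10 steps: {aggregated_weighted_risk:.3f}")
```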

11.
This article presents ongoing research that focuses on efficient allocation of defense resources to minimize the damage inflicted on a spatially distributed physical network such as a pipeline, water system, or power distribution system from an attack by an active adversary, recognizing the fundamental difference between preparing for natural disasters such as hurricanes, earthquakes, or even accidental systems failures and the problem of allocating resources to defend against an opponent who is aware of, and anticipating, the defender's efforts to mitigate the threat. Our approach is to utilize a combination of integer programming and agent‐based modeling to allocate the defensive resources. We conceptualize the problem as a Stackelberg “leader follower” game where the defender first places his assets to defend key areas of the network, and the attacker then seeks to inflict the maximum damage possible within the constraints of resources and network structure. The criticality of arcs in the network is estimated by a deterministic network interdiction formulation, which then informs an evolutionary agent‐based simulation. The evolutionary agent‐based simulation is used to determine the allocation of resources for attackers and defenders that results in evolutionary stable strategies, where actions by either side alone cannot increase its share of victories. We demonstrate these techniques on an example network, comparing the evolutionary agent‐based results to a more traditional, probabilistic risk analysis (PRA) approach. Our results show that the agent‐based approach results in a greater percentage of defender victories than does the PRA‐based approach.
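A brute-force sketch of the leader-follower logic on a tiny network with invented damage values (the article instead uses integer programming plus an evolutionary agent-based simulation):

```python
# Stackelberg sketch: the defender protects k arcs first, then the attacker
# hits the most damaging unprotected arc. Arc damages are hypothetical.
from itertools import combinations

arcs = {"a": 10, "b": 7, "c": 5, "d": 3}   # arc -> damage if attacked
k = 2                                      # defender budget (arcs protected)

def attacker_damage(protected):
    exposed = [d for arc, d in arcs.items() if arc not in protected]
    return max(exposed, default=0)         # attacker best-responds

best = min(combinations(arcs, k), key=attacker_damage)
print(f"defend {set(best)} -> worst-case damage {attacker_damage(best)}")
```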

12.
In the event of contamination of a water distribution system, decisions must be made to mitigate the impact of the contamination and to protect public health. Making threat management decisions while a contaminant spreads through the network is a dynamic and interactive process. Response actions taken by the utility managers and water consumption choices made by the consumers will affect the hydraulics, and thus the spread of the contaminant plume, in the network. A modeling framework that allows the simulation of a contamination event under the effects of actions taken by utility managers and consumers will be a useful tool for the analysis of alternative threat mitigation and management strategies. This article presents a multiagent modeling framework that combines agent‐based, mechanistic, and dynamic methods. Agents select actions based on a set of rules that represent an individual's autonomy, goal‐based desires, and reaction to the environment and the actions of other agents. Consumer behaviors including ingestion, mobility, reduction of water demands, and word‐of‐mouth communication are simulated. Management strategies are evaluated, including opening hydrants to flush the contaminant and broadcasting public warnings. As actions taken by consumer agents and utility operators affect demands and flows in the system, the mechanistic model is updated. Management strategies are evaluated based on the exposure of the population to the contaminant. The framework is designed to consider the typical issues involved in water distribution threat management and provides valuable analysis of threat containment strategies for water distribution system contamination events.

13.
Terrorism could be treated as a hazard for design purposes. For instance, the terrorist hazard could be analyzed in a manner similar to the way that seismic hazard is handled. No matter how terrorism is dealt with in the design of systems, predictions of the frequency and magnitude of the hazard will be required. And, if the human‐induced hazard is to be designed for in a manner analogous to natural hazards, then the predictions should be probabilistic in nature. The model described in this article is a prototype model that used agent‐based modeling (ABM) to analyze terrorist attacks. The basic approach in this article of using ABM to model human‐induced hazards has been preliminarily validated in the sense that the attack magnitudes seem to be power‐law distributed and attacks occur mostly in regions where high levels of wealth pass through, such as transit routes and markets. The model developed in this study indicates that ABM is a viable approach to modeling socioeconomic‐based infrastructure systems for engineering design to deal with human‐induced hazards.

14.
Estimation of benchmark doses (BMDs) in quantitative risk assessment traditionally is based upon parametric dose‐response modeling. It is a well‐known concern, however, that if the chosen parametric model is uncertain and/or misspecified, inaccurate and possibly unsafe low‐dose inferences can result. We describe a nonparametric approach for estimating BMDs with quantal‐response data based on an isotonic regression method, and also study use of corresponding, nonparametric, bootstrap‐based confidence limits for the BMD. We explore the confidence limits’ small‐sample properties via a simulation study, and illustrate the calculations with an example from cancer risk assessment. It is seen that this nonparametric approach can provide a useful alternative for BMD estimation when faced with the problem of parametric model uncertainty.
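A sketch of the estimator using scikit-learn's isotonic regression on invented quantal data, with the BMD read off at 10% extra risk by linear interpolation:

```python
# Nonparametric BMD sketch: fit a monotone dose-response curve to quantal
# data, then find the dose reaching the benchmark response. Data invented.
import numpy as np
from sklearn.isotonic import IsotonicRegression

dose = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
n = np.array([50, 50, 50, 50, 50])
cases = np.array([2, 4, 3, 11, 21])

iso = IsotonicRegression(y_min=0.0, y_max=1.0, increasing=True)
fit = iso.fit_transform(dose, cases / n, sample_weight=n)   # monotone risk curve

# BMD at 10% extra risk over the fitted background, interpolating between doses.
bmr = 0.10
target = fit[0] + bmr * (1 - fit[0])
grid = np.linspace(dose[0], dose[-1], 2001)
risk = np.interp(grid, dose, fit)
bmd = grid[np.argmax(risk >= target)]
print(f"BMD(10% extra risk) ~ {bmd:.2f}")
```

A bootstrap confidence limit, as studied in the article, would repeat this fit on resampled data and take a lower percentile of the resulting BMDs.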

15.
The ability to accurately forecast and control inpatient census, and thereby workloads, is a critical and long‐standing problem in hospital management. The majority of current literature focuses on optimal scheduling of inpatients, but largely ignores the process of accurate estimation of the trajectory of patients throughout the treatment and recovery process. The result is that current scheduling models are optimizing based on inaccurate input data. We developed a Clustering and Scheduling Integrated (CSI) approach to capture patient flows through a network of hospital services. CSI functions by clustering patients into groups based on similarity of trajectory using a novel semi‐Markov model (SMM)‐based clustering scheme, as opposed to clustering by patient attributes as in previous literature. Our methodology is validated by simulation and then applied to real patient data from a partner hospital where we demonstrate that it outperforms a suite of well‐established clustering methods. Furthermore, we demonstrate that extant optimization methods achieve significantly better results on key hospital performance measures under CSI, compared with traditional estimation approaches, increasing elective admissions by 97% and utilization by 22% compared to 30% and 8% using traditional estimation techniques. From a theoretical standpoint, the SMM‐clustering is a novel approach applicable to any temporal‐spatial stochastic data that is prevalent in many industries and application areas.

16.
In evaluating the risk of exposure to health hazards, characterizing the dose‐response relationship and estimating acceptable exposure levels are the primary goals. In analyses of health risks associated with exposure to ionizing radiation, while there is a clear agreement that moderate to high radiation doses cause harmful effects in humans, little has been known about the possible biological effects at low doses, for example, below 0.1 Gy, which is the dose range relevant to most radiation exposures of concern today. A conventional approach to radiation dose‐response estimation based on simple parametric forms, such as the linear nonthreshold model, can be misleading in evaluating the risk and, in particular, its uncertainty at low doses. As an alternative approach, we consider a Bayesian semiparametric model that has a connected piece‐wise‐linear dose‐response function with prior distributions having an autoregressive structure among the random slope coefficients defined over closely spaced dose categories. With a simulation study and application to analysis of cancer incidence data among Japanese atomic bomb survivors, we show that this approach can produce smooth and flexible dose‐response estimation while reasonably handling the risk uncertainty at low doses and elsewhere. With relatively few assumptions and modeling options to be made by the analyst, the method can be particularly useful in assessing risks associated with low‐dose radiation exposures.

17.
Multi‐organizational collaborative decision making in high‐magnitude crisis situations requires real‐time information sharing and dynamic modeling for effective response. Information technology (IT) based decision support tools can play a key role in facilitating such effective response. We explore one promising class of decision support tools based on machine learning, known as support vector machines (SVM), which have the capability to dynamically model and analyze decision processes. To examine this capability, we use a case study with a design science approach to evaluate improved decision‐making effectiveness of an SVM algorithm in an agent‐based simulation experimental environment. Testing and evaluation of real‐time decision support tools in simulated environments provides an opportunity to assess their value under various dynamic conditions. Decision making in high‐magnitude crisis situations involves multiple different patterns of behavior, requiring the development, application, and evaluation of different models. Therefore, we employ a multistage linear support vector machine (MLSVM) algorithm that partitions decision‐maker responses into behavioral subsets, which can then be individually modeled to examine their diverse patterns of response behavior. The results of our case study indicate that our MLSVM is clearly superior to both single stage SVMs and traditional approaches such as linear and quadratic discriminant analysis for understanding and predicting behavior. We conclude that machine learning algorithms show promise for quickly assessing response strategy behavior and for providing the capability to share information with decision makers in multi‐organizational collaborative environments, thus supporting more effective decision making in such contexts.
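A generic stand-in for the multistage idea: partition observations into subsets, then fit one linear SVM per subset (the article's MLSVM partitions by response behavior; the data, clustering choice, and parameters here are invented):

```python
# Partition-then-fit sketch: cluster decision features, train one linear
# SVM per behavioral subset. Synthetic data throughout.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))                   # simulated decision features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # simulated response behavior

subsets = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

models = {}
for s in np.unique(subsets):
    mask = subsets == s
    if len(np.unique(y[mask])) < 2:
        continue                                # skip one-class subsets
    models[s] = LinearSVC(dual=False).fit(X[mask], y[mask])
    print(f"subset {s}: train accuracy {models[s].score(X[mask], y[mask]):.2f}")
```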

18.
Reference values, including an oral reference dose (RfD) and an inhalation reference concentration (RfC), were derived for propylene glycol methyl ether (PGME), and an oral RfD was derived for its acetate (PGMEA). These values were based on transient sedation observed in F344 rats and B6C3F1 mice during a two‐year inhalation study. The dose‐response relationship for sedation was characterized using internal dose measures as predicted by a physiologically‐based pharmacokinetic (PBPK) model for PGME and its acetate. PBPK modeling was used to account for changes in rodent physiology and metabolism due to aging and adaptation, based on data collected during Weeks 1, 2, 26, 52, and 78 of a chronic inhalation study. The peak concentration of PGME in richly perfused tissues (i.e., brain) was selected as the most appropriate internal dose measure based on a consideration of the mode of action for sedation and similarities in tissue partitioning between brain and other richly perfused tissues. Internal doses (peak tissue concentrations of PGME) were designated as either no‐observed‐adverse‐effect levels (NOAELs) or lowest‐observed‐adverse‐effect levels (LOAELs) based on the presence or the absence of sedation at each time point, species, and sex in the two‐year study. Distributions of the NOAEL and LOAEL values expressed in terms of internal dose were characterized using an arithmetic mean and standard deviation, with the mean internal NOAEL serving as the basis for the reference values, which was then divided by appropriate uncertainty factors. Where data permitted, chemical‐specific adjustment factors were derived to replace default uncertainty factor values of 10. Nonlinear kinetics, which the model predicted in all species at PGME concentrations exceeding 100 ppm, complicates interspecies and low‐dose extrapolations. To address this complication, reference values were derived using two approaches that differ with respect to the order in which these extrapolations were performed: (1) the default approach of interspecies extrapolation to determine the human equivalent concentration (PBPK modeling) followed by uncertainty factor application, and (2) uncertainty factor application followed by interspecies extrapolation (PBPK modeling). The resulting reference values for these two approaches are substantially different, with values from the latter approach being seven‐fold higher than those from the former approach. Such a striking difference between the two approaches reveals an underlying issue that has received little attention in the literature regarding the application of uncertainty factors and interspecies extrapolations to compounds where saturable kinetics occur in the range of the NOAEL. Until such discussions have taken place, reference values based on the former approach are recommended for risk assessments involving human exposures to PGME and PGMEA.
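The final arithmetic step, a point of departure divided by uncertainty factors, can be sketched as follows (all values are hypothetical, not the article's):

```python
# Reference-value arithmetic: POD / product of uncertainty factors.
pod = 100.0              # hypothetical mean internal NOAEL (point of departure)
uf_interspecies = 3.0    # chemical-specific adjustment replacing a default 10
uf_intraspecies = 10.0   # default human-variability factor

reference_value = pod / (uf_interspecies * uf_intraspecies)
print(f"reference value = {reference_value:.2f} (same units as the POD)")
```

With saturable kinetics, the order of this division relative to the PBPK interspecies extrapolation changes the result, which is the article's central point.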

19.
Water reuse can serve as a sustainable alternative water source for urban areas. However, the successful implementation of large‐scale water reuse projects depends on community acceptance. Because of the negative perceptions that are traditionally associated with reclaimed water, water reuse is often not considered in the development of urban water management plans. This study develops a simulation model for understanding community opinion dynamics surrounding the issue of water reuse, and how individual perceptions evolve within that context, which can help in the planning and decision‐making process. Based on the social amplification of risk framework, our agent‐based model simulates consumer perceptions, discussion patterns, and their adoption or rejection of water reuse. The model is based on the “risk publics” model, an empirical approach that uses the concept of belief clusters to explain the adoption of new technology. Each household is represented as an agent, and parameters that define their behavior and attributes are defined from survey data. Community‐level parameters—including social groups, relationships, and communication variables, also from survey data—are encoded to simulate the social processes that influence community opinion. The model demonstrates its capabilities to simulate opinion dynamics and consumer adoption of water reuse. In addition, based on empirical data, the model is applied to investigate water reuse behavior in different regions of the United States. Importantly, our results reveal that public opinion dynamics emerge differently based on membership in opinion clusters, frequency of discussion, and the structure of social networks.
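A toy sketch of word-of-mouth opinion dynamics on a random social network (the network, influence weight, and adoption threshold are invented):

```python
# Households adjust support for water reuse toward their neighbors' views.
import random

random.seed(2)
N = 50
neighbors = {i: random.sample([j for j in range(N) if j != i], 4) for i in range(N)}
opinion = {i: random.uniform(0, 1) for i in range(N)}   # 1 = fully supportive

ALPHA = 0.2   # assumed strength of word-of-mouth influence
for step in range(30):
    new = {}
    for i in range(N):
        peer_mean = sum(opinion[j] for j in neighbors[i]) / len(neighbors[i])
        new[i] = (1 - ALPHA) * opinion[i] + ALPHA * peer_mean
    opinion = new

adopters = sum(1 for v in opinion.values() if v > 0.6)
print(f"households supporting water reuse after 30 steps: {adopters}/{N}")
```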

20.
Risk Analysis, 2018, 38(7): 1444-1454
The performance of fire protection measures plays a key role in the prevention and mitigation of fire escalation (fire domino effect) in process plants. In addition to passive and active safety measures, the intervention of firefighting teams can have a great impact on fire propagation. In the present study, we have demonstrated an application of dynamic Bayesian network to modeling and safety assessment of fire domino effect in oil terminals while considering the effect of safety measures in place. The results of the developed dynamic Bayesian network—prior and posterior probabilities—have been combined with information theory, in the form of mutual information, to identify optimal firefighting strategies, especially when the number of fire trucks is not sufficient to handle all the vessels in danger.
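A sketch of the mutual-information step, I(X;Y) = Σ p(x,y) log₂[p(x,y)/(p(x)p(y))], for one hypothetical protection decision and one vessel:

```python
# Mutual information between a protection decision X and vessel ignition Y;
# the joint probabilities below are invented for illustration.
import numpy as np

pxy = np.array([[0.30, 0.10],    # rows: truck assigned / not assigned
                [0.15, 0.45]])   # cols: vessel ignites / does not ignite
px = pxy.sum(axis=1, keepdims=True)
py = pxy.sum(axis=0, keepdims=True)

mi = float((pxy * np.log2(pxy / (px * py))).sum())
print(f"I(X;Y) = {mi:.3f} bits")   # larger MI -> the decision matters more here
```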

