Similar Articles
20 similar articles found.
1.
Domino effects are low-probability, high-consequence accidents causing severe damage to humans, process plants, and the environment. Because domino effects affect large areas and are difficult to control, preventive safety measures have been given priority over mitigative ones. As a result, safety distances and safety inventories have been used as preventive measures to reduce the escalation probability of domino effects. However, these measures are usually designed for static accident scenarios. In this study, we show that, compared to a static worst-case accident analysis, a dynamic consequence analysis provides a more rational approach to the risk assessment and management of domino effects. The study also presents the application of Bayesian networks and conflict analysis to risk-based allocation of chemical inventories, to minimize the consequences and thus reduce the escalation probability. It emphasizes the risk management of chemical inventories as an inherent safety measure, particularly in existing process plants where the applicability of other measures, such as safety distances, is limited.
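As a rough illustration of how a consequence analysis couples inventory decisions to escalation probability, the sketch below converts the heat radiation received by a target vessel into a time to failure (ttf) and then into an escalation probability via a probit transform. The ttf correlation and all coefficients are assumptions patterned after published probit models from the domino literature, not the model used in this study:

```python
import numpy as np
from scipy.stats import norm

def escalation_probability(intensity_kw_m2, vessel_volume_m3):
    """Escalation probability of a target vessel under heat radiation.

    The ln(ttf) correlation and probit coefficients below are
    illustrative assumptions, not the paper's model.
    """
    ln_ttf = -1.13 * np.log(intensity_kw_m2) - 2.67e-5 * vessel_volume_m3 + 9.9
    probit = 12.54 - 1.85 * ln_ttf     # assumed probit for vessel failure
    return norm.cdf(probit - 5.0)      # probit value -> probability

# Inventory allocation changes vessel volume and hence escalation risk:
for volume in (100.0, 1000.0, 10000.0):
    p = escalation_probability(intensity_kw_m2=50.0, vessel_volume_m3=volume)
    print(f"vessel volume {volume:8.0f} m^3 -> P(escalation) = {p:.4f}")
```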

2.
Risk Analysis, 2018, 38(8): 1585-1600
Historical data analysis shows that escalation accidents, so-called domino effects, play an important role in disastrous accidents in the chemical and process industries. In this study, an agent-based modeling and simulation approach is proposed to study the propagation of domino effects in these industries. Unlike analytical or Monte Carlo simulation approaches, which normally study the domino effect at the probabilistic network level, the agent-based modeling technique explains domino effects from a bottom-up perspective. In this approach, the installations involved in a domino effect are modeled as agents, whereas the interactions among the installations (e.g., by means of heat radiation) are modeled via the agents' basic rules. Application of the developed model to several case studies demonstrates its ability not only to model higher-level domino effects and synergistic effects but also to account for temporal dependencies. The model can readily be applied to large-scale, complicated cases.
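A minimal agent-based sketch of this bottom-up idea: each installation is an agent whose only rule is a per-step ignition probability driven by the heat radiation it receives from burning neighbors. The radiation matrix, ignition rule, and time step are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)

# Pairwise heat radiation (kW/m^2) that a burning unit imposes on its
# neighbors; values are illustrative, not taken from the paper's cases.
RADIATION = np.array([
    [0.0, 35.0, 10.0],
    [35.0, 0.0, 20.0],
    [10.0, 20.0, 0.0],
])

def failure_prob(intensity, dt_min=5.0):
    """Per-time-step failure probability of a unit receiving `intensity`.

    A simple saturating rule standing in for the agents' behavioral rules.
    """
    return 1.0 - np.exp(-0.002 * intensity * dt_min)

def simulate(primary=0, steps=24):
    state = np.zeros(3, dtype=bool)   # True = burning/failed
    state[primary] = True
    for _ in range(steps):
        received = RADIATION[state].sum(axis=0)  # incoming heat per agent
        ignite = rng.random(3) < failure_prob(received)
        state = state | ignite
    return state

# Monte Carlo over many runs: probability each unit joins the domino chain
hits = np.mean([simulate() for _ in range(5000)], axis=0)
print("P(unit involved):", np.round(hits, 3))
```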

3.
Quantitative risk analysis (QRA) is a systematic approach for evaluating the likelihood, consequences, and risk of adverse events. QRA based on event tree analysis (ETA) and fault tree analysis (FTA) employs two basic assumptions. The first concerns the likelihood values of input events, and the second concerns the interdependence among the events (for ETA) or basic events (for FTA). Traditionally, FTA and ETA both use crisp probabilities; to deal with uncertainties, probability distributions of the input event likelihoods are assumed instead. These distributions are often hard to come by, and even when available they are subject to incompleteness (partial ignorance) and imprecision. Furthermore, both FTA and ETA assume that events (or basic events) are independent. In practice, these two assumptions are often unrealistic. This article focuses on handling uncertainty in a QRA framework for a process system. Fuzzy set theory and evidence theory are used to describe the uncertainties in the input event likelihoods, and a method based on a dependency coefficient is used to express interdependencies of events (or basic events) in ETA and FTA. Two case studies are discussed to demonstrate the approach.
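The sketch below illustrates the flavor of such an analysis: triangular fuzzy likelihoods are propagated through an AND gate at several alpha-cuts, with a simple dependency coefficient interpolating between independence and perfect positive dependence. The interpolation rule and all numbers are illustrative stand-ins, not the paper's exact formulation:

```python
def alpha_cut(tfn, alpha):
    """Interval of a triangular fuzzy number (a, m, b) at membership alpha."""
    a, m, b = tfn
    return (a + alpha * (m - a), b - alpha * (b - m))

def and_gate(p1, p2, d=0.0):
    """AND-gate probability with a simple dependency coefficient d in [0, 1].

    d = 0 reproduces independence (p1 * p2); d = 1 reproduces perfect
    positive dependence (min(p1, p2)). This interpolation is an
    illustrative stand-in for the paper's dependency-coefficient method.
    """
    return (1.0 - d) * p1 * p2 + d * min(p1, p2)

# Triangular fuzzy likelihoods for two basic events (assumed values)
event_a = (0.01, 0.02, 0.05)
event_b = (0.02, 0.04, 0.08)

for alpha in (0.0, 0.5, 1.0):
    (a_lo, a_hi) = alpha_cut(event_a, alpha)
    (b_lo, b_hi) = alpha_cut(event_b, alpha)
    lo = and_gate(a_lo, b_lo, d=0.3)   # AND gate is monotonic, so interval
    hi = and_gate(a_hi, b_hi, d=0.3)   # endpoints map to endpoints
    print(f"alpha={alpha:.1f}: top-event interval [{lo:.6f}, {hi:.6f}]")
```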

4.
In the present study, we have introduced a methodology based on graph theory and multicriteria decision analysis for cost‐effective fire protection of chemical plants subject to fire‐induced domino effects. By modeling domino effects in chemical plants as a directed graph, the graph centrality measures such as out‐closeness and betweenness scores can be used to identify the installations playing a key role in initiating and propagating potential domino effects. It is demonstrated that active fire protection of installations with the highest out‐closeness score and passive fire protection of installations with the highest betweenness score are the most effective strategies for reducing the vulnerability of chemical plants to fire‐induced domino effects. We have employed a dynamic graph analysis to investigate the impact of both the availability and the degradation of fire protection measures over time on the vulnerability of chemical plants. The results obtained from the graph analysis can further be prioritized using multicriteria decision analysis techniques such as the method of reference point to find the most cost‐effective fire protection strategy.
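A small sketch of the graph-theoretic step, using networkx on a hypothetical escalation graph. Note that networkx's closeness_centrality on a directed graph uses incoming distances, so out-closeness is obtained on the reversed graph:

```python
import networkx as nx

# Directed escalation graph: an edge u -> v means a fire in u can
# propagate to v (units and edges are illustrative).
G = nx.DiGraph([
    ("T1", "T2"), ("T1", "T3"), ("T2", "T4"),
    ("T3", "T4"), ("T4", "T5"),
])

# Out-closeness: how quickly a fire starting here reaches everything else.
out_closeness = nx.closeness_centrality(G.reverse())
# Betweenness: how often a unit sits on escalation paths between others.
betweenness = nx.betweenness_centrality(G)

print("Candidate for active protection (highest out-closeness):",
      max(out_closeness, key=out_closeness.get))
print("Candidate for passive protection (highest betweenness):",
      max(betweenness, key=betweenness.get))
```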

5.
Certification is an essential feature of organic farming, based on inspections that verify compliance with European Council Regulation (EC) No 834/2007. A risk-based approach to noncompliance that alerts control bodies when planning inspections would contribute to a more efficient and cost-effective certification system. We therefore analyze the factors that can affect the probability of noncompliance in organic farming. This article applies zero-inflated count data models to farm-level panel data of inspection results and sanctions obtained from the Ethical and Environmental Certification Institute, one of the main control bodies in Italy. We tested several a priori hypotheses related to the risk of noncompliance. We find evidence of an important role for past noncompliant behavior in predicting future noncompliance, while farm size and the presence of livestock also increase the probability of noncompliance. We conclude by proposing that an efficient risk-based inspection system should weight the known probability of occurrence of a given noncompliance by the severity of its impact.
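A zero-inflated Poisson fit of this kind can be sketched with statsmodels; the covariates, simulated data, and inflation structure below are illustrative assumptions, not the paper's dataset:

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedPoisson

rng = np.random.default_rng(0)
n = 500

# Illustrative covariates: farm size, livestock present, past noncompliance
size = rng.lognormal(2.0, 0.6, n)
livestock = rng.integers(0, 2, n)
past = rng.integers(0, 2, n)
X = sm.add_constant(np.column_stack([size, livestock, past]))

# Simulate zero-inflated counts: a share of farms never violate (structural
# zeros); the rest follow a Poisson whose rate rises with the covariates.
always_ok = rng.random(n) < 0.4
lam = np.exp(-2.0 + 0.01 * size + 0.5 * livestock + 1.0 * past)
y = np.where(always_ok, 0, rng.poisson(lam))

# Constant-only inflation part; covariates drive the count part
model = ZeroInflatedPoisson(y, X, exog_infl=np.ones((n, 1)))
result = model.fit(disp=0)
print(result.summary())
```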

6.
Richard Genovesi, Risk Analysis, 2012, 32(12): 2182-2197
Drinking water supplies are at risk of contamination from a variety of physical, chemical, and biological sources. Ranked among these threats are hazardous material releases from leaking or improperly managed underground storage tanks located at municipal, commercial, and industrial facilities. To reduce the human health and environmental risks associated with the subsurface storage of hazardous materials, government agencies have taken a variety of legislative and regulatory actions, dating back more than 25 years, including rigorous equipment, technology, and operational requirements as well as facility-by-facility inspection and enforcement programs. Given a history of more than 470,000 underground storage tank releases nationwide, the U.S. Environmental Protection Agency continues to report that 7,300 new leaks were found in federal fiscal year 2008, while nearly 103,000 old leaks remain to be cleaned up. In this article, we report on an alternative evidence-based intervention approach for reducing potential releases from the storage of petroleum products (gasoline, diesel, kerosene, heating/fuel oil, and waste oil) in underground tanks at commercial facilities in Rhode Island. The objective of this study was to evaluate whether a new regulatory model can serve as a cost-effective alternative to traditional facility-by-facility inspection and enforcement programs for underground storage tanks. We conclude that the alternative model, which emphasizes technical assistance tools, can produce measurable improvements in compliance performance, is a cost-effective adjunct to traditional inspection and enforcement programs, and has the potential to let regulatory agencies decrease their inspection frequency among low-risk facilities without sacrificing compliance performance or increasing public health risks.

7.
The Oregon Department of Environmental Quality has developed a Cross-Media Comparative Risk Assessment model to address certain regulatory concerns. The model generates a human and an ecological risk index for a facility releasing toxins into the environment. The risk indices are based on chemical fate and transport predictions, toxicity, population density, and ecologically sensitive areas. The model can be used to rank facilities for inspection or as a tool to assess the progress of pollution prevention programs. Regulatory permitting departments can use the model to address the cross-media transfer of pollutants from one environmental compartment to another. The versatility of the model allows adaptation to each specific user's needs.

8.
Natural disasters cause a sizeable number of hazmat releases, referred to as "natechs." An enhanced understanding of natech probability, allowing for predictions of natech occurrence, is an important step in determining how industry and government should mitigate natech risk. This study quantifies the conditional probabilities of natechs at TRI/RMP and SIC 1311 facilities given the occurrence of hurricanes, earthquakes, tornadoes, and floods. During hurricanes, a higher probability of releases was observed due to storm surge (7.3 releases per 100 TRI/RMP facilities exposed vs. 6.2 for SIC 1311) compared to category 1-2 hurricane winds (5.6 TRI/RMP, 2.6 SIC 1311). Logistic regression confirms the statistical significance of the greater propensity for releases at TRI/RMP facilities, and during some hurricanes, when controlling for hazard zone. The probability of natechs at TRI/RMP facilities during earthquakes increased from 0.1 releases per 100 facilities at MMI V to 21.4 at MMI IX. The probability of a natech at TRI/RMP facilities within 25 miles of a tornado was small (~0.025 per 100 facilities), reflecting the limited area directly affected by tornadoes. Areas inundated during flood events had a probability of 1.1 releases per 100 facilities but showed widely varying natech occurrence across individual events, indicating that factors not quantified in this study, such as flood depth and speed, are important for predicting flood natechs. These results can inform natech risk analysis, aid government agencies responsible for planning response and remediation after natural disasters, and should be useful in raising awareness of natech risk within industry.
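Conditional natech probabilities of this sort are simple ratios, but reporting them per 100 facilities benefits from an interval estimate. A sketch using the Wilson interval, with counts assumed only to match the reported storm surge rate (not the actual data):

```python
import numpy as np
from scipy.stats import norm

def releases_per_100(releases, facilities_exposed, conf=0.95):
    """Point estimate and Wilson interval for a conditional natech
    probability, reported as releases per 100 exposed facilities."""
    p = releases / facilities_exposed
    z = norm.ppf(0.5 + conf / 2)
    denom = 1 + z**2 / facilities_exposed
    center = (p + z**2 / (2 * facilities_exposed)) / denom
    half = z * np.sqrt(p * (1 - p) / facilities_exposed
                       + z**2 / (4 * facilities_exposed**2)) / denom
    return 100 * p, (100 * (center - half), 100 * (center + half))

# Illustrative counts consistent with the reported storm surge rate
est, (lo, hi) = releases_per_100(releases=73, facilities_exposed=1000)
print(f"storm surge: {est:.1f} per 100 (95% CI {lo:.1f}-{hi:.1f})")
```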

9.
Public opinion poll data have consistently shown that the proportion of respondents who are willing to have a nuclear power plant in their own community is smaller than the proportion who agree that more nuclear plants should be built in this country. Respondents' judgments of the minimum safe distance from each of eight hazardous facilities confirmed that this finding results from perceived risk gradients that differ by facility (e.g., nuclear vs. natural gas power plants) and social group (e.g., chemical engineers vs. environmentalists) but are relatively stable over time. Ratings of the facilities on thirteen perceived risk dimensions were used to determine whether any of the dimensions could explain the distance data. Because the rank order of the facilities with respect to acceptable distance was very similar to the rank order on a number of the perceived risk dimensions, it is difficult to determine which of the latter is the critical determinant of acceptable distance if, indeed, there is only one. There were, however, a number of reversals of rank order that indicate that the respondents had a differentiated view of technological risk. Finally, data from this and other studies were interpreted as suggesting that perceived lack of any other form of personal control over risk exposure may be an important factor in stimulating public opposition to the siting of hazardous facilities.

10.
A challenge with multiple chemical risk assessment is the need to consider the joint behavior of chemicals in mixtures. To address this need, pharmacologists and toxicologists have developed methods over the years to evaluate and test chemical interaction. In practice, however, testing of chemical interaction more often comprises ad hoc binary combinations and rarely examines higher order combinations. One explanation for this practice is the belief that there are simply too many possible combinations of chemicals to consider. Indeed, under stochastic conditions the possible number of chemical combinations scales geometrically as the pool of chemicals increases. However, the occurrence of chemicals in the environment is determined by factors, economic in part, which favor some chemicals over others. We investigate methods from the field of biogeography, originally developed to study avian species co‐occurrence patterns, and adapt these approaches to examine chemical co‐occurrence. These methods were applied to a national survey of pesticide residues in 168 child care centers from across the country. Our findings show that pesticide co‐occurrence in the child care center was not random but highly structured, leading to the co‐occurrence of specific pesticide combinations. Thus, ecological studies of species co‐occurrence parallel the issue of chemical co‐occurrence at specific locations. Both are driven by processes that introduce structure in the pattern of co‐occurrence. We conclude that the biogeographical tools used to determine when this structure occurs in ecological studies are relevant to evaluations of pesticide mixtures for exposure and risk assessment.
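One standard biogeographical tool for this kind of question is the checkerboard (C-) score with a permutation null model. The sketch below uses randomly generated presence/absence data and a simple row-shuffle null; the ecological literature also uses stricter fixed-fixed swap nulls:

```python
import numpy as np

rng = np.random.default_rng(1)

def c_score(matrix):
    """Mean checkerboard score over all pairs of chemicals (rows).

    For rows i, j with row sums r_i, r_j and S shared sites, the
    checkerboard unit count is (r_i - S) * (r_j - S); high values mean
    segregation, low values mean aggregation (co-occurrence).
    """
    m = matrix.astype(int)
    shared = m @ m.T                     # S for every pair of rows
    r = m.sum(axis=1)
    cb = (r[:, None] - shared) * (r[None, :] - shared)
    iu = np.triu_indices_from(cb, k=1)
    return cb[iu].mean()

# Pesticide presence/absence (rows = pesticides, columns = locations);
# data are randomly generated purely for illustration.
observed = rng.random((8, 50)) < 0.3

obs = c_score(observed)
# Null model: shuffle each row independently (equiprobable-site null)
null = [c_score(np.apply_along_axis(rng.permutation, 1, observed))
        for _ in range(999)]
p_value = np.mean([n <= obs for n in null])  # small p suggests aggregation
print(f"observed C-score {obs:.2f}, one-sided p = {p_value:.3f}")
```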

11.
Domino Effect Analysis Using Bayesian Networks
A new methodology based on Bayesian networks is introduced both to model domino effect propagation patterns and to estimate the domino effect probability at different levels. The flexible structure and unique modeling techniques offered by Bayesian networks make it possible to analyze domino effects through a probabilistic framework, considering synergistic effects, noisy probabilities, and common cause failures. Further, the uncertainties and complex interactions among the domino effect components are captured using the Bayesian network. The probabilities of events are updated in the light of new information, and the most probable path of the domino effect is determined on the basis of the newly gathered data. This study shows how probability updating helps to update the domino effect model either qualitatively or quantitatively. The methodology is applied to a hypothetical example and also to a previously studied case. These examples demonstrate the effectiveness of Bayesian networks in modeling domino effects in a processing facility.
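The core mechanics, noisy-OR escalation probabilities plus posterior updating given an observed failure, can be shown on a three-unit toy network by direct enumeration. All probabilities are illustrative:

```python
# Primary fire at unit A; B and C are potential escalation targets.
# Noisy-OR: each burning source independently ignites the target.
P_B_GIVEN_A = 0.30                     # A alone ignites B
P_C_SRC = {"A": 0.10, "B": 0.25}       # per-source ignition of C

def p_c_fails(sources):
    """Noisy-OR combination of the burning sources radiating C."""
    p_safe = 1.0
    for s in sources:
        p_safe *= 1.0 - P_C_SRC[s]
    return 1.0 - p_safe

# Joint distribution over (B fails, C fails), given the primary fire at A
joint = {}
for b in (False, True):
    pb = P_B_GIVEN_A if b else 1 - P_B_GIVEN_A
    srcs = ["A"] + (["B"] if b else [])   # synergistic effect when B burns
    pc = p_c_fails(srcs)
    joint[(b, True)] = pb * pc
    joint[(b, False)] = pb * (1 - pc)

prior_c = joint[(False, True)] + joint[(True, True)]
# Probability updating: new information arrives that B has failed
post_c = joint[(True, True)] / (joint[(True, True)] + joint[(True, False)])
print(f"P(C fails)            = {prior_c:.3f}")
print(f"P(C fails | B failed) = {post_c:.3f}")
```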

12.
The objective of meat inspection is to promote animal and public health by preventing, detecting, and controlling hazards originating from animals. With improvements in the sanitary level of pig herds, the hazard profile has shifted, and the inspection procedures no longer target the major foodborne pathogens (i.e., they are not risk based). Additionally, carcass manipulations performed when searching for macroscopic lesions can lead to cross-contamination. We therefore developed a stochastic model to quantitatively describe cross-contamination when consecutive carcasses are submitted to classic inspection procedures. The microbial hazard used to illustrate the model was Salmonella, the data set was obtained from Brazilian slaughterhouses, and some simplifying assumptions were made. The model predicted that, due to cross-contamination during inspection, the prevalence of contaminated carcass surfaces increased from 1.2% to 95.7%, whereas the mean contamination on contaminated surfaces decreased from 1 log CFU/cm² to −0.87 log CFU/cm², and the standard deviation decreased from 0.65 to 0.19. These results are explained by the fact that, through carcass manipulations with hands, knives, and hooks, including the cutting of contaminated lymph nodes, Salmonella is transferred to previously uncontaminated carcasses, but in small quantities. These small quantities can easily go undetected during sampling. Sensitivity analyses gave insight into the model performance and showed that the touching and cutting of lymph nodes during inspection can be an important source of carcass contamination. The model can serve as a tool to support discussions on the modernization of pig carcass inspection.
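The qualitative mechanism, prevalence rising while the mean load falls, can be reproduced with a toy Monte Carlo transfer chain. The transfer fractions and initial contamination below are assumptions, not the paper's fitted parameters:

```python
import numpy as np

rng = np.random.default_rng(7)

N = 2000            # consecutive carcasses on the line
PREV_IN = 0.012     # incoming prevalence of contaminated surfaces
PICKUP = 0.05       # fraction of carcass load picked up by hands/knife/hook
DEPOSIT = 0.02      # fraction of the tool load deposited on the next carcass

# Incoming surface loads in CFU (lognormal on the contaminated fraction);
# all parameters are illustrative.
load = np.where(rng.random(N) < PREV_IN, 10 ** rng.normal(3.0, 0.65, N), 0.0)

tool = 0.0  # CFU carried on hands, knives, and hooks
for i in range(N):
    deposited = DEPOSIT * tool       # tool contaminates the current carcass
    tool -= deposited
    load[i] += deposited
    picked = PICKUP * load[i]        # manipulation picks up part of the load
    load[i] -= picked
    tool += picked

contaminated = load > 0
print(f"prevalence after inspection: {contaminated.mean():.1%}")
print(f"mean log10 CFU on contaminated carcasses: "
      f"{np.log10(load[contaminated]).mean():.2f}")
```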

13.
Determining the utilization of a set of machines on the shop floor of a production system by the ratio delay method is a well-known practice in conventional time and motion studies. In this method, the desired reliability of the study is preset, and data are then collected to achieve that reliability. The reliability is not continuously reviewed while the study is conducted, nor is the benefit-cost ratio of the ongoing study measured. In this paper, I describe a method to determine the utilization of a set of production facilities while continuously reviewing the reliability achieved at the end of each survey period. Random observations of machine status generate a utilization profile of the set of machines on a continuous time scale. Based on the benefit-cost ratio, the method indicates when the study should be stopped in a real-time setting. The study is substantiated with a case study and an indication of further research.
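The classical work-sampling (ratio delay) machinery behind this is the binomial sample size formula n = z²p(1−p)/e² and its inverse, the precision achieved after n observations, recomputed at the end of each survey period. A sketch with assumed observation counts:

```python
import math

def required_observations(p_est, precision, conf_z=1.96):
    """Observations needed so the utilization estimate p is within
    +/- precision (absolute) at the given confidence level."""
    return math.ceil(conf_z**2 * p_est * (1 - p_est) / precision**2)

def achieved_precision(p_est, n, conf_z=1.96):
    """Half-width actually achieved after n random observations,
    i.e., the reliability reviewed at the end of each survey period."""
    return conf_z * math.sqrt(p_est * (1 - p_est) / n)

# Plan: machines look ~80% utilized; want +/- 3% at 95% confidence
print("planned:", required_observations(0.80, 0.03), "observations")

# Continuous review: after each period, recompute precision so far
busy, total = 0, 0
for period_busy, period_total in [(130, 160), (262, 320), (395, 480)]:
    busy, total = busy + period_busy, total + period_total
    p = busy / total
    print(f"n={total}: utilization {p:.3f} +/- {achieved_precision(p, total):.3f}")
```

In the paper's scheme, this per-period precision would be weighed against the marginal cost of further observations to decide when to stop.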

14.
How can we best allocate limited defensive resources to reduce terrorism risks? Dillon et al.'s Antiterrorism Risk-Based Decision Aid (ARDA) system provides a useful point of departure for addressing this crucial question: a real-world system that calculates risk reduction scores for different portfolios of risk-reducing countermeasures and uses them to rank-order possible risk mitigation alternatives for Navy facilities. This comment points out some potential limitations of any scoring system that does not take into account risk externalities, interdependencies among threats, uncertainties that are correlated across targets, and attacker responses to alternative allocations of defensive resources. In at least some simple situations, allocations based on risk reduction scores and comparisons can inadvertently increase risks by providing intelligent attackers with valuable information, or they can fail to reduce risks as effectively as nonscoring, optimization-based approaches. These limitations of present scoring methods present exciting technical challenges and opportunities for risk analysts to develop improved methods for protecting facilities and infrastructure against terrorist threats.

15.
The aging domestic oil production infrastructure represents a high risk to the environment because of the types of fluids being handled (oil and brine) and the potential for accidental release of these fluids into sensitive ecosystems. Currently, there is no quantitative risk model directly applicable to onshore oil exploration and production (E&P) facilities. We report on a probabilistic reliability model created for such facilities. Reliability theory, failure modes and effects analysis (FMEA), and event trees were used to develop model estimates of the failure probability of typical oil production equipment. Monte Carlo simulation was used to translate uncertainty in input parameter values into uncertainty in the model output. The predicted failure rates were calibrated to available failure rate information by adjusting the probability density function parameters used as random variates in the Monte Carlo simulations; the mean and standard deviation of the normal distributions from which the Weibull characteristic life was drawn served as the adjustable calibration parameters. The model was applied to oil production leases in the Tallgrass Prairie Preserve, Oklahoma. We present the estimated failure probability due to the combination of the most significant failure modes associated with each type of equipment (pumps, tanks, and pipes). The results show that the estimated probability of failure for tanks is about the same as that for pipes, while pumps have a much lower failure probability. The model can provide the equipment reliability information needed for proactive risk management at the lease level, supplying quantitative grounds for allocating maintenance resources to high-risk equipment so as to minimize both lost production and ecosystem damage.
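The Monte Carlo core of such a model is straightforward: draw the Weibull characteristic life from a normal distribution (the calibration knob described in the abstract) and average the resulting failure probabilities. All parameter values below are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

def failure_probability(t_years, eta_mean, eta_sd, beta, n_sims=100_000):
    """Monte Carlo failure probability by time t for one equipment class.

    The Weibull characteristic life eta is a normal random variate
    (truncated away from zero); beta is the Weibull shape parameter.
    """
    eta = np.maximum(rng.normal(eta_mean, eta_sd, n_sims), 1e-6)
    return np.mean(1.0 - np.exp(-(t_years / eta) ** beta))

# Illustrative parameters per equipment class (not the calibrated values)
for name, eta_mean, eta_sd, beta in [
    ("pump", 40.0, 8.0, 1.5),
    ("tank", 25.0, 5.0, 2.0),
    ("pipe", 25.0, 6.0, 2.0),
]:
    p = failure_probability(10.0, eta_mean, eta_sd, beta)
    print(f"{name}: P(failure within 10 yr) = {p:.3f}")
```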

16.
The JFDA applies border control for Salmonella Typhimurium and Salmonella Enteritidis in frozen poultry products. A quantitative microbial risk assessment (QMRA) model was developed to evaluate the effectiveness of this system in controlling the risk to consumers. The model consists of three modules: consumer phase, risk estimation, and risk reduction. The model inputs were the occurrence of Salmonella in different types of imported poultry products, the limit of detection (LOD) of the Rapid'Salmonella test, the number of samples tested per batch, and the criteria for rejection. The model outputs were the public health impact, expressed as the minimum relative residual risk (MRRR) given the refusal of batches, and the percentage of batches not compliant with the microbiological criteria (BNMC) for rejection. To estimate the overall MRRR of the border control, the estimated country- and product-specific MRRRs were summarized and weighted by the total imports of each product from each country. The current border control, based on one sample per batch, gives an overall MRRR of 27%; alternative scenarios based on three and five samples per batch give 12% and 8%, respectively. Overall, the higher the prevalence and/or concentration of Salmonella in imported products, the more likely batches are to be rejected. For products with up-to-date occurrence data, the estimated BNMC was similar to the observed proportion of rejected batches. The lack of data on Salmonella concentrations in poultry products from different countries is the major source of uncertainty in the model and limits our ability to obtain valid estimates of the absolute risk.
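The sampling logic behind the MRRR can be sketched as follows: a batch passes the border if all n samples test negative, so the residual risk ratio falls as n grows. The prevalence distribution below is assumed, and the sketch will not reproduce the paper's 27%/12%/8% figures:

```python
import numpy as np

def p_accept(p_detect, n_samples):
    """Batch passes border control if all n samples test negative."""
    return (1.0 - p_detect) ** n_samples

def mrrr(prevalences, n_samples):
    """Minimum relative residual risk: contaminated product reaching
    consumers with testing, relative to no testing at all. Assumes risk
    is proportional to batch prevalence (a simplification of the QMRA)."""
    prevalences = np.asarray(prevalences, dtype=float)
    residual = np.sum(prevalences * p_accept(prevalences, n_samples))
    return residual / np.sum(prevalences)

# Illustrative detectable within-batch prevalences across imported batches
batch_prev = np.random.default_rng(5).beta(0.5, 6.0, size=2000)

for n in (1, 3, 5):
    print(f"{n} sample(s)/batch: MRRR = {mrrr(batch_prev, n):.0%}")
```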

17.
The purpose of this investigation was to estimate excess lifetime risk of lung cancer death resulting from occupational exposure to hexavalent-chromium-containing dusts and mists. The mortality experience in a previously studied cohort of 2,357 chromate chemical production workers with 122 lung cancer deaths was analyzed with Poisson regression methods. Extensive records of air samples evaluated for water-soluble total hexavalent chromium were available for the entire employment history of this cohort. Six different models of exposure-response for hexavalent chromium were evaluated by comparing deviances and inspection of cubic splines. Smoking (pack-years) imputed from cigarette use at hire was included in the model. Lifetime risks of lung cancer death from exposure to hexavalent chromium (assuming up to 45 years of exposure) were estimated using an actuarial calculation that accounts for competing causes of death. A linear relative rate model gave a good and readily interpretable fit to the data. The estimated rate ratio for 1 mg/m³-yr of cumulative exposure to hexavalent chromium (as CrO3), with a lag of five years, was RR=2.44 (95% CI=1.54-3.83). The excess lifetime risk of lung cancer death from exposure to hexavalent chromium at the current OSHA permissible exposure limit (PEL) (0.10 mg/m³) was estimated to be 255 per 1,000 (95% CI: 109-416). This estimate is comparable to previous estimates by U.S. EPA, California EPA, and OSHA using different occupational data. Our analysis predicts that current occupational standards for hexavalent chromium permit a lifetime excess risk of dying of lung cancer that exceeds 1 in 10, which is consistent with previous risk assessments.
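An actuarial excess lifetime risk calculation of this type iterates a life table, inflating the baseline lung cancer hazard by the linear relative rate while competing causes thin the cohort. The baseline hazards below are rough assumptions, so the output is illustrative rather than a reproduction of the 255-per-1,000 estimate:

```python
import numpy as np

def excess_lifetime_risk(slope, exposure_rate, years_exposed=45,
                         lag=5, start_age=20, end_age=85):
    """Excess lifetime risk of lung cancer death under a linear relative
    rate model RR = 1 + slope * cumulative_exposure (with a lag).

    Baseline hazards are rough illustrative values per person-year, not
    the life tables used in the paper.
    """
    ages = np.arange(start_age, end_age)
    h_lung = 5e-6 * np.exp(0.09 * (ages - 20))   # lung cancer mortality
    h_all = 1e-4 * np.exp(0.085 * (ages - 20))   # all-cause mortality

    def lifetime_risk(exposed):
        surv, risk = 1.0, 0.0
        for i, age in enumerate(ages):
            cum = 0.0
            if exposed:   # lagged cumulative exposure, capped at 45 years
                cum = exposure_rate * min(max(age - start_age - lag, 0),
                                          years_exposed)
            h = h_lung[i] * (1.0 + slope * cum)
            risk += surv * h                 # dies of lung cancer this year
            surv *= 1.0 - (h_all[i] + h)     # competing causes thin the cohort
        return risk

    return lifetime_risk(True) - lifetime_risk(False)

# RR = 2.44 per mg/m^3-yr implies slope = 1.44; PEL exposure 0.10 mg/m^3
elr = excess_lifetime_risk(slope=1.44, exposure_rate=0.10)
print(f"excess lifetime risk per 1,000 workers: {1000 * elr:.0f}")
```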

18.
Helicobacter pylori is a microaerophilic, gram‐negative bacterium that is linked to adverse health effects including ulcers and gastrointestinal cancers. The goal of this analysis is to develop the necessary inputs for a quantitative microbial risk assessment (QMRA) needed to develop a potential guideline for drinking water at the point of ingestion (e.g., a maximum contaminant level, or MCL) that would be protective of human health to an acceptable level of risk while considering sources of uncertainty. Using infection and gastric cancer as two discrete endpoints, and calculating dose‐response relationships from experimental data on humans and monkeys, we perform both a forward and reverse risk assessment to determine the risk from current reported surface water concentrations of H. pylori and an acceptable concentration of H. pylori at the point of ingestion. This approach represents a synthesis of available information on human exposure to H. pylori via drinking water. A lifetime risk of cancer model suggests that a MCL be set at <1 organism/L given a 5‐log removal treatment because we cannot exclude the possibility that current levels of H. pylori in environmental source waters pose a potential public health risk. Research gaps include pathogen occurrence in source and finished water, treatment removal rates, and determination of H. pylori risks from other water sources such as groundwater and recreational water.
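The forward and reverse calculations can be sketched with an exponential dose-response model; the dose-response parameter, consumption assumptions, and infection-to-cancer conversion below are assumed values, not the fitted inputs of this QMRA:

```python
import numpy as np

# Exponential dose-response: P(infection) = 1 - exp(-r * dose).
# All parameter values here are illustrative assumptions.
R_INFECTION = 0.02          # per-organism infection probability
LITERS_PER_DAY = 1.0
DAYS_PER_YEAR = 365
P_CANCER_GIVEN_INF = 0.01   # lifetime gastric cancer risk if infected

def annual_infection_risk(conc_per_liter, log_removal):
    dose = conc_per_liter * 10 ** (-log_removal) * LITERS_PER_DAY
    p_day = 1.0 - np.exp(-R_INFECTION * dose)
    return 1.0 - (1.0 - p_day) ** DAYS_PER_YEAR

# Forward: risk at a reported source concentration with 5-log treatment
print(f"annual risk at 10/L: {annual_infection_risk(10.0, 5.0):.2e}")

# Reverse: smallest source concentration breaching a target cancer risk
target = 1e-4
for conc in 10.0 ** np.arange(-2, 6):
    lifetime = 1 - (1 - annual_infection_risk(conc, 5.0)) ** 70
    if lifetime * P_CANCER_GIVEN_INF > target:
        print(f"target exceeded at about {conc:g} organisms/L")
        break
```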

19.
In this article, we analyze a location model where facilities may be subject to disruptions. Customers do not have advance information about whether a given facility is operational or not, and thus may have to visit several facilities before finding an operational one. The objective is to locate a set of facilities to minimize the total expected cost of customer travel. We decompose the total cost into travel, reliability, and information components. This decomposition allows us to put a value on the advance information about the states of facilities and compare it to the reliability and travel cost components, which allows a decision maker to evaluate which part of the system would benefit the most from improvements. The structure of optimal solutions is analyzed, with two interesting effects identified: facility centralization and co‐location; both effects appear to be stronger than in the complete information case, where the status of each facility is known in advance.
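Without advance information, the customer tries facilities in order of distance and pays the travel cost of each attempt; the expected cost can be computed directly. A sketch under the simplifying assumption that each attempt is measured from the customer's location:

```python
import numpy as np

def expected_cost(distances, p_fail, penalty):
    """Expected travel cost when a customer tries facilities in order of
    distance and stops at the first operational one (no advance info).

    `penalty` is the cost incurred if every facility is down. Each trip is
    assumed to start from the customer's location, a simplification of
    the paper's travel model.
    """
    order = np.argsort(distances)
    d = np.asarray(distances, dtype=float)[order]
    q = np.asarray(p_fail, dtype=float)[order]
    cost, p_all_down = 0.0, 1.0
    for dist, qf in zip(d, q):
        # The customer travels here only if all closer facilities failed;
        # under complete information failed sites would be skipped, so
        # this term prices the missing advance information.
        cost += p_all_down * dist
        p_all_down *= qf
    return cost + p_all_down * penalty

# One customer, three candidate facilities (illustrative numbers)
print(f"E[cost] = {expected_cost([4.0, 7.0, 12.0], [0.1, 0.2, 0.15], 50.0):.2f}")
```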

20.
Process plants deal with hazardous (highly flammable and toxic) chemicals at extreme conditions of temperature and pressure. Proper inspection and maintenance of these facilities is paramount to safe and continuous operation. This article proposes a risk-based methodology for integrity and inspection modeling (RBIIM) to ensure safe and fault-free operation of the facility. The methodology uses a gamma distribution to model material degradation and a Bayesian updating method to improve the distribution based on actual inspection results. The method deals with the two cases of perfect and imperfect inspections. The measurement error resulting from imperfect inspections is modeled as a zero-mean, normally distributed random process. The risk is calculated from the probability of failure, with the consequence assessed in terms of cost as a function of time. The risk function is used to determine an optimal inspection and replacement interval, which is subsequently used in the design of an integrity inspection plan. Two case studies are presented: the maintenance of an autoclave and the maintenance of a pipeline segment. For the autoclave, the interval between two successive inspections is found to be 19 years; for the pipeline, the next inspection is due in 5 years. Measurements taken at inspections are used to estimate a new degradation rate, which can then be used to update the failure distribution function.
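A common way to realize such a scheme is a stationary gamma process for metal loss, with the inspection interval chosen to minimize annualized risk plus inspection cost. The parameters and costs below are illustrative, and the abstract's gamma-distributed degradation is modeled here as a gamma process:

```python
import numpy as np
from scipy.stats import gamma

# Stationary gamma process: cumulative metal loss by year t is
# Gamma(shape = ALPHA * t, scale = BETA). Parameters and costs are
# illustrative, not the case-study values.
ALPHA, BETA = 2.0, 0.15   # mean loss rate ALPHA * BETA = 0.3 mm/year
ALLOWANCE = 6.0           # mm of corrosion allowance before failure
COST_FAILURE = 5e6        # consequence cost of an in-service failure
COST_INSPECTION = 2e4

def p_failure_by(t):
    """P(cumulative metal loss exceeds the allowance by year t)."""
    return gamma.sf(ALLOWANCE, a=ALPHA * t, scale=BETA)

def annualized_cost(interval):
    """Expected risk cost plus inspection cost, averaged per year."""
    return (p_failure_by(interval) * COST_FAILURE + COST_INSPECTION) / interval

intervals = np.arange(1, 41)
best = intervals[np.argmin([annualized_cost(t) for t in intervals])]
print(f"optimal inspection interval: {best} years "
      f"(P_fail by then = {p_failure_by(best):.4f})")
```

Bayesian updating would enter by revising ALPHA and BETA after each (possibly noisy) wall-thickness measurement, then recomputing the interval.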
