Similar Articles
20 similar articles found (search time: 15 ms)
1.
Point source pollution is one of the main threats to regional environmental health. Based on a water quality model, a methodology to assess the regional risk of point source pollution is proposed. The assessment procedure includes five parts: (1) identifying risk source units and estimating source emissions using Monte Carlo algorithms; (2) observing hydrological and water quality data of the assessed area, and evaluating the selected water quality model; (3) screening out the assessment endpoints and analyzing receptor vulnerability with the Choquet fuzzy integral algorithm; (4) using the water quality model introduced in the second step to predict pollutant concentrations for various source emission scenarios and analyzing hazards of risk sources; and finally, (5) using the source hazard values and receptor vulnerability scores to estimate overall regional risk. The proposed method, based on the Water Quality Analysis Simulation Program (WASP), was applied in the region of the Taipu River, which is in the Taihu Basin, China. Results of source hazard and receptor vulnerability analysis allowed us to describe aquatic ecological, human health, and socioeconomic risks individually, and also integrated risks in the Taipu region, from a series of risk curves. Risk contributions of sources to receptors were ranked, and the spatial distribution of risk levels was presented. By changing the input conditions, we were able to estimate risks for a range of scenarios. Thus, the proposed procedure may also be used by decisionmakers for long‐term dynamic risk prediction.
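The core aggregation in steps (1) and (5) — Monte Carlo sampling of source emissions and combining source hazard values with receptor vulnerability scores — can be sketched as follows. The distribution, rates, and scores are illustrative assumptions, not values from the study:

```python
import random

def simulate_source_emissions(mean_rate, cv, n_samples, seed=0):
    """Monte Carlo sampling of a source's pollutant emission rate.
    A truncated-normal distribution is assumed here for illustration;
    the paper does not specify this form."""
    rng = random.Random(seed)
    sigma = cv * mean_rate
    return [max(0.0, rng.gauss(mean_rate, sigma)) for _ in range(n_samples)]

def regional_risk(source_hazards, receptor_vulnerabilities):
    """Overall regional risk as a sum over source-receptor pairs of
    hazard x vulnerability (a common aggregation form)."""
    return sum(h * v for h in source_hazards for v in receptor_vulnerabilities)

emissions = simulate_source_emissions(mean_rate=10.0, cv=0.2, n_samples=1000)
hazards = [e / 10.0 for e in emissions[:3]]  # toy hazard scores from emissions
vuln = [0.6, 0.3]                            # toy receptor vulnerability scores
print(round(regional_risk(hazards, vuln), 3))
```

Varying `mean_rate` or `cv` and re-running corresponds to the scenario analysis the abstract describes.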

2.
In the nuclear power industry, Level 3 probabilistic risk assessment (PRA) is used to estimate damage to public health and the environment if a severe accident leads to large radiological release. Current Level 3 PRA does not explicitly include social factors and, therefore, it is not possible to perform importance ranking of social factors for risk‐informing emergency preparedness, planning, and response (EPPR). This article offers a methodology for adapting the concept of social vulnerability, commonly used in natural hazard research, to the context of a severe nuclear power plant accident. The methodology has four steps: (1) calculating a hazard‐independent social vulnerability index for the local population; (2) developing a location‐specific representation of the maximum radiological hazard estimated from current Level 3 PRA, in a geographic information system (GIS) environment; (3) developing a GIS‐based socio‐technical risk map by combining the social vulnerability index and the location‐specific radiological hazard; and (4) conducting a risk importance measure analysis to rank the criticality of social factors based on their contribution to the socio‐technical risk. The methodology is applied using results from the 2012 Surry Power Station state‐of‐the‐art reactor consequence analysis. A radiological hazard model is generated from the MELCOR accident consequence code system, translated into a GIS environment, and combined with the Centers for Disease Control and Prevention (CDC) social vulnerability index (SVI). This research creates an opportunity to explicitly consider and rank the criticality of location‐specific SVI themes based on their influence on risk, providing input for EPPR.

3.
The use of appropriate approaches to produce risk maps is critical in landslide disaster management. The aim of this study was to investigate and compare the stability index mapping (SINMAP) and spatial multicriteria evaluation (SMCE) models for landslide risk modeling in Rwanda. SINMAP used a digital elevation model in conjunction with physical soil parameters to determine the factor of safety, while SMCE used six layers of landslide conditioning factors. In total, 155 past landslide locations were used for training and model validation. The receiver operating characteristic (ROC) curve and three statistical estimators—accuracy, precision, and the root mean square error (RMSE)—were used to validate and compare the predictive capabilities of the two models. The results showed that SMCE performed better than SINMAP: the area under the curve (AUC) values were 0.883 and 0.798, respectively, for SMCE and SINMAP, and the SMCE model also produced the higher accuracy and precision values of 0.770 and 0.734, respectively. For RMSE, SMCE produced better predictions than SINMAP (0.332 and 0.398, respectively). The overall comparison of results confirmed that both models are promising approaches for landslide risk prediction in central‐east Africa.
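The validation metrics reported above (AUC, accuracy, precision, RMSE) can be computed from a labeled validation set as in this minimal sketch; the landslide/non-landslide points and susceptibility scores are toy values, not the study's data:

```python
import math

def auc(labels, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) form:
    the probability a positive outscores a negative (ties count half)."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def accuracy(labels, preds):
    return sum(l == p for l, p in zip(labels, preds)) / len(labels)

def precision(labels, preds):
    tp = sum(l == 1 and p == 1 for l, p in zip(labels, preds))
    fp = sum(l == 0 and p == 1 for l, p in zip(labels, preds))
    return tp / (tp + fp)

def rmse(labels, scores):
    return math.sqrt(sum((l - s) ** 2 for l, s in zip(labels, scores)) / len(labels))

# Toy validation points: 1 = landslide, 0 = non-landslide (illustrative)
y      = [1, 1, 1, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.5, 0.2, 0.1]      # model susceptibility scores
preds  = [1 if s >= 0.5 else 0 for s in scores]
print(round(auc(y, scores), 3), round(accuracy(y, preds), 3),
      round(precision(y, preds), 3), round(rmse(y, scores), 3))
```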

4.
Over the past decade, terrorism risk has become a prominent consideration in protecting the well‐being of individuals and organizations. More recently, there has been interest in not only quantifying terrorism risk, but also placing it in the context of an all‐hazards environment in which consideration is given to accidents and natural hazards, as well as intentional acts. This article discusses the development of a regional terrorism risk assessment model designed for this purpose. The approach taken is to model terrorism risk as a dependent variable, expressed in expected annual monetary terms, as a function of attributes of population concentration and critical infrastructure. This allows for an assessment of regional terrorism risk in and of itself, as well as in relation to man‐made accident and natural hazard risks, so that mitigation resources can be allocated in an effective manner. The adopted methodology incorporates elements of two terrorism risk modeling approaches (event‐based models and risk indicators), producing results that can be utilized at various jurisdictional levels. The validity, strengths, and limitations of the model are discussed in the context of a case study application within the United States.

5.
This article describes the development of a generic loss assessment methodology, which is applicable to earthquake and windstorm perils worldwide. The latest information regarding hazard estimation is first integrated with the parameters that best describe the intensity of the action of both windstorms and earthquakes on building structures, for events with defined average return periods or recurrence intervals. The subsequent evaluation of building vulnerability (damageability) under the action of both earthquake and windstorm loadings utilizes information on damage and loss from past events, along with an assessment of the key building properties (including age and quality of design and construction), to characterize the ability of buildings to withstand such loadings and hence to assign a building type to the particular risk or portfolio of risks. This predicted damage information is then translated into risk-specific mathematical vulnerability functions, which enable numerical evaluation of the probability of building damage arising at various defined levels. By assigning cost factors to the defined damage levels, the associated computation of total loss at a given level of hazard may be achieved. This developed methodology is universal in the sense that it may be applied successfully to buildings situated in a variety of earthquake and windstorm environments, ranging from very low to extreme levels of hazard. As a loss prediction tool, it enables accurate estimation of losses from potential scenario events linked to defined return periods and, hence, can greatly assist risk assessment and planning.
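One standard way to realize such vulnerability functions is a set of lognormal fragility curves whose damage-state probabilities are weighted by cost factors. The sketch below uses that common form with illustrative medians, dispersion, and cost factors; it is not the paper's fitted functions:

```python
import math

def normal_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def damage_state_probs(intensity, medians, beta=0.5):
    """Probabilities of discrete damage states (none..severe) from
    lognormal fragility curves; medians and dispersion beta are
    illustrative assumptions."""
    exceed = [normal_cdf(math.log(intensity / m) / beta) for m in medians]
    exceed = [1.0] + exceed + [0.0]          # P(exceed lowest state) = 1
    return [exceed[i] - exceed[i + 1] for i in range(len(exceed) - 1)]

def expected_loss_ratio(intensity, medians, cost_factors, beta=0.5):
    """Expected loss = sum over damage states of P(state) x cost factor."""
    probs = damage_state_probs(intensity, medians, beta)
    return sum(p * c for p, c in zip(probs, cost_factors))

medians = [0.2, 0.5, 1.0]         # intensity medians: slight/moderate/severe
costs   = [0.0, 0.05, 0.3, 1.0]   # repair cost ratio per state (none..severe)
print(round(expected_loss_ratio(0.6, medians, costs), 3))
```

Evaluating `expected_loss_ratio` at the intensity associated with a defined return period gives the scenario loss described in the abstract.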

6.
Modeling Uncertainties in Mining Pillar Stability Analysis   (cited by: 1; self-citations: 0; citations by others: 1)
Many countries are now facing problems related to their past mining activities. One of the greatest concerns is potential surface instability. In areas where a room-and-pillar extraction method was used, deterministic methodologies are generally applied to assess the hazard of surface collapse. However, those methodologies cannot take into account all the uncertainties inherent in any hazard analysis. Through the practical example of assessing the stability of a single pillar in a very simple mining layout, this article introduces a logical framework that can be used to incorporate the different kinds of uncertainties related to data and models, as well as to specific expert choices, in the hazard or risk analysis process. Practical recommendations and efficient tools are also provided to help engineers and experts in their daily work.

7.
In the general framework of quantitative methods for natural‐technological (NaTech) risk analysis, a specific methodology was developed for assessing risks caused by hazardous substances released due to earthquakes. The contribution of accidental scenarios initiated by seismic events to the overall industrial risk was assessed in three case studies derived from the actual plant layouts of existing oil refineries. Several specific vulnerability models for different equipment classes were compared and assessed, and the effect of differing structural resistances of process equipment on the final risk results was also investigated. The main factors influencing the final risk values were the models for equipment vulnerability and the assumptions for the reference damage states of the process equipment. The analysis of the case studies showed that in seismic zones the additional risk deriving from earthquake damage may be more than one order of magnitude higher than that associated with internal failure causes. The critical equipment was determined to be mainly pressurized tanks, even though atmospheric tanks were more vulnerable to loss of containment. Failure of minor process equipment with a limited hold‐up of hazardous substances (such as pumps) was shown to have limited influence on the final values of the risk increase caused by earthquakes.

8.
In this study, a new approach integrating machine learning (ML) models with the analytic hierarchy process (AHP) method was proposed to develop a holistic flood risk assessment map. Flood susceptibility maps were created using ML techniques, and AHP was utilized to combine flood vulnerability and exposure criteria. We selected Quang Binh province of Vietnam as a case study and collected available data, including 696 flooding locations from historical flooding events in 2007, 2010, 2016, and 2020, and flood influencing factors of elevation, slope, curvature, flow direction, flow accumulation, distance from river, river density, land cover, geology, and rainfall. These data were used to construct training and testing datasets. The susceptibility models were validated and compared using statistical techniques. An integrated flood risk assessment framework was proposed to incorporate flood hazard (flood susceptibility), flood exposure (distance from river, land use, population density, and rainfall), and flood vulnerability (poverty rate, number of freshwater stations, road density, number of schools, and healthcare facilities). Model validation suggested that the deep learning model had the best performance (AUC = 0.984) compared with the MultiBoostAB ensemble (0.958), Random SubSpace ensemble (0.962), and credal decision tree (0.918). The final flood risk map shows 5075 ha (0.63%) in extremely high risk, 47,955 ha (5.95%) in high risk, 40,460 ha (5.02%) in medium risk, 431,908 ha (53.55%) in low risk, and 281,127 ha (34.86%) in very low risk areas. The present study highlights that the integration of ML models and AHP is a promising framework for mapping flood risks in flood-prone areas.
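The AHP step — deriving criteria weights from pairwise comparisons and checking their consistency — can be sketched with the standard geometric-mean approximation. The comparison matrix below is illustrative, not the study's expert judgments:

```python
from math import prod

RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}  # Saaty's random index

def ahp_weights(matrix):
    """Priority weights from a pairwise comparison matrix using the
    geometric-mean (row) method, a standard AHP approximation."""
    n = len(matrix)
    g = [prod(row) ** (1.0 / n) for row in matrix]
    s = sum(g)
    return [x / s for x in g]

def consistency_ratio(matrix):
    """CR = ((lambda_max - n)/(n - 1)) / RI; CR < 0.1 is the
    conventional acceptability threshold."""
    n = len(matrix)
    w = ahp_weights(matrix)
    aw = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam = sum(aw[i] / w[i] for i in range(n)) / n   # estimate of lambda_max
    return (lam - n) / (n - 1) / RI[n] if n > 2 else 0.0

# Illustrative pairwise comparisons of three exposure criteria
A = [[1, 3, 5],
     [1/3, 1, 3],
     [1/5, 1/3, 1]]
w = ahp_weights(A)
print([round(x, 3) for x in w], round(consistency_ratio(A), 4))
```

The resulting weights would then multiply the normalized exposure and vulnerability layers cell by cell to produce the combined map.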

9.
In this article, the use of time series of satellite imagery for flood hazard mapping and flood risk assessment is presented. Flooded areas are extracted from satellite images of the flood‐prone territory, and a maximum flood extent image is produced for each flood event. These maps are then fused to determine the relative frequency of inundation (RFI). The study shows that RFI values and relative water depth exhibit the same probabilistic distribution, which is confirmed by the Kolmogorov‐Smirnov test. The produced RFI map can be used as a flood hazard map, especially when flood modeling is complicated by lack of available data and high uncertainties. The derived RFI map is further used for flood risk assessment. The efficiency of the presented approach is demonstrated for the Katima Mulilo region (Namibia): a time series of Landsat‐5/7 satellite images acquired from 1989 to 2012 is processed to derive the RFI map. The following direct damage categories are considered for flood risk assessment: dwelling units, roads, health facilities, and schools. The produced flood risk map shows that the risk is distributed uniformly over the region, and the cities and villages with the highest risk are identified. The proposed approach has minimal data requirements, and RFI maps can be generated rapidly to assist rescuers and decisionmakers in emergencies. Its limitations include a strong dependence on the available data sets and limitations in simulations with extrapolated water depth values.
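Deriving an RFI map from a stack of per-event flood-extent masks reduces to a per-cell frequency count, as in this sketch (toy binary masks, not Landsat-derived data):

```python
def relative_frequency_of_inundation(flood_masks):
    """RFI per cell = fraction of observed flood events in which the
    cell was inundated; each mask is a binary grid for one event."""
    n_events = len(flood_masks)
    rows, cols = len(flood_masks[0]), len(flood_masks[0][0])
    return [[sum(m[r][c] for m in flood_masks) / n_events
             for c in range(cols)] for r in range(rows)]

# Three toy maximum-flood-extent masks (1 = flooded) on a 2x3 grid
masks = [
    [[1, 1, 0], [0, 0, 0]],
    [[1, 0, 0], [1, 0, 0]],
    [[1, 1, 1], [0, 0, 0]],
]
rfi = relative_frequency_of_inundation(masks)
print(rfi)  # cell (0, 0) was flooded in all three events -> RFI 1.0
```

In the article's workflow the RFI grid then acts directly as the hazard layer, overlaid with the damage-category layers for risk assessment.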

10.
Ten years ago, the National Academy of Sciences released its risk assessment/risk management (RA/RM) “paradigm,” which served to crystallize much of the early thinking about these concepts. By defining RA as a four-step process, operationally independent from RM, the paradigm has presented society with a scheme, or a conceptually common framework, for addressing many risky situations (e.g., carcinogens, noncarcinogens, and chemical mixtures). The procedure has facilitated decision making in a wide variety of situations and has identified the most important research needs. The past decade, however, has revealed areas where additional progress is needed. These include addressing the appropriate interaction (not isolation) between RA and RM, improving the methods for assessing risks from mixtures, dealing with “adversity of effect,” deciding whether “hazard” should imply an exposure to environmental conditions or to laboratory conditions, and evolving the concept to include both health and ecological risk. Interest in and expectations of risk assessment are increasing rapidly. The emerging concept of “comparative risk” (i.e., distinguishing between large risks and smaller risks that may be qualitatively different) is at a level comparable to that held by the concept of “risk” just 10 years ago. Comparative risk stands in need of a paradigm of its own, especially given the current economic limitations. “Times are tough; Brother, can you paradigm?”

11.
Zhou Y, Liu M. Risk Analysis, 2012, 32(3): 566-577
With the rapid development of industry in China, the number of establishments proposed or under construction increases year by year, and many handle flammable, explosive, toxic, or otherwise dangerous substances. Accidents such as fires, explosions, and toxic releases inevitably happen, and accidents at these major hazard installations in cities cause large numbers of casualties and property losses. It is therefore increasingly important to analyze the risk of major hazards in cities realistically and to plan and utilize the surrounding land based on the risk analysis results, thereby reducing the hazards. This article proposes a theoretical system for risk assessment of major hazards in cities and analyzes the major hazard risk for an entire city quantitatively. Risks of various major accidents are considered together, their superposition effect is analyzed, individual risk contours for the entire city are drawn, and the city's risk level is assessed using "as low as reasonably practicable" (ALARP) guidelines. After the city's individual risk distribution is obtained, risk zones are delineated according to the corresponding HSE individual risk criteria, and land-use planning suggestions are proposed. Finally, a city in China is used as an example to illustrate the risk assessment process for a city's major hazards and its application in urban land-use planning. The proposed method has theoretical and practical significance for establishing and improving major hazard risk analysis and urban land-use planning: major urban public risk is avoided, and the land is utilized in the best possible way to obtain the maximum benefit from its use.
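The superposition of individual risk from several major hazards at one location, and the subsequent ALARP-style zoning, can be sketched as follows. The threshold values and per-source contributions are illustrative assumptions, not the criteria or data used in the article:

```python
def total_individual_risk(per_source_risks):
    """Superpose individual risks from independent major hazards.
    For small annual frequencies this is approximately their sum;
    the exact complement form below cannot exceed 1."""
    p_safe = 1.0
    for r in per_source_risks:
        p_safe *= (1.0 - r)
    return 1.0 - p_safe

def risk_zone(ir, upper=1e-4, broadly_acceptable=1e-6):
    """Classify a location against ALARP-style criteria per year
    (threshold values here are illustrative)."""
    if ir > upper:
        return "unacceptable"
    if ir > broadly_acceptable:
        return "ALARP"
    return "broadly acceptable"

risks_at_point = [3e-5, 5e-6, 1e-6]   # toy contributions from three hazards
ir = total_individual_risk(risks_at_point)
print(f"{ir:.2e}", risk_zone(ir))
```

Evaluating this at every grid cell and contouring the result yields the individual risk contours and risk zones described above.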

12.
Land subsidence risk assessment (LSRA) is a multi‐attribute decision analysis (MADA) problem and is often characterized by both quantitative and qualitative attributes with various types of uncertainty. Therefore, the problem needs to be modeled and analyzed using methods that can handle uncertainty. In this article, we propose an integrated assessment model based on the evidential reasoning (ER) algorithm and fuzzy set theory. The assessment model is structured as a hierarchical framework that regards land subsidence risk as a composite of two key factors: hazard and vulnerability. These factors can be described by a set of basic indicators defined by assessment grades with attributes for transforming both numerical data and subjective judgments into a belief structure. The factor‐level attributes of hazard and vulnerability are combined using the ER algorithm, which is based on the information from a belief structure calculated by the Dempster‐Shafer (D‐S) theory, and a distributed fuzzy belief structure calculated by fuzzy set theory. The results from the combined algorithms yield distributed assessment grade matrices. The application of the model to the Xixi‐Chengnan area, China, illustrates its usefulness and validity for LSRA. The model utilizes a combination of all types of evidence, including all assessment information—quantitative or qualitative, complete or incomplete, and precise or imprecise—to provide assessment grades that define risk assessment on the basis of hazard and vulnerability. The results will enable risk managers to apply different risk prevention measures and mitigation planning based on the calculated risk states.
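Dempster's rule of combination, the core of the D-S evidence fusion underlying the ER step, can be illustrated for the simplified case where all focal elements are single assessment grades (full D-S theory also allows belief assigned to subsets of grades). The belief structures are toy values, not the study's indicator data:

```python
def dempster_combine(m1, m2):
    """Combine two basic belief assignments over singleton grades with
    Dempster's rule: agreeing mass is multiplied, conflicting mass K
    is discarded and the remainder renormalized by 1 - K."""
    combined = {}
    conflict = 0.0
    for a, p in m1.items():
        for b, q in m2.items():
            if a == b:
                combined[a] = combined.get(a, 0.0) + p * q
            else:
                conflict += p * q
    scale = 1.0 - conflict
    return {grade: mass / scale for grade, mass in combined.items()}

# Toy belief structures for a subsidence-risk grade from two indicators
hazard_belief = {"low": 0.2, "medium": 0.5, "high": 0.3}
vuln_belief   = {"low": 0.1, "medium": 0.6, "high": 0.3}
print(dempster_combine(hazard_belief, vuln_belief))
```

The combined structure concentrates mass on the grade both sources support, which is the qualitative behavior the ER algorithm exploits.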

13.
Major accident risks posed by chemical hazards have raised major social concerns in today's China. Land‐use planning has been adopted by many countries as one of the essential elements for accident prevention. This article proposes a method to assess major accident risks to support land‐use planning in the vicinity of chemical installations. The method is based on the definition of risk from the Accidental Risk Assessment Methodology for IndustrieS (ARAMIS) project and extends the application of its severity and vulnerability assessment tools: the ARAMIS severity and vulnerability indexes are employed to assess the severity and vulnerability levels, respectively, and a risk matrix is devised to support risk ranking and compatibility checking. The method consists of four main steps and is presented in geographical‐information‐system‐based maps. As an illustration, the proposed method is applied to the Dagushan Peninsula, China. The case study indicated that the method could not only aid risk regulation of existing land use, but also support future land‐use planning by offering alternatives or influencing plans at the development stage, and thus further enhance the role and influence of land‐use planning in accident prevention activities in China.

14.
We examine whether the risk characterization estimated by catastrophic loss projection models is sensitive to the revelation of new information regarding risk type. We use commercial loss projection models from two widely employed modeling firms to estimate the expected hurricane losses of Florida Atlantic University's building stock, both including and excluding secondary information regarding hurricane mitigation features that influence damage vulnerability. We then compare the results of the models without and with this revealed information and find that the revelation of additional, secondary information influences modeled losses for the windstorm‐exposed university building stock, primarily evidenced by meaningful percent differences in the loss exceedance output indicated after secondary modifiers are incorporated in the analysis. Secondary risk characteristics for the data set studied appear to have substantially greater impact on probable maximum loss estimates than on average annual loss estimates. While it may be intuitively expected for catastrophe models to indicate that secondary risk characteristics hold value for reducing modeled losses, the finding that the primary value of secondary risk characteristics is in reduction of losses in the “tail” (low probability, high severity) events is less intuitive, and therefore especially interesting. Further, we address the benefit‐cost tradeoffs that commercial entities must consider when deciding whether to undergo the data collection necessary to include secondary information in modeling. Although we assert the long‐term benefit‐cost tradeoff is positive for virtually every entity, we acknowledge short‐term disincentives to such an effort.

15.
Risk Analysis for Critical Asset Protection   (cited by: 2; self-citations: 0; citations by others: 2)
This article proposes a quantitative risk assessment and management framework that supports strategic asset-level resource allocation decision making for critical infrastructure and key resource protection. The proposed framework consists of five phases: scenario identification, consequence and criticality assessment, security vulnerability assessment, threat likelihood assessment, and benefit-cost analysis. Key innovations in this methodology include its initial focus on fundamental asset characteristics to generate an exhaustive set of plausible threat scenarios based on a target susceptibility matrix (which we refer to as asset-driven analysis) and an approach to threat likelihood assessment that captures adversary tendencies to shift their preferences in response to security investments based on the expected utilities of alternative attack profiles assessed from the adversary perspective. A notional example is provided to demonstrate an application of the proposed framework. Extensions of this model to support strategic portfolio-level analysis and tactical risk analysis are suggested.
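The adversary-shift idea — threat likelihoods that respond to security investments via adversary expected utility — can be sketched with a simple proportional-choice assumption. This is one of several response models used in adversarial risk analysis, not necessarily the article's exact formulation, and the attack profiles and numbers are hypothetical:

```python
def expected_utility(p_success, consequence, cost):
    """Adversary's expected utility for one attack profile."""
    return p_success * consequence - cost

def attack_likelihoods(utilities):
    """Relative threat likelihoods proportional to adversary expected
    utilities (a proportional-choice assumption)."""
    total = sum(utilities.values())
    return {attack: u / total for attack, u in utilities.items()}

# Toy attack profiles before and after a security investment that
# lowers the success probability of the first profile
before = {"profile A": expected_utility(0.6, 100, 10),
          "profile B": expected_utility(0.3, 80, 5)}
after  = {"profile A": expected_utility(0.2, 100, 10),
          "profile B": expected_utility(0.3, 80, 5)}
print(attack_likelihoods(before), attack_likelihoods(after))
```

Hardening profile A shifts the likelihood mass toward profile B, which is exactly the preference-shift behavior the framework is designed to capture.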

16.
An Approach to Vulnerability Analysis of Complex Industrial Systems   (cited by: 3; self-citations: 0; citations by others: 3)
Einarsson S, Rausand M. Risk Analysis, 1998, 18(5): 535-546
The concept of vulnerability of complex industrial systems is defined and discussed in relation to risk and system survivability. The discussion is illustrated by referring to a number of previous industrial accidents. The various risk factors, or threats, influencing an industrial system's vulnerability are classified and discussed. Both internal and external threats are covered. The general scope of vulnerability analysis is compared to traditional risk analysis approaches and main differences are illustrated. A general procedure for vulnerability analysis in two steps, including building of scenarios and preparation of relevant worksheets, is described and discussed.

17.
A simple procedure is proposed to quantify the tradeoff between a loss suffered from an illness due to exposure to a microbial pathogen and a loss due to a toxic effect, perhaps a different illness, induced by a disinfectant employed to reduce the microbial exposure. Estimates of these two types of risk as a function of disinfectant dose, together with their associated relative losses, provide the information needed to estimate the optimum dose of disinfectant that minimizes the total expected loss. The estimates of the optimum dose and expected relative total loss were similar regardless of whether the beta-Poisson, log-logistic, or extreme value function was used to model the risk of illness due to exposure to a microbial pathogen. This is because the optimum dose and resultant expected minimum loss depend upon the estimated slope (first derivative) of the models at low levels of risk, which appears to be similar for all three models. Similarly, the choice among these three models does not appear critical for estimating the slope at low levels of risk for the toxic effect induced by the disinfectant. The proposed procedure does not require absolute values for the losses due to microbial-induced or disinfectant-induced illness; only relative losses are required. All aspects of the problem are amenable to sensitivity analyses. The issue of risk/benefit tradeoffs, more appropriately called risk/risk tradeoffs, does not appear to be an insurmountable problem.
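The tradeoff can be sketched numerically: a beta-Poisson microbial risk that falls as the disinfectant dose inactivates pathogens, against a toxic-effect risk that rises with dose (a linear low-dose assumption), with the optimum found by grid search. All parameter values are illustrative assumptions, not the paper's fitted values:

```python
def beta_poisson_risk(n_pathogens, alpha=0.25, beta=50.0):
    """Beta-Poisson dose-response for illness from pathogen exposure
    (alpha and beta are illustrative, not fitted values)."""
    return 1.0 - (1.0 + n_pathogens / beta) ** (-alpha)

def total_expected_loss(dose, n0=100.0, log_kill_per_unit=0.5,
                        tox_slope=0.002, rel_loss_tox=1.0,
                        rel_loss_microbial=1.0):
    """Relative total loss: microbial illness risk falls with dose,
    toxic-effect risk rises linearly; only relative losses are needed."""
    surviving = n0 * 10.0 ** (-log_kill_per_unit * dose)
    return (rel_loss_microbial * beta_poisson_risk(surviving)
            + rel_loss_tox * min(1.0, tox_slope * dose))

doses = [i * 0.1 for i in range(101)]       # candidate doses 0..10
losses = [total_expected_loss(d) for d in doses]
optimum = doses[losses.index(min(losses))]
print(round(optimum, 1), round(min(losses), 4))
```

Because only the ratio `rel_loss_tox / rel_loss_microbial` affects where the minimum falls, the sketch reflects the paper's point that relative, not absolute, losses suffice.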

18.
The development of catastrophe models in recent years allows for assessment of the flood hazard much more effectively than when the federally run National Flood Insurance Program (NFIP) was created in 1968. We propose and then demonstrate a methodological approach to determine pure premiums based on the entire distribution of possible flood events. We apply hazard, exposure, and vulnerability analyses to a sample of 300,000 single‐family residences in two counties in Texas (Travis and Galveston) using state‐of‐the‐art flood catastrophe models. Even in zones of similar flood risk classification by FEMA there is substantial variation in exposure between coastal and inland flood risk. For instance, homes in the designated moderate‐risk X500/B zones in Galveston are exposed to a flood risk on average 2.5 times greater than residences in X500/B zones in Travis. The results also show very similar average annual loss (corrected for exposure) for a number of residences despite their being in different FEMA flood zones. We also find significant storm‐surge exposure outside of the FEMA designated storm‐surge risk zones. Taken together these findings highlight the importance of a microanalysis of flood exposure. The process of aggregating risk at a flood zone level—as currently undertaken by FEMA—provides a false sense of uniformity. As our analysis indicates, the technology to delineate the flood risks exists today.
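A pure premium based on the entire distribution of possible flood events is, in catastrophe-model terms, the average annual loss over an event set. A minimal sketch with a hypothetical event table (rates and losses invented for illustration):

```python
def pure_premium(event_losses):
    """Pure premium (average annual loss) for one residence from an
    event set: sum of annual occurrence rate x loss per event."""
    return sum(rate * loss for rate, loss in event_losses)

def exceedance_rate(event_losses, threshold):
    """Annual rate of events with loss >= threshold; for small rates
    this approximates the annual exceedance probability."""
    return sum(rate for rate, loss in event_losses if loss >= threshold)

# Toy flood event set for one residence: (annual rate, loss in $)
events = [(0.2, 1_000), (0.04, 20_000), (0.01, 80_000), (0.002, 250_000)]
print(pure_premium(events), exceedance_rate(events, 50_000))
```

Two residences in the same FEMA zone can carry very different event tables, which is precisely the micro-level variation the study documents.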

19.
Risk Analysis, 2018, 38(9): 1921-1943
People's past experiences with a hazard theoretically influence how they approach future risks. Yet, past hazard experience has been conceptualized and measured in wide‐ranging, often simplistic, ways, resulting in mixed findings about its relationship with risk perception. This study develops a scale of past hazard experiences, in the context of tornadoes, that is content and construct valid. A conceptual definition was developed, a set of items was created to measure one's most memorable and multiple tornado experiences, and the measures were evaluated through two surveys of the public who reside in tornado‐prone areas. Four dimensions emerged of people's most memorable experience, reflecting their awareness of the tornado risk that day, their personalization of the risk, the intrusive impacts on them personally, and impacts experienced vicariously through others. Two dimensions emerged of people's multiple experiences, reflecting common types of communication received and negative emotional responses. These six dimensions are novel in that they capture people's experience across the timeline of a hazard as well as intangible experiences that are both direct and indirect. The six tornado experience dimensions were correlated with tornado risk perceptions measured as cognitive‐affective and as perceived probability of consequences. The varied experience–risk perception results suggest that it is important to understand the nuances of these concepts and their relationships. This study provides a foundation for future work to continue explicating past hazard experience, across different risk contexts, and for understanding its effect on risk assessment and responses.

20.
Sensitivity analysis (SA) methods are a valuable tool for identifying critical control points (CCPs), one of the important steps in the hazard analysis and critical control point (HACCP) approach used to ensure safe food. Many SA methods are used across various disciplines, and food safety process risk models pose particular challenges because they are often highly nonlinear, contain thresholds, and have discrete inputs. It is therefore useful to compare and evaluate SA methods by applying them to an example food safety risk model. Ten SA methods were applied to a draft Vibrio parahaemolyticus (Vp) risk assessment model developed by the Food and Drug Administration; the model was modified so that all inputs were independent. Rankings of key inputs from the different methods were compared. Inputs such as water temperature, number of oysters per meal, and the distributional assumption for the unrefrigerated time were the most important, whereas time on water, fraction of pathogenic Vp, and the distributional assumption for the weight of oysters were the least important. Most of the methods gave a similar ranking of key inputs even though they differed in being graphical, mathematical, or statistical, in accounting for individual effects or the joint effect of inputs, and in being model dependent or model independent. A key recommendation is that methods be further compared by application to different and more complex food safety models. Model-independent methods, such as ANOVA, the mutual information index, and scatter plots, are expected to be more robust than the others evaluated.
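The comparison of SA methods can be illustrated on a toy model: two simple methods, a correlation-based index and a crude binned-variance ratio (loosely analogous to the statistical methods evaluated), should agree on ranking a dominant input first. The model and coefficients below are invented for illustration, not the Vp model:

```python
import random
import statistics

def pearson(xs, ys):
    """Sample Pearson correlation between an input and the output."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def binned_variance_index(xs, ys, bins=10):
    """Crude first-order sensitivity index: variance of bin-conditional
    output means, normalized by total output variance."""
    pairs = sorted(zip(xs, ys))
    size = len(pairs) // bins
    means = [statistics.fmean(y for _, y in pairs[i * size:(i + 1) * size])
             for i in range(bins)]
    return statistics.pvariance(means) / statistics.pvariance(ys)

rng = random.Random(42)
n = 2000
x1 = [rng.random() for _ in range(n)]   # dominant input (toy)
x2 = [rng.random() for _ in range(n)]   # minor input (toy)
y = [3.0 * a + 0.2 * b + rng.gauss(0, 0.1) for a, b in zip(x1, x2)]

for name, x in (("x1", x1), ("x2", x2)):
    print(name, round(abs(pearson(x, y)), 3),
          round(binned_variance_index(x, y), 3))
```

Disagreement between such indices grows as thresholds and discrete inputs are added, which is why the abstract recommends testing methods on more complex food safety models.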
