Similar Articles
 20 similar articles found (search time: 31 ms)
1.
Losses due to natural hazard events can be extraordinarily high and difficult to cope with. Therefore, there is considerable interest in estimating the potential impact of current and future extreme events at all scales in as much detail as possible. As hazards typically spread over wider areas, risk assessment must take into account interrelations between regions. Neglecting such interdependencies can lead to a severe underestimation of potential losses, especially for extreme events. This underestimation of extreme risk can lead to the failure of risk management strategies when they are most needed, namely, in times of unprecedented events. In this article, we suggest a methodology to incorporate such interdependencies in risk via the use of copulas. We demonstrate that by coupling losses, dependencies can be incorporated in risk analysis, avoiding the underestimation of risk. Based on maximum discharge data of river basins and stream networks, we present and discuss different ways to couple loss distributions of basins while explicitly incorporating tail dependencies. We distinguish between coupling methods that require river structure data for the analysis and those that do not. For the latter approach we propose a minimax algorithm to choose coupled basin pairs so that the underestimation of risk is avoided and the use of river structure data is not needed. The proposed methodology is especially useful for large-scale analysis and we motivate and apply our method using the case of Romania. The approach can be easily extended to other countries and natural hazards.
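As a rough illustration of the coupling idea, the sketch below joins two basins' annual loss distributions through a survival Clayton copula, one simple way to encode upper-tail dependence. The dependence parameter and the lognormal marginals are invented for illustration and are not taken from the article.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 200_000
theta = 2.0  # Clayton dependence parameter (hypothetical value)

# Marshall-Olkin sampling of a bivariate Clayton copula
v = rng.gamma(shape=1.0 / theta, scale=1.0, size=n)
e = rng.exponential(size=(n, 2))
u = (1.0 + e / v[:, None]) ** (-1.0 / theta)

# Flip to the survival copula so that dependence sits in the UPPER tail,
# which is where joint extreme flood losses matter.
u = 1.0 - u

# Hypothetical marginal loss distributions (EUR million) for two basins
loss_a = stats.lognorm(s=1.0, scale=50.0).ppf(u[:, 0])
loss_b = stats.lognorm(s=1.2, scale=30.0).ppf(u[:, 1])

# Probability that both basins exceed their own 1-in-100 loss level
qa, qb = np.quantile(loss_a, 0.99), np.quantile(loss_b, 0.99)
p_joint = np.mean((loss_a > qa) & (loss_b > qb))
print(f"joint exceedance, coupled:     {p_joint:.4f}")
print(f"joint exceedance, independent: {0.01 * 0.01:.4f}")
```

With tail dependence included, the probability that both basins exceed their 1-in-100 loss in the same year comes out far higher than the 0.0001 implied by independence, which is exactly the underestimation the article warns about.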

2.
The domain of risk analysis is expanded to consider strategic interactions among multiple participants in the management of extreme risk in a system of systems. These risks are fraught with complexity, ambiguity, and uncertainty, which pose challenges in how participants perceive, understand, and manage risk of extreme events. In the case of extreme events affecting a system of systems, cause-and-effect relationships among initiating events and losses may be difficult to ascertain due to interactions of multiple systems and participants (complexity). Moreover, selection of threats, hazards, and consequences on which to focus may be unclear or contentious to participants within multiple interacting systems (ambiguity). Finally, all types of risk, by definition, involve potential losses due to uncertain events (uncertainty). Therefore, risk analysis of extreme events affecting a system of systems should address complex, ambiguous, and uncertain aspects of extreme risk. To accomplish this, a system of systems engineering methodology for risk analysis is proposed as a general approach to address extreme risk in a system of systems. Our contribution is an integrative and adaptive systems methodology to analyze risk such that strategic interactions among multiple participants are considered. A practical application of the system of systems engineering methodology is demonstrated in part by a case study of a maritime infrastructure system of systems interface, namely, the Straits of Malacca and Singapore.

3.
Many attempts are made to assess future changes in extreme weather events due to anthropogenic climate change, but few studies have estimated the potential change in economic losses from such events. Projecting losses is more complex as it requires insight into the change in the weather hazard but also into the exposure and vulnerability of assets. This article discusses the issues involved as well as a framework for projecting future losses, and provides an overview of some state-of-the-art projections. Estimates of changes in losses from cyclones and floods are given, and particular attention is paid to the different approaches and assumptions. All projections show increases in extreme weather losses due to climate change. Flood losses are generally projected to increase more rapidly than losses from tropical and extra-tropical cyclones. However, for the period until the year 2040, the contribution from increasing exposure and value of capital at risk to future losses is likely to be equal to or larger than the contribution from anthropogenic climate change. Given the fact that the occurrence of loss events also varies over time due to natural climate variability, the signal from anthropogenic climate change is likely to be lost among the other causes of changes in risk, at least during the period until 2040. More efforts are needed to arrive at a comprehensive approach that includes quantification of changes in hazard, exposure, and vulnerability, as well as adaptation effects.
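A back-of-the-envelope decomposition can make the exposure-versus-climate comparison concrete. The growth factors below are invented placeholders, not the article's projections, and the attribution ordering is only one of several possible conventions.

```python
# Hypothetical decomposition of a 2040 loss projection into its drivers.
current_expected_loss = 1.0e9   # USD/year today (illustrative)
climate_factor = 1.20           # +20% from anthropogenic hazard change by 2040
exposure_factor = 1.45          # +45% from growth in exposed capital at risk

future_loss = current_expected_loss * climate_factor * exposure_factor
from_exposure = current_expected_loss * (exposure_factor - 1)
from_climate = current_expected_loss * exposure_factor * (climate_factor - 1)

print(f"projected 2040 loss: {future_loss / 1e9:.2f} bn USD/yr")
print(f"added by exposure growth: {from_exposure / 1e9:.2f} bn, "
      f"by climate change: {from_climate / 1e9:.2f} bn")
```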

4.
Disruptive events such as natural disasters, loss or reduction of resources, work stoppages, and emergent conditions have the potential to propagate economic losses across trade networks. In particular, disruptions to the operation of container port activity can be detrimental to international trade and commerce. Risk assessment should anticipate the impact of port operation disruptions with consideration of how priorities change due to uncertain scenarios, and should guide investments that are effective and feasible to implement. Priorities for protective measures and continuity-of-operations planning must consider the economic impact of such disruptions across a variety of scenarios. This article introduces new performance metrics to characterize resiliency in interdependency modeling and also integrates scenario-based methods to measure economic sensitivity to sudden-onset disruptions. The methods are demonstrated on a U.S. port responsible for handling $36.1 billion of cargo annually, and will be useful to port management, private industry supply chain planning, and transportation infrastructure management.
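The article's specific resiliency metrics are not reproduced here, but a minimal sketch of one common performance-curve style measure, applied to a hypothetical port recovery profile, illustrates the kind of quantity involved.

```python
import numpy as np

# Hypothetical daily port throughput (fraction of normal capacity) after a
# sudden-onset disruption on day 0, recovering over roughly a month.
days = np.arange(31)
capacity = np.minimum(1.0, 0.35 + 0.65 * (1 - np.exp(-days / 8.0)))

# One common performance-curve measure: the average fraction of normal
# service retained over the horizon, plus the cumulative service shortfall.
resilience = capacity.mean()            # closer to 1 = more resilient
lost_service = (1.0 - capacity).sum()   # capacity-days lost

print(f"resilience index: {resilience:.2f}")
print(f"service shortfall: {lost_service:.1f} capacity-days")
```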

5.
Due to the concentration of assets in disaster-prone zones, changes in risk landscape and in the intensity of natural events, property losses have increased considerably in recent decades. While measuring these stock damages is common practice in the literature, the assessment of economic ripple effects due to business interruption is still limited and available estimates tend to vary significantly across models. This article focuses on the most popular single-region input–output models for disaster impact evaluation. It starts with the traditional Leontief model and then compares its assumptions and results with more complex methodologies (rebalancing algorithms, the sequential interindustry model, the dynamic inoperability input–output model, and its inventory counterpart). While the estimated losses vary across models, all the figures are based on the same event, the 2007 Chehalis River flood that impacted three rural counties in Washington State. Given that the large majority of floods take place in rural areas, this article gives the practitioner a thorough review of how future events can be assessed and guidance on model selection.
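As a minimal sketch of the traditional Leontief starting point that the article compares against richer models, the snippet below propagates a hypothetical final-demand drop through a made-up three-sector coefficient matrix; all numbers are illustrative, not from the Chehalis River case.

```python
import numpy as np

# Hypothetical 3-sector technical coefficients matrix A (Leontief) for a
# small regional economy: agriculture, manufacturing, services.
A = np.array([
    [0.10, 0.04, 0.02],
    [0.15, 0.20, 0.10],
    [0.08, 0.12, 0.15],
])

# Hypothetical drop in final demand caused by the flood (USD million).
delta_f = np.array([5.0, 12.0, 3.0])

# Traditional Leontief response: total (direct + indirect) output loss.
L = np.linalg.inv(np.eye(3) - A)
delta_x = L @ delta_f

print("total output loss by sector:", np.round(delta_x, 2))
print("ripple multiplier:", round(delta_x.sum() / delta_f.sum(), 2))
```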

6.
Howard Kunreuther. Risk Analysis, 2020, 40(Z1): 2263-2271
In honor of the 40th anniversary of Risk Analysis, this article suggests ways of linking risk assessment and risk perception in developing risk management strategies that have a good chance of being implemented, focusing on the problem of reducing losses from natural hazards in the face of climate change. Following a checklist for developing an implementable risk management strategy, Section 2 highlights the impact that exponential growth of CO2 emissions is likely to have on future disaster losses as assessed by climate and social scientists. Section 3 then discusses how people perceive the risks of low-probability adverse events and the cognitive biases that lead them to underprepare for future losses. Based on this empirical evidence, Section 4 proposes a risk management strategy for reducing future losses using the principles of choice architecture to communicate the likelihood and consequences of disasters, coupled with economic incentives and well-enforced regulations.

7.
As recent events have shown, simultaneous crop losses in different parts of the world can cause serious risks to global food security. However, to date, little is known about the spatial dependency of lower than expected crop yields from global breadbaskets. This especially applies in the case of extreme events, i.e., where one or more breadbaskets are experiencing far below average yields. Without such information, risk management approaches cannot be applied and vulnerability to extremes may remain high or even increase in the future around the world. We tackle both issues from an empirical perspective focusing on wheat yield. Interdependencies between historically observed wheat yield deviations in five breadbaskets (United States, Argentina, India, China, and Australia) are estimated via copula approaches that can incorporate increasing tail dependencies. In doing so, we are able to attach probabilities to interregional as well as global yield losses. To address the robustness of our results, we apply three different methods for constructing multivariate copulas: vine copulas, ordered coupling using a minimax approach, and hierarchical structuring. We found interdependencies between states within breadbaskets that led us to the conclusion that risk pooling for extremes is less favorable on the regional level. However, notwithstanding evidence of global climatic teleconnections that may influence crop production, we also demonstrate empirically that wheat production losses are independent between global breadbaskets, which strengthens the case for interregional risk pooling strategies. We argue that through interregional risk pooling, postdisaster liabilities of governments and international donors could be decreased.
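A simple empirical check of the kind underlying the independence finding can be sketched as follows; the yield-deviation series here are simulated placeholders rather than the historical data used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical detrended wheat-yield deviations (t/ha) for two breadbaskets;
# simulated as independent purely to illustrate the comparison.
us = rng.normal(0.0, 0.30, size=60)
india = rng.normal(0.0, 0.25, size=60)

# Empirical probability that both regions fall below their 20th percentile
# in the same year, versus the value implied by independence (0.2 * 0.2).
bad_us = us < np.quantile(us, 0.2)
bad_india = india < np.quantile(india, 0.2)
p_joint = np.mean(bad_us & bad_india)

print(f"empirical joint shortfall prob.: {p_joint:.3f}")
print(f"independence benchmark:          {0.2 * 0.2:.3f}")
```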

8.
On Modeling Correlated Random Variables in Risk Assessment
Haas, Charles N. Risk Analysis, 1999, 19(6): 1205-1214
Monte Carlo methods in risk assessment are finding increasingly widespread application. With the recognition that inputs may be correlated, the incorporation of such correlations into the simulation has become important. Most implementations rely upon the method of Iman and Conover for generating correlated random variables. In this work, alternative methods using copulas are presented for deriving correlated random variables. It is further shown that the particular algorithm or assumption used may have a substantial effect on the output results, due to differences in higher order bivariate moments.
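One of the copula alternatives discussed, a Gaussian copula, can be sketched in a few lines; the marginals and correlation below are illustrative and not taken from the article.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 100_000
rho = 0.7  # target dependence level (illustrative)

# Gaussian copula: correlated normals -> uniforms -> arbitrary marginals
z = rng.multivariate_normal(mean=[0, 0], cov=[[1, rho], [rho, 1]], size=n)
u = stats.norm.cdf(z)

# Two correlated Monte Carlo inputs with different marginal distributions
x1 = stats.lognorm(s=0.5, scale=2.0).ppf(u[:, 0])
x2 = stats.gamma(a=2.0, scale=1.5).ppf(u[:, 1])

print("Spearman rank correlation:", round(stats.spearmanr(x1, x2)[0], 3))
```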

9.
This article describes the development of a generic loss assessment methodology, which is applicable to earthquake and windstorm perils worldwide. The latest information regarding hazard estimation is first integrated with the parameters that best describe the intensity of the action of both windstorms and earthquakes on building structures, for events with defined average return periods or recurrence intervals. The subsequent evaluation of building vulnerability (damageability) under the action of both earthquake and windstorm loadings utilizes information on damage and loss from past events, along with an assessment of the key building properties (including age and quality of design and construction), to assess information about the ability of buildings to withstand such loadings and hence to assign a building type to the particular risk or portfolio of risks. This predicted damage information is then translated into risk-specific mathematical vulnerability functions, which enable numerical evaluation of the probability of building damage arising at various defined levels. By assigning cost factors to the defined damage levels, the associated computation of total loss at a given level of hazard may be achieved. This developed methodology is universal in the sense that it may be applied successfully to buildings situated in a variety of earthquake and windstorm environments, ranging from very low to extreme levels of hazard. As a loss prediction tool, it enables accurate estimation of losses from potential scenario events linked to defined return periods and, hence, can greatly assist risk assessment and planning.
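The damage-to-loss translation at the end of this chain can be illustrated with a minimal sketch; the damage-state probabilities, cost factors, and replacement value below are hypothetical, not values from the article.

```python
import numpy as np

# Hypothetical vulnerability output for one building type at a given hazard
# intensity: probabilities of reaching discrete damage levels, plus the cost
# factor (fraction of replacement value) assigned to each level.
damage_levels = ["none", "slight", "moderate", "extensive", "collapse"]
p_damage = np.array([0.40, 0.30, 0.18, 0.09, 0.03])   # must sum to 1
cost_factor = np.array([0.00, 0.05, 0.25, 0.60, 1.00])

replacement_value = 2.5e6  # USD, hypothetical

mean_damage_ratio = float(p_damage @ cost_factor)
expected_loss = mean_damage_ratio * replacement_value

print(f"mean damage ratio: {mean_damage_ratio:.3f}")
print(f"expected loss:     {expected_loss:,.0f} USD")
```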

10.
This article addresses the application of ecological risk assessment at the regional scale to the prediction of impacts due to invasive or nonindigenous species (NIS). The first section describes risk assessment and the decision-making process, and introduces regional risk assessment. A general conceptual model for the risk assessment of NIS is then presented based upon the regional risk assessment approach. Two diverse examples of the application of this approach are presented. The first example is based upon the dynamics of plasmids introduced into bacteria populations. The second example is the application of the risk assessment approach to the invasion of a coastal marine site at Cherry Point, Washington, USA, by the European green crab. The lessons learned from the two examples demonstrate that assessment of the risks of invasion by NIS will have to incorporate not only the characteristics of the invasive species, but also the other stresses and impacts affecting the region of interest.

11.
Risk Analysis, 2018, 38(10): 2073-2086
The guidelines for setting environmental quality standards are increasingly based on probabilistic risk assessment due to a growing general awareness of the need for probabilistic procedures. One of the commonly used tools in probabilistic risk assessment is the species sensitivity distribution (SSD), which represents the proportion of species affected belonging to a biological assemblage as a function of exposure to a specific toxicant. Our focus is on the inverse use of the SSD curve with the aim of estimating the concentration, HCp, of a toxic compound that is hazardous to p% of the biological community under study. Toward this end, we propose the use of robust statistical methods in order to take into account the presence of outliers or apparent skew in the data, which may occur without any ecological basis. A robust approach exploits the full neighborhood of a parametric model, enabling the analyst to account for the typical real-world deviations from ideal models. We examine two classic HCp estimation approaches and consider robust versions of these estimators. In addition, we also use data transformations in conjunction with robust estimation methods in case of heteroscedasticity. Different scenarios using real data sets as well as simulated data are presented in order to illustrate and compare the proposed approaches. These scenarios illustrate that the use of robust estimation methods enhances HCp estimation.
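For orientation, the classical (non-robust) HCp calculation that the article's robust estimators improve upon looks roughly as follows; the toxicity endpoints are invented.

```python
import numpy as np
from scipy import stats

# Hypothetical species-level toxicity endpoints (e.g., EC50 values in ug/L)
conc = np.array([12., 30., 45., 60., 85., 120., 200., 310., 500., 900.])

# Classical SSD: fit a log-normal distribution to log-concentrations and
# read HC5 off the inverse CDF (the article's robust estimators replace
# this plain maximum-likelihood fit with outlier-resistant alternatives).
mu, sigma = stats.norm.fit(np.log(conc))
hc5 = np.exp(stats.norm.ppf(0.05, loc=mu, scale=sigma))

print(f"HC5 estimate: {hc5:.1f} ug/L (concentration affecting 5% of species)")
```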

12.
Public perception of risk is being cited as a documented reason to rethink a very contentious congressionally mandated process for siting interim storage and permanent disposal facilities for high-level radioactive waste. Rigorous survey research has shown that the public holds intense, negative images of "nuclear" and "radioactive" technologies, activities, and facilities. Potential host states and opponents claim that these negative images, coupled with an amplification of negative risk events, will potentially stigmatize the area surrounding such facilities and result in significant economic losses. At issue is whether a supporting social amplification of risk model is applicable to communities hosting facilities that are part of the U.S. Department of Energy Nuclear Weapons Complex. An initial assessment of high-profile discrete and cumulative key negative risk events at such nuclear facilities does not validate that there has been stigmatization or substantial social and economic consequences in the host areas. Before any changes to major national policy are implemented, additional research is required to determine if the nearby public's "pragmatic logic," based on practical knowledge and experience, attenuates the link between public opinion and demographic and economic behaviors.

13.
A simple procedure is proposed in order to quantify the tradeoff between a loss suffered from an illness due to exposure to a microbial pathogen and a loss due to a toxic effect, perhaps a different illness, induced by a disinfectant employed to reduce the microbial exposure. Estimates of these two types of risk as a function of disinfectant dose and their associated relative losses provide information for the estimation of the optimum dose of disinfectant that minimizes the total expected loss. The estimates of the optimum dose and expected relative total loss were similar regardless of whether the beta-Poisson, log-logistic, or extreme value function was used to model the risk of illness due to exposure to a microbial pathogen. This is because the optimum dose of the disinfectant and resultant expected minimum loss depend upon the estimated slope (first derivative) of the models at low levels of risk, which appear to be similar for these three models at low levels of risk. Similarly, the choice among these three models does not appear critical for estimating the slope at low levels of risk for the toxic effect induced by the use of a disinfectant. For the proposed procedure to estimate the optimum disinfectant dose, it is not necessary to have absolute values for the losses due to microbial-induced or disinfectant-induced illness, but only relative losses are required. All aspects of the problem are amenable to sensitivity analyses. The issue of risk/benefit tradeoffs, more appropriately called risk/risk tradeoffs, does not appear to be an insurmountable problem.
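A minimal sketch of the tradeoff: microbial risk falls with disinfectant dose (here a beta-Poisson dose-response) while toxic risk rises, and the optimum dose minimizes the total expected relative loss. All parameter values are hypothetical, and only relative losses are needed, as the article notes.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# --- Hypothetical inputs ----------------------------------------------------
D0 = 100.0                 # ingested pathogen dose with no disinfection
k = 1.2                    # log-inactivation rate per unit disinfectant dose
alpha, beta = 0.25, 40.0   # beta-Poisson dose-response parameters
q = 2e-4                   # low-dose slope of the disinfectant toxicity risk
rel_loss_tox = 5.0         # loss from one toxic illness relative to one infection

def microbial_risk(d):
    surviving = D0 * np.exp(-k * d)
    return 1.0 - (1.0 + surviving / beta) ** (-alpha)   # beta-Poisson

def toxic_risk(d):
    return q * d            # approximately linear at low doses (illustrative)

def total_relative_loss(d):
    return microbial_risk(d) + rel_loss_tox * toxic_risk(d)

res = minimize_scalar(total_relative_loss, bounds=(0.0, 20.0), method="bounded")
print(f"optimum dose: {res.x:.2f}, expected relative loss: {res.fun:.4f}")
```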

14.
For several years machine learning methods have been proposed for risk classification. While machine learning methods have also been used for failure diagnosis and condition monitoring, to the best of our knowledge, these methods have not been used for probabilistic risk assessment. Probabilistic risk assessment is a subjective process. The problem of how well machine learning methods can emulate expert judgments is challenging. Expert judgments are based on mental shortcuts, heuristics, which are susceptible to biases. This paper presents a process for developing natural language-based probabilistic risk assessment models, applying deep learning algorithms to emulate experts' quantified risk estimates. This allows the risk analyst to obtain an a priori risk assessment when there is limited information in the form of text and numeric data. Universal sentence embedding (USE) with gradient boosting regression (GBR) trees trained over limited structured data presented the most promising results. When we apply these models' outputs to generate survival distributions for autonomous systems' likelihood of loss with distance, we observe that for open water and ice shelf operating environments, the differences between the survival distributions generated by the machine learning algorithm and those generated by the experts are not statistically significant.
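A heavily simplified sketch of the USE-plus-GBR pipeline reported as most promising is shown below. The mission descriptions and expert risk values are invented placeholders, and the article additionally combines text with structured numeric data, which is omitted here.

```python
import numpy as np
import tensorflow_hub as hub
from sklearn.ensemble import GradientBoostingRegressor

# Universal Sentence Encoder (512-dim embeddings); downloads on first use.
embed = hub.load("https://tfhub.dev/google/universal-sentence-encoder/4")

# Hypothetical expert-elicited training data: a textual mission/environment
# description paired with the expert's quantified risk estimate (0-1).
descriptions = [
    "open water transit, calm seas, full sensor suite available",
    "under-ice survey near the shelf edge, limited acoustic comms",
    "long-range transit with degraded navigation sensors",
    "short harbour trial with surface support vessel on standby",
]
expert_risk = np.array([0.05, 0.40, 0.30, 0.02])

X = embed(descriptions).numpy()
model = GradientBoostingRegressor(n_estimators=200, max_depth=2).fit(X, expert_risk)

new_text = ["ice-shelf operation with intermittent communications"]
print("a priori risk estimate:", float(model.predict(embed(new_text).numpy())[0]))
```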

15.
Currently, there is a trend away from the use of single (often conservative) estimates of risk to summarize the results of risk analyses, in favor of stochastic methods that provide a more complete characterization of risk. The use of such stochastic methods leads to a distribution of possible values of risk, taking into account both uncertainty and variability in all of the factors affecting risk. In this article, we propose a general framework for the analysis of uncertainty and variability for use in the commonly encountered case of multiplicative risk models, in which risk may be expressed as a product of two or more risk factors. Our analytical methods facilitate the evaluation of overall uncertainty and variability in risk assessment, as well as the contributions of individual risk factors to both uncertainty and variability, which is cumbersome to evaluate using Monte Carlo methods. The use of these methods is illustrated in the analysis of potential cancer risks due to the ingestion of radon in drinking water.
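For the special case where all factors are lognormal, the analytical convenience is easy to see: log-variances add, so each factor's share of the total log-variance is immediate. The parameter values below are illustrative, not the article's.

```python
import numpy as np

# Multiplicative risk model with three lognormal factors, each given as a
# (geometric mean, geometric standard deviation) pair; values are hypothetical.
factors = {
    "radon concentration":  (2.0, 2.5),
    "water ingestion rate": (1.5, 1.6),
    "risk per unit intake": (3e-7, 3.0),
}

ln_var = {name: np.log(gsd) ** 2 for name, (gm, gsd) in factors.items()}
total_ln_var = sum(ln_var.values())

# Product of lognormals is lognormal: geometric means multiply, log-variances add.
gm_risk = np.prod([gm for gm, _ in factors.values()])
gsd_risk = np.exp(np.sqrt(total_ln_var))

print(f"geometric mean risk: {gm_risk:.2e}, GSD: {gsd_risk:.2f}")
for name, v in ln_var.items():
    print(f"  {name}: {v / total_ln_var:.0%} of total log-variance")
```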

16.
In this article, we propose an integrated direct and indirect flood risk model for small- and large-scale flood events, allowing for dynamic modeling of total economic losses from a flood event to a full economic recovery. A novel approach is taken that translates direct losses of both capital and labor into production losses using the Cobb-Douglas production function, aiming at improved consistency in loss accounting. The recovery of the economy is modeled using a hybrid input-output model and applied to the port region of Rotterdam, using six different flood events (1/10 up to 1/10,000). This procedure provides better insight into the consequences of both high- and low-probability floods. The results show that in terms of expected annual damage, direct losses remain more substantial relative to the indirect losses (approximately 50% larger), but for low-probability events the indirect losses outweigh the direct losses. Furthermore, we explored parameter uncertainty using a global sensitivity analysis, and varied critical assumptions in the modeling framework related to, among others, flood duration and labor recovery, using a scenario approach. Our findings have two important implications for disaster modelers and practitioners. First, high-probability events are qualitatively different from low-probability events in terms of the scale of damages and full recovery period. Second, there are substantial differences in parameter influence between high-probability and low-probability flood modeling. These findings suggest that a detailed approach is required when assessing the flood risk for a specific region.
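The Cobb-Douglas step that converts direct capital and labor losses into a production loss can be sketched in a few lines; the capital share and loss fractions are hypothetical, not values from the Rotterdam case.

```python
# Hypothetical illustration of the Cobb-Douglas step: translating direct
# losses of capital and labor into a production (output) loss.
alpha = 0.35          # capital share of output (illustrative)
capital_lost = 0.20   # 20% of the sector's capital stock flooded
labor_lost = 0.10     # 10% of labor unavailable (evacuation, disruption)

# Y = A * K**alpha * L**(1 - alpha), so fractional output retained is:
remaining_output = (1 - capital_lost) ** alpha * (1 - labor_lost) ** (1 - alpha)
production_loss = 1 - remaining_output

print(f"production loss: {production_loss:.1%} of pre-flood output")
```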

17.
This article focuses on conceptual and methodological developments allowing the integration of physical and social dynamics leading to model forecasts of circumstance-specific human losses during a flash flood. To reach this objective, a random forest classifier is applied to assess the likelihood of fatality occurrence for a given circumstance as a function of representative indicators. Here, vehicle-related circumstance is chosen as the literature indicates that most fatalities from flash flooding fall in this category. A database of flash flood events, with and without human losses from 2001 to 2011 in the United States, is supplemented with other variables describing the storm event, the spatial distribution of the sensitive characteristics of the exposed population, and built environment at the county level. The catastrophic flash floods of May 2015 in the states of Texas and Oklahoma are used as a case study to map the dynamics of the estimated probabilistic human risk on a daily scale. The results indicate the importance of time- and space-dependent human vulnerability and risk assessment for short-fuse flood events. The need for more systematic human impact data collection is also highlighted to advance impact-based predictive models for flash flood casualties using machine-learning approaches in the future.
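A toy version of the classification step, using scikit-learn's random forest on invented county-level indicators, shows the mechanics; neither the features nor the synthetic labels correspond to the article's database.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(7)

# Hypothetical county-day records: [rainfall intensity (mm/h), share of
# mobile homes, road density, share of population over 65], with a binary
# label for whether a vehicle-related fatality occurred (synthetic).
X = rng.random((500, 4)) * [80, 0.3, 5.0, 0.35]
y = (0.02 * X[:, 0] + 3.0 * X[:, 1] + rng.normal(0, 0.4, 500) > 1.3).astype(int)

clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, y)

# Likelihood of a vehicle-related fatality for one new county-day
new_case = [[55.0, 0.12, 2.1, 0.18]]
print("estimated fatality likelihood:", clf.predict_proba(new_case)[0, 1].round(2))
print("feature importances:", clf.feature_importances_.round(2))
```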

18.
19.
A Survey of Approaches for Assessing and Managing the Risk of Extremes
In this paper, we review methods for assessing and managing the risk of extreme events, where extreme events are defined to be rare, severe, and outside the normal range of experience of the system in question. First, we discuss several systematic approaches for identifying possible extreme events. We then discuss some issues related to risk assessment of extreme events, including what type of output is needed (e.g., a single probability vs. a probability distribution), and alternatives to the probabilistic approach. Next, we present a number of probabilistic methods. These include: guidelines for eliciting informative probability distributions from experts; maximum entropy distributions; extreme value theory; other approaches for constructing prior distributions (such as reference or noninformative priors); the use of modeling and decomposition to estimate the probability (or distribution) of interest; and bounding methods. Finally, we briefly discuss several approaches for managing the risk of extreme events, and conclude with recommendations and directions for future research.
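As one concrete instance of the extreme value theory route mentioned among the probabilistic methods, the sketch below fits a generalized extreme value distribution to simulated annual maxima and reads off a 100-year return level; the data are synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Synthetic annual-maximum observations (e.g., peak discharge, m^3/s);
# in practice these would come from the system's historical record.
annual_max = stats.gumbel_r(loc=500, scale=120).rvs(size=60, random_state=rng)

# Fit a generalized extreme value distribution and read off the
# 100-year return level (the 0.99 quantile of the annual-maximum law).
shape, loc, scale = stats.genextreme.fit(annual_max)
level_100yr = stats.genextreme.ppf(0.99, shape, loc=loc, scale=scale)

print(f"estimated 100-year level: {level_100yr:.0f} m^3/s")
```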

20.
Since the Seveso disaster more than 40 years ago, there has been increasing awareness that similar accident events can occur in a wide range of process establishments, where the handling and production of hazardous substances pose a real threat to society and the environment. In these industrial sites, designated "Seveso sites," the urgent need emerged for an effective strategy to handle hazardous activities and to ensure safe conditions. Since then, the main challenging research issues have focused on how to prevent such accident events and how to mitigate their consequences, leading to the development of many risk assessment methodologies. In recent years, researchers and practitioners have tried to provide useful overviews of the existing risk assessment methodologies by proposing several reviews. However, these reviews are not exhaustive because they are either dated or focus only on one specific topic (e.g., liquefied natural gas, domino effect, etc.). This work aims to overcome the limitations of the current reviews by providing an up-to-date and comprehensive overview of the risk assessment methodologies for handling hazardous substances within the European industry. In particular, we have focused on the current techniques for hazard and accident scenario identification, as well as probability and consequence analyses for both onshore and offshore installations. Thus, we have identified the research streams that have characterized the activities of researchers and practitioners over the years, and we have then presented and discussed the different risk assessment methodologies available with respect to the research stream to which they belong.
