20 related records found.
1.
Preference Functions for Spatial Risk Analysis
When outcomes are defined over a geographic region, measures of spatial risk regarding these outcomes can be more complex than traditional measures of risk. One of the main challenges is the need for a cardinal preference function that incorporates the spatial nature of the outcomes. We explore preference conditions that will yield the existence of spatial measurable value and utility functions, and discuss their application to spatial risk analysis. We also present a simple example on household freshwater usage across regions to demonstrate how such functions can be assessed and applied.
2.
In this study, a methodology is proposed for risk analysis of dust explosion scenarios based on a Bayesian network. The methodology also uses a bow-tie diagram to better represent the logical relationships among the contributing factors and consequences of dust explosions. The risks of dust explosion scenarios are evaluated taking into account common cause failures and dependencies among root events and possible consequences. Using a diagnostic analysis, dust particle properties, oxygen concentration, and safety training of staff are identified as the most critical root events leading to dust explosions. The probability adaptation concept is also used for sequential updating, and thus for learning from past dust explosion accidents, which is of great importance in dynamic risk assessment and management. The proposed methodology is applied to a case study to model dust explosion scenarios, estimate the envisaged risks, and identify the vulnerable parts of the system that need additional safety measures.
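The abstract does not give the network structure, so the following is only a minimal sketch of the diagnostic (posterior) step it describes, using a hypothetical bow-tie-style fault logic with made-up root events and invented prior probabilities rather than the study's actual Bayesian network.

```python
from itertools import product

# Hypothetical priors for binary root events (True = adverse state).
priors = {
    "fine_dust": 0.30,        # dust particle properties favour explosion
    "oxygen_high": 0.60,      # oxygen concentration above the limiting value
    "poor_training": 0.20,    # inadequate safety training of staff
    "ignition": 0.05,         # ignition source present
}

def p_state(state):
    """Joint probability of one assignment, assuming independent root events."""
    p = 1.0
    for name, prior in priors.items():
        p *= prior if state[name] else (1.0 - prior)
    return p

def explosion(state):
    """Illustrative bow-tie logic: an explosible dust cloud (fine dust OR poor
    housekeeping due to poor training) plus sufficient oxygen plus ignition."""
    dust_cloud = state["fine_dust"] or state["poor_training"]
    return dust_cloud and state["oxygen_high"] and state["ignition"]

# Enumerate all root-event combinations (the hypothetical network is tiny).
names = list(priors)
joint = {s: p_state(dict(zip(names, s)))
         for s in product([False, True], repeat=len(names))}

p_top = sum(p for s, p in joint.items() if explosion(dict(zip(names, s))))
print(f"P(explosion) = {p_top:.4f}")

# Diagnostic analysis: posterior probability of each root event given an explosion.
for name in names:
    p_joint = sum(p for s, p in joint.items()
                  if explosion(dict(zip(names, s))) and dict(zip(names, s))[name])
    print(f"P({name} | explosion) = {p_joint / p_top:.3f}")
```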
3.
In a quantitative model with uncertain inputs, the uncertainty of the output can be summarized by a risk measure. We propose a sensitivity analysis method based on derivatives of the output risk measure, in the direction of model inputs. This produces a global sensitivity measure, explicitly linking sensitivity and uncertainty analyses. We focus on the case of distortion risk measures, defined as weighted averages of output percentiles, and prove a representation of the sensitivity measure that can be evaluated on a Monte Carlo sample, as a weighted average of gradients over the input space. When the analytical model is unknown or hard to work with, nonparametric techniques are used for gradient estimation. This process is demonstrated through the example of a nonlinear insurance loss model. Furthermore, the proposed framework is extended in order to measure sensitivity to constant model parameters, uncertain statistical parameters, and random factors driving dependence between model inputs.
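As a rough illustration of "a weighted average of gradients over the input space evaluated on a Monte Carlo sample", the sketch below uses Expected Shortfall (a distortion measure placing weight 1/(1-alpha) on the upper tail) and a toy model with a closed-form gradient. The model, input distributions, and constant-shift perturbation direction are assumptions, not the paper's insurance example.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha = 0.95
n = 200_000

# Toy model Y = g(X1, X2) with a known gradient (not the paper's loss model).
x1 = rng.normal(0.0, 1.0, n)
x2 = rng.lognormal(0.0, 0.5, n)
y = np.exp(0.5 * x1) + x2**2

dy_dx1 = 0.5 * np.exp(0.5 * x1)   # partial derivative with respect to X1
dy_dx2 = 2.0 * x2                 # partial derivative with respect to X2

# Distortion weights for Expected Shortfall: 1/(1-alpha) on the upper tail, 0 below.
var_alpha = np.quantile(y, alpha)
weights = (y >= var_alpha) / (1.0 - alpha)

# Sensitivity = weighted average of gradients over the Monte Carlo sample
# (directional derivative of ES for a small constant shift of each input).
s1 = np.mean(weights * dy_dx1)
s2 = np.mean(weights * dy_dx2)
print(f"ES_{alpha}: {np.mean(weights * y):.3f}, sensitivity to X1: {s1:.3f}, to X2: {s2:.3f}")
```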
4.
Ecological risk from the development of a wetland is assessed quantitatively by means of a new risk measure, expected loss of biodiversity (ELB). ELB is defined as the weighted sum of the increments in the probabilities of extinction of the species living in the wetland due to its loss. The weighting for a particular species is calculated according to the length of the branch on the phylogenetic tree that will be lost if the species becomes extinct. The length of the branch is regarded as reflecting the extent of the species' contribution to the taxonomic diversity of the living world. The increments in the probabilities of extinction are calculated with the simulation used to compile the Red List for vascular plants in Japan. The resulting ELB for the loss of Nakaikemi wetland is 9,200 years. This result is combined with the economic costs of conserving the wetland to produce a value for the indicator "cost per unit of biodiversity saved." Depending on the scenario, the value is 13,000 yen per year-ELB or 110,000 to 420,000 yen per year-ELB (1 US dollar = 110 yen in 1999).
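The ELB definition reduces to simple arithmetic once branch lengths and extinction-probability increments are available. The sketch below uses invented species data purely to show the bookkeeping, including the "cost per unit of biodiversity saved" ratio.

```python
# Hypothetical species data: phylogenetic branch length that would be lost
# (weight) and extinction probabilities with and without the wetland.
species = {
    #             branch_length, P(extinct | wetland kept), P(extinct | wetland lost)
    "species_A": (2.0e6, 0.02, 0.15),
    "species_B": (5.0e5, 0.10, 0.40),
    "species_C": (1.2e6, 0.01, 0.03),
}

# Expected loss of biodiversity: weighted sum of the increments in extinction probability.
elb = sum(branch * (p_lost - p_kept) for branch, p_kept, p_lost in species.values())
print(f"ELB = {elb:,.0f} (branch-length units, e.g. years)")

# Combine with an assumed annual conservation cost to get cost per unit of ELB saved.
annual_cost_yen = 1.0e8   # hypothetical figure, not the study's cost estimate
print(f"Cost per unit of biodiversity saved = {annual_cost_yen / elb:,.0f} yen per year-ELB")
```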
5.
In risk assessment, the moment-independent sensitivity analysis (SA) technique for reducing model uncertainty has attracted a great deal of attention from analysts and practitioners. It aims at measuring the relative importance of an individual input, or a set of inputs, in determining the uncertainty of the model output by looking at the entire distribution range of the model output. In this article, along the lines of Plischke et al., we point out that the original moment-independent SA index (also called the delta index) can also be interpreted as a dependence measure between model output and input variables, and we introduce another moment-independent SA index (called the extended delta index) based on copulas. Nonparametric methods for estimating the delta and extended delta indices are then proposed. Both methods need only a single set of samples to compute all the indices, and thus overcome the "curse of dimensionality." Finally, an analytical test example, a risk assessment model, and the Level E model are employed to compare the delta and extended delta indices and to test the two calculation methods. Results show that the delta and extended delta indices produce the same importance ranking in these three test examples. It is also shown that the two proposed calculation methods dramatically reduce the computational burden.
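A minimal histogram-based sketch of a given-data estimator for the delta index follows; it is not the estimator proposed in the article (which also introduces the copula-based extended index), but it illustrates how a single input/output sample yields all the indices.

```python
import numpy as np

def delta_index(x, y, n_x_bins=20, n_y_bins=50):
    """Given-data estimate of the moment-independent (delta) index:
    0.5 * E_X[ integral |f_Y - f_{Y|X}| dy ], approximated with histograms."""
    y_edges = np.histogram_bin_edges(y, bins=n_y_bins)
    p_y, _ = np.histogram(y, bins=y_edges)
    p_y = p_y / p_y.sum()                    # unconditional bin probabilities

    # Equal-frequency partition of the input sample (no extra model runs needed).
    order = np.argsort(x)
    delta = 0.0
    for idx in np.array_split(order, n_x_bins):
        p_cond, _ = np.histogram(y[idx], bins=y_edges)
        p_cond = p_cond / p_cond.sum()       # conditional bin probabilities
        delta += (len(idx) / len(x)) * 0.5 * np.abs(p_y - p_cond).sum()
    return delta

# Toy example: Y depends strongly on X1 and weakly on X2.
rng = np.random.default_rng(1)
x1, x2 = rng.normal(size=(2, 100_000))
y = x1 + 0.2 * x2**2
print(delta_index(x1, y), delta_index(x2, y))
```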
6.
A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies.
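The abstract names revised Taguchi and modified inverted normal loss functions without giving their parameterizations. The sketch below uses a common inverted-normal form with invented targets, tolerances, and maximum losses to show how per-category losses can be quantified and then integrated into a total expected loss for one scenario; it is not the article's formulation.

```python
import numpy as np

def inverted_normal_loss(y, target, max_loss, gamma):
    """Inverted normal loss: zero at the target, saturating at max_loss for
    large deviations (one common form; all parameters here are assumptions)."""
    return max_loss * (1.0 - np.exp(-((y - target) ** 2) / (2.0 * gamma ** 2)))

rng = np.random.default_rng(2)
# Hypothetical scenario: reactor temperature deviating from a 350 K set point.
temperature = rng.normal(loc=355.0, scale=8.0, size=100_000)

loss_types = {
    # loss type: (maximum loss in $, tolerance parameter gamma in K)
    "production":  (2.0e5, 10.0),
    "asset":       (1.0e6, 25.0),
    "environment": (5.0e5, 30.0),
}

total = 0.0
for name, (max_loss, gamma) in loss_types.items():
    expected = inverted_normal_loss(temperature, 350.0, max_loss, gamma).mean()
    total += expected
    print(f"expected {name} loss: ${expected:,.0f}")
print(f"total expected loss for the scenario: ${total:,.0f}")
```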
7.
Domino effects are low-probability high-consequence accidents causing severe damage to humans, process plants, and the environment. Because domino effects affect large areas and are difficult to control, preventive safety measures have been given priority over mitigative measures. As a result, safety distances and safety inventories have been used as preventive safety measures to reduce the escalation probability of domino effects. However, these safety measures are usually designed considering static accident scenarios. In this study, we show that compared to a static worst-case accident analysis, a dynamic consequence analysis provides a more rational approach for risk assessment and management of domino effects. This study also presents the application of Bayesian networks and conflict analysis to risk-based allocation of chemical inventories to minimize the consequences and thus to reduce the escalation probability. It emphasizes the risk management of chemical inventories as an inherent safety measure, particularly in existing process plants where the applicability of other safety measures such as safety distances is limited.
8.
Safety systems are important components of high-consequence systems that are intended to prevent the unintended operation of the system and thus the potentially significant negative consequences that could result from such an operation. This presentation investigates and illustrates formal procedures for assessing the uncertainty in the probability that a safety system will fail to operate as intended in an accident environment. Probability theory and evidence theory are introduced as possible mathematical structures for the representation of the epistemic uncertainty associated with the performance of safety systems, and a representation of this type is illustrated with a hypothetical safety system involving one weak link and one strong link that is exposed to a high temperature fire environment. Topics considered include (1) the nature of diffuse uncertainty information involving a system and its environment, (2) the conversion of diffuse uncertainty information into the mathematical structures associated with probability theory and evidence theory, and (3) the propagation of these uncertainty structures through a model for a safety system to obtain representations in the context of probability theory and evidence theory of the uncertainty in the probability that the safety system will fail to operate as intended. The results suggest that evidence theory provides a potentially valuable representational tool for the display of the implications of significant epistemic uncertainty in inputs to complex analyses.
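A minimal sketch of evidence-theory propagation for a weak-link/strong-link configuration: focal intervals with basic probability assignments are pushed through a simple failure-probability model to obtain belief and plausibility that a threshold is exceeded. The failure model, distributions, intervals, and masses are all assumptions, not the hypothetical system analyzed in the article.

```python
from math import erf, sqrt

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def p_fail(mu_wl, mu_sl, sd=20.0):
    """Probability that the strong link fails before the weak link (i.e., the
    safety system fails to act), for normally distributed failure temperatures
    with a common, assumed standard deviation."""
    return phi((mu_wl - mu_sl) / sqrt(2.0 * sd**2))

# Hypothetical evidence: basic probability assignments over intervals for the
# mean failure temperatures (deg C) of the weak and strong links.
wl_focal = [((480.0, 520.0), 0.6), ((460.0, 540.0), 0.4)]
sl_focal = [((640.0, 700.0), 0.5), ((600.0, 720.0), 0.5)]

threshold = 1.0e-3
belief = plausibility = 0.0
for (wl_lo, wl_hi), m_wl in wl_focal:
    for (sl_lo, sl_hi), m_sl in sl_focal:
        mass = m_wl * m_sl
        # p_fail is increasing in mu_wl and decreasing in mu_sl, so the
        # extremes over the focal cell sit at opposite corners.
        p_min = p_fail(wl_lo, sl_hi)
        p_max = p_fail(wl_hi, sl_lo)
        if p_min > threshold:
            belief += mass          # the whole cell implies exceedance
        if p_max > threshold:
            plausibility += mass    # the cell is consistent with exceedance
print(f"Bel(p_fail > {threshold}) = {belief:.2f}, Pl = {plausibility:.2f}")
```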
9.
Willful attacks or natural disasters pose extreme risks to sectors of the economy. An extreme-event analysis extension is proposed for the Inoperability Input-Output Model (IIM) and the Dynamic IIM (DIIM), which are analytical methodologies for assessing the propagated consequences of initial disruptions to a set of sectors. The article discusses two major risk categories that the economy typically experiences following extreme events: (i) significant changes in consumption patterns due to lingering public fear and (ii) adjustments to the production outputs of the interdependent economic sectors that are necessary to match prevailing consumption levels during the recovery period. Probability distributions associated with changes in the consumption of directly affected sectors are generated based on trends, forecasts, and expert evidence to assess the expected losses of the economy. Analytical formulations are derived to quantify the extreme risks associated with a set of initially affected sectors. In addition, Monte Carlo simulation is used to handle the more complex calculations required for a larger set of sectors and general types of probability distributions. A two-sector example is provided at the end of the article to illustrate the proposed extreme risk model formulations.
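A minimal sketch of the static inoperability input-output calculation, q = (I - A*)^-1 c*, for a two-sector economy with an invented interdependency matrix and demand perturbation; the article's extreme-event extension places probability distributions over these perturbations, which is omitted here.

```python
import numpy as np

# Hypothetical two-sector interdependency matrix A* and as-planned
# production levels x (in $ billions per year).
a_star = np.array([[0.10, 0.30],
                   [0.25, 0.05]])
x_planned = np.array([50.0, 80.0])

# Demand-side perturbation c*: fractional drop in final demand for each
# sector right after the extreme event (assumed values).
c_star = np.array([0.08, 0.02])

# Static inoperability input-output model: q = (I - A*)^-1 c*.
q = np.linalg.solve(np.eye(2) - a_star, c_star)

# Economic loss over an assumed one-year recovery window, proportional to
# inoperability times as-planned output.
loss = x_planned * q
print("inoperability:", np.round(q, 4))
print("economic loss ($B):", np.round(loss, 2))
```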
10.
Sensitivity Analysis of a Two-Dimensional Probabilistic Risk Assessment Model Using Analysis of Variance
This article demonstrates application of sensitivity analysis to risk assessment models with two-dimensional probabilistic frameworks that distinguish between variability and uncertainty. A microbial food safety process risk (MFSPR) model is used as a test bed. The process of identifying key controllable inputs and key sources of uncertainty using sensitivity analysis is challenged by typical characteristics of MFSPR models such as nonlinearity, thresholds, interactions, and categorical inputs. Among many available sensitivity analysis methods, analysis of variance (ANOVA) is evaluated in comparison to commonly used methods based on correlation coefficients. In a two-dimensional risk model, the identification of key controllable inputs that can be prioritized with respect to risk management is confounded by uncertainty. However, as shown here, ANOVA provided robust insights regarding controllable inputs most likely to lead to effective risk reduction despite uncertainty. ANOVA appropriately selected the top six important inputs, while correlation-based methods provided misleading insights. Bootstrap simulation is used to quantify uncertainty in ranks of inputs due to sampling error. For the selected sample size, differences in F values of 60% or more were associated with clear differences in rank order between inputs. Sensitivity analysis results identified inputs related to the storage of ground beef servings at home as the most important. Risk management recommendations are suggested in the form of a consumer advisory for better handling and storage practices.
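A minimal sketch of ANOVA-based sensitivity ranking: continuous inputs are discretized into equal-frequency levels and ranked by the one-way F statistic. The toy risk model below is an assumption that merely mimics nonlinearity, a threshold, and an interaction; it is not the MFSPR model, and the bootstrap step on rank uncertainty is omitted.

```python
import numpy as np
from scipy.stats import f_oneway

def anova_ranking(inputs, output, n_levels=5):
    """Rank model inputs by a one-way ANOVA F statistic, after discretizing
    each (possibly continuous) input into equal-frequency levels."""
    scores = {}
    for name, x in inputs.items():
        edges = np.quantile(x, np.linspace(0, 1, n_levels + 1))
        levels = np.digitize(x, edges[1:-1])            # level index 0..n_levels-1
        groups = [output[levels == k] for k in range(n_levels)]
        scores[name] = f_oneway(*groups).statistic
    return dict(sorted(scores.items(), key=lambda kv: -kv[1]))

# Toy nonlinear model with a threshold and an interaction, loosely mimicking
# behaviour found in microbial food safety risk models.
rng = np.random.default_rng(3)
storage_temp = rng.normal(7.0, 3.0, 50_000)       # deg C
storage_time = rng.uniform(0.0, 72.0, 50_000)     # hours
cook_temp = rng.normal(68.0, 5.0, 50_000)         # deg C
growth = np.where(storage_temp > 5.0, 0.05 * (storage_temp - 5.0) * storage_time, 0.0)
risk = growth * np.exp(-0.3 * np.maximum(cook_temp - 60.0, 0.0))

print(anova_ranking({"storage_temp": storage_temp,
                     "storage_time": storage_time,
                     "cook_temp": cook_temp}, risk))
```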
11.
Johnson Holt, Adrian W. Leach, Gritta Schrader, Françoise Petter, Alan MacLeod, Dirk Jan van der Gaag, Richard H. A. Baker, John D. Mumford. Risk Analysis, 2014, 34(1): 4-16
Utility functions in the form of tables or matrices have often been used to combine discretely rated decision-making criteria. Matrix elements are usually specified individually, so no one rule or principle can be easily stated for the utility function as a whole. A series of five matrices are presented that aggregate criteria two at a time using simple rules that express a varying degree of constraint of the lower rating over the higher. A further nine possible matrices were obtained by using a different rule either side of the main axis of the matrix to describe situations where the criteria have a differential influence on the outcome. Uncertainties in the criteria are represented by three alternative frequency distributions from which the assessors select the most appropriate. The output of the utility function is a distribution of rating frequencies that is dependent on the distributions of the input criteria. In pest risk analysis (PRA), seven of these utility functions were required to mimic the logic by which assessors for the European and Mediterranean Plant Protection Organization arrive at an overall rating of pest risk. The framework enables the development of PRAs that are consistent and easy to understand, criticize, compare, and change. When tested in workshops, PRA practitioners thought that the approach accorded with both the logic and the level of resolution that they used in the risk assessments.
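A minimal sketch of pushing two discretely rated criteria, each with a rating-frequency distribution, through a pairwise combination rule; the two rules and the input distributions below are invented for illustration and are not the EPPO matrices described in the article.

```python
import numpy as np

RATINGS = [1, 2, 3, 4, 5]

def combine(dist_a, dist_b, rule):
    """Push two independent rating-frequency distributions through a pairwise
    combination rule, returning the output rating distribution."""
    out = np.zeros(len(RATINGS))
    for i, a in enumerate(RATINGS):
        for j, b in enumerate(RATINGS):
            out[RATINGS.index(rule(a, b))] += dist_a[i] * dist_b[j]
    return out

def strong_constraint(a, b):
    """The lower rating fully constrains the combined rating."""
    return min(a, b)

def moderate_constraint(a, b):
    """An intermediate rule: rounded-down average of the two ratings."""
    return (a + b) // 2

# Hypothetical rating uncertainty for two criteria (e.g., likelihood of entry
# and likelihood of establishment), as frequencies over ratings 1-5.
entry = np.array([0.0, 0.1, 0.6, 0.3, 0.0])
establishment = np.array([0.0, 0.0, 0.2, 0.5, 0.3])

print("min rule:    ", np.round(combine(entry, establishment, strong_constraint), 3))
print("average rule:", np.round(combine(entry, establishment, moderate_constraint), 3))
```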
12.
Using Risk Assessment, Benefit-Cost Analysis, and Real Options to Implement a Precautionary Principle
Scott Farrow. Risk Analysis, 2004, 24(3): 727-735
13.
Cost-benefit analysis (CBA) is commonly applied as a tool for deciding on risk protection. With CBA, one can identify risk mitigation strategies that lead to an optimal tradeoff between the costs of the mitigation measures and the achieved risk reduction. In practical applications of CBA, the strategies are typically evaluated through efficiency indicators such as the benefit-cost ratio (BCR) and the marginal cost (MC) criterion. In many of these applications, the BCR is not consistently defined, which, as we demonstrate in this article, can lead to the identification of suboptimal solutions. This is of particular relevance when the overall budget for risk reduction measures is limited and an optimal allocation of resources among different subsystems is necessary. We show that this problem can be formulated as a hierarchical decision problem, where the general rules and decisions on the available budget are made at a central level (e.g., central government agency, top management), whereas the decisions on the specific measures are made at the subsystem level (e.g., local communities, company division). It is shown that the MC criterion provides optimal solutions in such hierarchical optimization. Since most practical applications only include a discrete set of possible risk protection measures, the MC criterion is extended to this situation. The findings are illustrated through a hypothetical numerical example. This study was prepared as part of our work on the optimal management of natural hazard risks, but its conclusions also apply to other fields of risk management.
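A minimal sketch of budget allocation by the marginal cost (MC) criterion over discrete measures: each subsystem's options are reduced to incremental steps, and a central budget funds increments in order of ascending marginal cost. All costs, risk reductions, and the budget are invented, and the example assumes marginal costs increase along each subsystem's frontier (true for the numbers used here).

```python
# Hypothetical subsystems, each with mutually exclusive protection options:
# (cost in $M, annual risk reduction in expected fatalities avoided).
options = {
    "community_A": [(0.0, 0.000), (2.0, 0.030), (5.0, 0.045)],
    "community_B": [(0.0, 0.000), (1.0, 0.020), (4.0, 0.050)],
    "community_C": [(0.0, 0.000), (3.0, 0.015)],
}

def increments(opts):
    """Incremental (cost, risk reduction) steps along the cost-sorted,
    non-dominated sequence of options for one subsystem."""
    opts = sorted(opts)
    steps, (c_prev, r_prev) = [], opts[0]
    for c, r in opts[1:]:
        if r > r_prev:                       # skip dominated options
            steps.append((c - c_prev, r - r_prev))
            c_prev, r_prev = c, r
    return steps

# Central decision: fund increments in order of ascending marginal cost
# (cost per unit of risk reduction) until the overall budget is spent.
budget = 7.0
queue = sorted(
    ((dc / dr, name, dc, dr) for name, opts in options.items() for dc, dr in increments(opts)),
    key=lambda t: t[0],
)
spent = reduction = 0.0
for mc, name, dc, dr in queue:
    if spent + dc <= budget:
        spent += dc
        reduction += dr
        print(f"fund {name}: +${dc}M at marginal cost {mc:.0f} $M per fatality avoided")
print(f"total spent ${spent}M, total risk reduction {reduction:.3f} fatalities/yr")
```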
14.
This article proposes, develops, and illustrates the application of level-k game theory to adversarial risk analysis. Level-k reasoning, which assumes that players play strategically but have bounded rationality, is useful for operationalizing a Bayesian approach to adversarial risk analysis. It can be applied in a broad class of settings, including settings with asynchronous play and partial but incomplete revelation of early moves. Its computational and elicitation requirements are modest. We illustrate the approach with an application to a simple defend-attack model in which the defender's countermeasures are revealed with a probability less than one to the attacker before he decides on how or whether to attack.
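A minimal sketch of level-k reasoning in a defend-attack setting where the defense is revealed to the attacker with probability less than one. The payoffs, the reveal probability, and the level-0 anchors (a coin-flip attacker and a do-nothing defender) are assumptions for illustration, not the article's model.

```python
import numpy as np

# Hypothetical payoffs indexed by [defense][attack], with actions 0 = no, 1 = yes.
U_D = np.array([[  0.0, -100.0],    # no countermeasure
                [-10.0,  -30.0]])   # countermeasure installed
U_A = np.array([[0.0,  80.0],
                [0.0, -20.0]])
REVEAL_PROB = 0.6   # probability the attacker observes the defense before acting

def attack_prob(level, observed_defense):
    """Probability that a level-k attacker attacks. Level 0 is non-strategic
    (attacks half the time); higher levels best-respond either to the observed
    defense or to their model of a level-(k-1) defender."""
    if level == 0:
        return 0.5
    d = observed_defense if observed_defense is not None else defend_choice(level - 1)
    return float(U_A[d, 1] > U_A[d, 0])

def defend_choice(level):
    """A level-k defender best-responds to a level-(k-1) attacker; level 0 is
    the non-strategic default of doing nothing."""
    if level == 0:
        return 0
    best_d, best_eu = 0, -np.inf
    for d in (0, 1):
        eu = 0.0
        for observed, w in ((d, REVEAL_PROB), (None, 1.0 - REVEAL_PROB)):
            p = attack_prob(level - 1, observed)
            eu += w * (p * U_D[d, 1] + (1.0 - p) * U_D[d, 0])
        if eu > best_eu:
            best_d, best_eu = d, eu
    return best_d

for k in range(1, 4):
    print(f"level-{k} defender installs countermeasure: {bool(defend_choice(k))}")
```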
15.
This article presents a framework for using probabilistic terrorism risk modeling in regulatory analysis. We demonstrate the framework with an example application involving a regulation under consideration, the Western Hemisphere Travel Initiative for the Land Environment (WHTI-L). First, we estimate annualized loss from terrorist attacks with the Risk Management Solutions (RMS) Probabilistic Terrorism Model. We then estimate the critical risk reduction, which is the risk-reducing effectiveness of WHTI-L needed for its benefit, in terms of reduced terrorism loss in the United States, to exceed its cost. Our analysis indicates that the critical risk reduction depends strongly not only on uncertainties in the terrorism risk level, but also on uncertainty in the cost of regulation and how casualties are monetized. For a terrorism risk level based on the RMS standard risk estimate, the baseline regulatory cost estimate for WHTI-L, and a range of casualty cost estimates based on the willingness-to-pay approach, our estimate for the expected annualized loss from terrorism ranges from $2.7 billion to $5.2 billion. For this range in annualized loss, the critical risk reduction for WHTI-L ranges from 7% to 13%. Basing results on a lower risk level that halves the annualized terrorism loss would double the critical risk reduction (14-26%), and basing the results on a higher risk level that doubles the annualized terrorism loss would cut the critical risk reduction in half (3.5-6.6%). Ideally, decisions about terrorism security regulations and policies would be informed by true benefit-cost analyses in which the estimated benefits are compared to costs. Such analyses for terrorism security efforts face substantial impediments stemming from the great uncertainty in the terrorist threat and the very low recurrence interval for large attacks. Several approaches can be used to estimate how a terrorism security program or regulation reduces the distribution of risks it is intended to manage. But continued research to develop additional tools and data is necessary to support application of these approaches. These include refinement of models and simulations, engagement of subject matter experts, implementation of program evaluation, and estimation of the costs of casualties from terrorism events.
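The critical risk reduction described above is simple arithmetic: the fraction by which the regulation must reduce expected annualized loss so that benefits equal costs. The sketch below uses the loss range quoted in the abstract but a placeholder annualized cost chosen only for illustration, not the study's baseline WHTI-L cost estimate.

```python
# Critical risk reduction: the fraction by which a regulation must reduce
# expected annualized terrorism loss for its benefit to equal its cost.
annualized_cost = 0.35e9          # hypothetical annualized regulatory cost ($)

for label, annualized_loss in [("low loss estimate", 2.7e9),
                               ("high loss estimate", 5.2e9)]:
    critical = annualized_cost / annualized_loss
    print(f"{label}: critical risk reduction = {critical:.1%}")
    # Halving the assumed loss doubles the required effectiveness; doubling halves it.
    print(f"  at half the loss: {annualized_cost / (annualized_loss / 2):.1%}, "
          f"at double the loss: {annualized_cost / (annualized_loss * 2):.1%}")
```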
16.
Fermín Mallor, Carmen García-Olaverri, Sagrario Gómez-Elvira, Pedro Mateo-Collazas. Risk Analysis, 2008, 28(4): 1003-1019
In this article, we present a methodology to assess the risk incurred by a participant in an activity involving danger of injury. The lack of high-quality historical data for the case considered prevented us from constructing a sufficiently detailed statistical model. It was therefore decided to generate a risk assessment model based on expert judgment. The methodology is illustrated in a real case context: the assessment of risk to participants in a San Fermin bull-run in Pamplona (Spain). The members of the panel of experts on the bull-run represented very different perspectives on the phenomenon: runners, surgeons and other health care personnel, journalists, civil defense workers, security staff, organizers, herdsmen, authors of books on the bull-run, etc. We consulted 55 experts. Our methodology includes the design of a survey instrument to elicit the experts' views and the statistical and mathematical procedures used to aggregate their subjective opinions.
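The article's elicitation and aggregation procedure is not reproduced here; the sketch below shows one common aggregation choice, a weighted linear opinion pool over experts' subjective probabilities, with invented judgments and weights.

```python
import numpy as np

# Hypothetical elicited judgments: each expert's subjective probability that a
# runner suffers an injury requiring medical attention in one bull-run, plus a
# weight reflecting, e.g., self-rated expertise or calibration (assumptions).
expert_probs = np.array([0.012, 0.030, 0.008, 0.020, 0.015, 0.040, 0.010])
weights = np.array([1.0, 0.5, 1.0, 1.5, 1.0, 0.5, 1.0])

# Linear opinion pool: weighted average of the experts' probabilities.
pooled = np.average(expert_probs, weights=weights)

# A simple spread measure to report alongside the pooled value.
q25, q75 = np.percentile(expert_probs, [25, 75])
print(f"pooled injury probability per run: {pooled:.3f} "
      f"(interquartile range of expert views: {q25:.3f}-{q75:.3f})")
```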
17.
Probabilistic seismic risk analysis is a well-established method in the insurance industry for modeling portfolio losses from earthquake events. In this context, precise exposure locations are often unknown. However, location uncertainty has so far received relatively little research attention. In this article, we propose a novel framework for the treatment of location uncertainty. As a case study, a large number of synthetic portfolios resembling typical real-world cases were created. We investigate the effect of portfolio characteristics such as value distribution, portfolio size, or the proportion of risk items with unknown coordinates on the variability of loss frequency estimations. The results indicate that due to loss aggregation effects and spatial hazard variability, location uncertainty in isolation and in conjunction with ground motion uncertainty can induce significant variability in probabilistic loss results, especially for portfolios with a small number of risks. After quantifying its effect, we conclude that location uncertainty should not be neglected when assessing probabilistic seismic risk; it should be treated stochastically, and the resulting variability should be visualized and interpreted carefully.
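A minimal sketch of treating location uncertainty stochastically: risks with unknown coordinates are resampled within their admissible region and the scenario loss is recomputed, using a made-up distance-decay damage model in place of a real ground-motion and vulnerability model; portfolio values, region, and event are all invented.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical portfolio: some risks have known coordinates, others are only
# known to lie somewhere in a 100 x 100 km region (location uncertainty).
n_known, n_unknown = 40, 10
known_xy = rng.uniform(0, 100, (n_known, 2))
values = rng.lognormal(13, 1, n_known + n_unknown)     # insured values

epicentre = np.array([30.0, 60.0])                     # one scenario event

def event_loss(xy_unknown):
    """Portfolio loss for one realisation of the unknown locations, using a
    made-up distance-decay mean damage ratio."""
    xy = np.vstack([known_xy, xy_unknown])
    dist = np.linalg.norm(xy - epicentre, axis=1)
    mdr = 0.4 * np.exp(-dist / 25.0)                   # mean damage ratio vs. distance
    return np.sum(values * mdr)

# Resample the unknown coordinates repeatedly to see how much variability
# location uncertainty alone induces in the scenario loss.
losses = np.array([event_loss(rng.uniform(0, 100, (n_unknown, 2))) for _ in range(2000)])
print("scenario loss percentiles (5th, 50th, 95th):",
      np.round(np.percentile(losses, [5, 50, 95]), 0))
```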
18.
19.
Influenza pandemic is a serious disaster that can pose significant disruptions to the workforce and associated economic sectors. This article examines the impact of influenza pandemic on workforce availability within an interdependent set of economic sectors. We introduce a simulation model based on the dynamic input-output model to capture the propagation of pandemic consequences through the National Capital Region (NCR). The analysis conducted in this article is based on the 2009 H1N1 pandemic data. Two metrics were used to assess the impacts of the influenza pandemic on the economic sectors: (i) inoperability, which measures the percentage gap between the as-planned output and the actual output of a sector, and (ii) economic loss, which quantifies the associated monetary value of the degraded output. The inoperability and economic loss metrics generate two different rankings of the critical economic sectors. Results show that most of the critical sectors in terms of inoperability are sectors that are related to hospitals and health-care providers. On the other hand, most of the sectors that are critically ranked in terms of economic loss are sectors with significant total production outputs in the NCR such as federal government agencies. Therefore, policy recommendations relating to potential mitigation and recovery strategies should take into account the balance between the inoperability and economic loss metrics.
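A minimal sketch of the dynamic inoperability input-output recursion, q(t+1) = q(t) + K[A* q(t) + c*(t) - q(t)], with the two metrics described above (peak inoperability and cumulative economic loss). The sectors, matrices, resilience coefficients, and pandemic perturbation profile are invented, not the NCR data.

```python
import numpy as np

# Hypothetical three-sector dynamic IIM: hospitals, government, retail.
sectors = ["hospitals", "government", "retail"]
a_star = np.array([[0.05, 0.10, 0.05],
                   [0.10, 0.05, 0.10],
                   [0.05, 0.15, 0.05]])
x_daily = np.array([0.8, 5.0, 2.0])          # as-planned output, $B per day
k = np.diag([0.20, 0.10, 0.15])              # sector resilience coefficients

# Workforce unavailability during a 60-day pandemic wave drives the
# perturbation c*(t) (assumed profile).
T = 120
c = np.zeros((T, 3))
c[:60] = np.array([0.15, 0.05, 0.10]) * np.sin(np.linspace(0, np.pi, 60))[:, None]

q = np.zeros(3)
peak_q = np.zeros(3)
loss = np.zeros(3)
for t in range(T):
    # Dynamic inoperability input-output recursion.
    q = q + k @ (a_star @ q + c[t] - q)
    peak_q = np.maximum(peak_q, q)
    loss += x_daily * q                      # degraded output accumulates as economic loss

for name, pq, l in zip(sectors, peak_q, loss):
    print(f"{name:11s} peak inoperability {pq:.3f}, economic loss ${l:.2f}B")
```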
20.
William J. Cronin IV, Eric J. Oswald, Michael L. Shelley, Jeffrey W. Fisher, Carlyle D. Flemming. Risk Analysis, 1995, 15(5): 555-565
A Monte Carlo simulation is incorporated into a risk assessment for trichloroethylene (TCE) using physiologically-based pharmacokinetic (PBPK) modeling coupled with the linearized multistage model to derive human carcinogenic risk extrapolations. The Monte Carlo technique incorporates physiological parameter variability to produce a statistically derived range of risk estimates, which quantifies specific uncertainties associated with PBPK risk assessment approaches. Both inhalation and ingestion exposure routes are addressed. Simulated exposure scenarios were consistent with those used by the Environmental Protection Agency (EPA) in their TCE risk assessment. Mean values of physiological parameters were gathered from the literature for both mice (the carcinogenic bioassay subjects) and humans. Realistic physiological value distributions were assumed using existing data on variability. Mouse cancer bioassay data were correlated to total TCE metabolized and area-under-the-curve (blood concentration) trichloroacetic acid (TCA) as determined by a mouse PBPK model. These internal dose metrics were used in a linearized multistage model analysis to determine dose metric values corresponding to a 10^-6 lifetime excess cancer risk. Using a human PBPK model, these metabolized doses were then extrapolated to equivalent human exposures (inhalation and ingestion). The Monte Carlo iterations with varying mouse and human physiological parameters produced a range of human exposure concentrations producing a 10^-6 risk.
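The full PBPK model has many compartments; the sketch below replaces it with a crude one-compartment surrogate purely to show the Monte Carlo structure: sample physiological parameters from assumed distributions, compute an internal dose metric, and back-calculate the exposure concentration whose internal dose matches a target associated with a 10^-6 risk. All parameter values and the dose-metric target are placeholders, not the published ones.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 10_000

# Hypothetical internal dose metric producing a 1e-6 lifetime excess risk
# (mg TCE metabolized per kg-day); in the study this comes from the mouse
# PBPK model combined with the linearized multistage model.
target_metabolized = 1.0e-3

# Sample human physiological/metabolic parameters from assumed distributions
# (means and spreads are placeholders).
body_weight = rng.normal(70.0, 10.0, n)            # kg
alveolar_vent = rng.normal(400.0, 60.0, n)         # L/h
vmax = rng.lognormal(np.log(5.0), 0.3, n)          # mg/h, metabolic capacity
km = rng.lognormal(np.log(1.5), 0.3, n)            # mg/L, affinity constant
blood_air = rng.normal(9.0, 1.5, n)                # blood:air partition coefficient

def metabolized_per_kg_day(c_air):
    """Very rough steady-state surrogate for daily TCE metabolized per kg at a
    given air concentration (one-compartment stand-in for the PBPK model)."""
    c_blood = c_air * blood_air                     # equilibrium blood concentration
    rate = vmax * c_blood / (km + c_blood)          # Michaelis-Menten metabolism, mg/h
    uptake_limit = alveolar_vent * c_air            # cannot metabolize more than is inhaled
    return 24.0 * np.minimum(rate, uptake_limit) / body_weight

# Back-calculate, per Monte Carlo iterate, the air concentration whose internal
# dose equals the target, by bisection on the monotone surrogate.
lo, hi = np.full(n, 1e-9), np.full(n, 10.0)
for _ in range(60):
    mid = 0.5 * (lo + hi)
    too_high = metabolized_per_kg_day(mid) > target_metabolized
    hi = np.where(too_high, mid, hi)
    lo = np.where(too_high, lo, mid)

conc = 0.5 * (lo + hi)   # mg/L air giving the target internal dose
print("exposure concentration at 1e-6 risk (mg/L air), 5th/50th/95th percentiles:",
      np.round(np.percentile(conc, [5, 50, 95]), 5))
```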