Similar Documents
20 similar documents found (search time: 15 ms)
1.
Worldwide data on terrorist incidents between 1968 and 2004 gathered by the RAND Corporation and the Oklahoma City National Memorial Institute for the Prevention of Terrorism (MIPT) were assessed for patterns and trends in morbidity/mortality. Adjusted data analyzed involve a total of 19,828 events, 7,401 "adverse" events (each causing ≥ 1 victim), and 86,568 "casualties" (injuries), of which 25,408 were fatal. Most terror-related adverse events, casualties, and deaths involved bombs and guns. Weapon-specific patterns and terror-related risk levels in Israel (IS) have differed markedly from those of all other regions combined (OR). IS had a fatal fraction of casualties about half that of OR, but has experienced relatively constant lifetime terror-related casualty risks on the order of 0.5%--a level two to three orders of magnitude higher than in OR, where risks increased approximately 100-fold over the same period. Individual event fatality has increased steadily, with the median increasing from 14% to 50%. Lorenz curves obtained indicate substantial dispersion among victim/event rates: about half of all victims were caused by the top 2.5% (or 10%) of harm-ranked events in OR (or IS). Extreme values of victim/event rates were approximated fairly well by generalized Pareto models (typically used to fit data on forest fires, sea levels, earthquakes, etc.). These results were in turn used to forecast maximum OR- and IS-specific victim/event rates through 2080, illustrating empirically based methods that could be applied to improve strategies to assess, prevent, and manage terror-related risks and consequences.
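The abstract's tail modeling can be illustrated with a short peaks-over-threshold sketch: a generalized Pareto distribution is fit to exceedances of a victims-per-event variable. The data, the 95% threshold, and the reported quantile are hypothetical stand-ins, not the authors' RAND/MIPT analysis.

```python
# Sketch: fit a generalized Pareto distribution (GPD) to the upper tail of
# victims-per-event data, peaks-over-threshold style. The data below are
# synthetic stand-ins, not the RAND/MIPT records.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
victims_per_event = rng.pareto(a=1.5, size=5_000) * 3   # hypothetical heavy-tailed counts

threshold = np.quantile(victims_per_event, 0.95)         # keep the worst ~5% of events
exceedances = victims_per_event[victims_per_event > threshold] - threshold

# Fit the GPD to the exceedances (location fixed at 0).
shape, loc, scale = stats.genpareto.fit(exceedances, floc=0)

# Level exceeded by only 1% of the events that lie above the threshold.
q99 = threshold + stats.genpareto.ppf(0.99, shape, loc=0, scale=scale)
print(f"GPD shape={shape:.2f}, scale={scale:.2f}, extreme tail level ~ {q99:.0f}")
```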

2.
Matthew Revie, Risk Analysis, 2011, 31(7): 1120-1132
Traditional statistical procedures for estimating the probability of an event result in an estimate of zero when no events are realized. Alternative inferential procedures have been proposed for the situation where zero events have been realized, but these are often ad hoc, relying on selecting methods dependent on the data that have been realized. Such data-dependent inference decisions violate fundamental statistical principles, resulting in estimation procedures whose benefits are difficult to assess. In this article, we propose estimating the probability of an event occurring through minimax inference on the probability that future samples of equal size realize no more events than that in the data on which the inference is based. Although motivated by inference on rare events, the method is not restricted to zero-event data and closely approximates the maximum likelihood estimate (MLE) for nonzero data. The use of the minimax procedure provides a risk-averse inferential procedure when no events are realized. A comparison is made with the MLE, and regions of the underlying probability are identified where this approach is superior. Moreover, a comparison is made with three standard approaches to supporting inference where no event data are realized, which we argue are unduly pessimistic. We show that for situations of zero events the estimator can be simply approximated with , where n is the number of trials.
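For context on the zero-event problem described above, here is a minimal sketch of three standard point estimates for k events in n trials (the MLE, the "rule of three" upper bound, and the Jeffreys posterior mean). These are common comparison points only; the article's own minimax approximation is not reproduced here.

```python
# Sketch: standard point estimates for an event probability when k events are
# observed in n trials, including the zero-event case. These are common
# comparison estimators, not the minimax estimator proposed in the article.
def mle(k: int, n: int) -> float:
    return k / n                      # equals 0 when no events are realized

def rule_of_three_upper(n: int) -> float:
    return 3.0 / n                    # approximate 95% upper bound when k = 0

def jeffreys_mean(k: int, n: int) -> float:
    return (k + 0.5) / (n + 1.0)      # posterior mean under a Beta(1/2, 1/2) prior

n = 200
print(mle(0, n), rule_of_three_upper(n), jeffreys_mean(0, n))
```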

3.
The Constrained Extremal Distribution Selection Method
Engineering design and policy formulation often involve the assessment of the likelihood of future events, commonly expressed through a probability distribution. Determination of these distributions is based, when possible, on observational data. Unfortunately, these data are often incomplete, biased, and/or incorrect. These problems are exacerbated when policy formulation involves the risk of extreme events—situations of low likelihood and high consequences. Usually, observational data simply do not exist for such events. Therefore, determination of probabilities that characterize extreme events must utilize all available knowledge, be it subjective or observational, so as to most accurately reflect the likelihood of such events. Extending previous work on the statistics of extremes, the Constrained Extremal Distribution Selection Method is a methodology that assists in the selection of probability distributions characterizing the risk of extreme events, using expert opinion to constrain the feasible values for the parameters that explicitly define a distribution. An extremal distribution is then "fit" to observational data, subject to the condition that the selected parameters do not violate any constraints. Using a random search technique (genetic algorithms), parameters that minimize a measure of fit between a hypothesized distribution and the observational data are estimated. The Constrained Extremal Distribution Selection Method is applied to a real-world policy problem faced by the U.S. Environmental Protection Agency. Selected distributions characterize the likelihood of extreme, fatal hazardous material accidents in the United States. These distributions are used to characterize the risk of large-scale accidents with numerous fatalities.
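A minimal sketch of the constrained-fit idea follows: an extremal (GEV) distribution is fit to data by minimizing a distance between model and empirical CDFs with a population-based random search, subject to expert-imposed parameter bounds. SciPy's differential evolution is used here as a stand-in for the genetic algorithm, and the data and bounds are hypothetical.

```python
# Sketch: fit an extremal (GEV) distribution by minimizing a fit measure with a
# population-based random search, subject to expert-imposed parameter bounds.
# Differential evolution stands in for the genetic algorithm; data and bounds
# are hypothetical.
import numpy as np
from scipy import stats
from scipy.optimize import differential_evolution

observed = stats.genextreme.rvs(c=-0.2, loc=10, scale=4, size=80, random_state=1)
observed.sort()
empirical_cdf = (np.arange(1, observed.size + 1) - 0.5) / observed.size

def misfit(params):
    c, loc, scale = params
    model_cdf = stats.genextreme.cdf(observed, c, loc=loc, scale=scale)
    return np.sum((model_cdf - empirical_cdf) ** 2)   # Cramer-von Mises-style measure

# Expert constraints on feasible parameter values, expressed as bounds.
bounds = [(-0.5, 0.0), (5.0, 15.0), (1.0, 10.0)]
result = differential_evolution(misfit, bounds, seed=2)
print("constrained GEV fit (shape, loc, scale):", result.x)
```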

4.
A Survey of Approaches for Assessing and Managing the Risk of Extremes
In this paper, we review methods for assessing and managing the risk of extreme events, where extreme events are defined to be rare, severe, and outside the normal range of experience of the system in question. First, we discuss several systematic approaches for identifying possible extreme events. We then discuss some issues related to risk assessment of extreme events, including what type of output is needed (e.g., a single probability vs. a probability distribution), and alternatives to the probabilistic approach. Next, we present a number of probabilistic methods. These include: guidelines for eliciting informative probability distributions from experts; maximum entropy distributions; extreme value theory; other approaches for constructing prior distributions (such as reference or noninformative priors); the use of modeling and decomposition to estimate the probability (or distribution) of interest; and bounding methods. Finally, we briefly discuss several approaches for managing the risk of extreme events, and conclude with recommendations and directions for future research.

5.
In their regulations, the U.S. Environmental Protection Agency and the U.S. Nuclear Regulatory Commission permit the omission of features, events, or processes with probabilities of <10⁻⁴ in 10⁴ yr (e.g., a constant frequency of <10⁻⁸ per yr) in assessments of the performance of radioactive waste disposal systems. Igneous intrusion (or "volcanism") of a geologic repository at Yucca Mountain for radioactive waste is one disruptive event that has a probability with a range of uncertainty that straddles this regulatory criterion and is considered directly in performance assessment calculations. A self-sustained nuclear chain reaction (or "criticality") is another potentially disruptive event to consider, although it has never been found to be important in evaluations of the efficacy of radioactive waste disposal since the early 1970s. The thesis of this article is that consideration of the joint event--volcanism and criticality--occurring in any 10,000-year period following closure can be eliminated from performance calculations at Yucca Mountain. The probability of the joint event must be less than the fairly well-accepted but low probability of volcanism. Furthermore, volcanism does not "remove" or "fail" existing hydrologic or geochemical constraints at Yucca Mountain that tend to prevent concentration of fissile material. Prior to general corrosion failure of waste packages, the mean release of fissile mass caused by a low-probability igneous intrusive event is so small that the probability of a critical event is remote, even for highly enriched spent nuclear fuel owned by the U.S. Department of Energy. After widespread failure of packages occurs, the probability of the joint event is less than the probability of criticality because of the very small influence of volcanism on the mean fissile mass release. Hence, volcanism plays an insignificant role in inducing criticality over any 10⁴-yr period. We also argue that the Oklo reactors serve as a natural analogue and provide a rough bound of less than 0.10 on the probability of criticality given favorable hydrologic or geochemical conditions on the scale of the repository. Because the product of this bound with the probability of volcanism represents the probability of the joint event, and the product is less than 10⁻⁴ in 10⁴ yr, consideration of the joint event can be eliminated from performance calculations.
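A minimal arithmetic check of the screening argument, using the regulatory threshold and the 0.10 criticality bound quoted above; the volcanism probability plugged in is only an illustrative placeholder.

```python
# Sketch: the screening arithmetic for the joint event. The 10^-4-per-10^4-yr
# threshold and the 0.10 criticality bound come from the abstract; the
# volcanism probability used here is an illustrative placeholder.
regulatory_threshold = 1e-4          # probability per 10,000-year period
p_volcanism_10kyr = 1e-4             # illustrative: at/near the screening level
p_criticality_given_conditions = 0.10

p_joint = p_volcanism_10kyr * p_criticality_given_conditions
print(p_joint, p_joint < regulatory_threshold)   # 1e-05 True -> joint event screened out
```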

6.
This study examines how exploiting biases in probability judgment can enhance deterrence using a fixed allocation of defensive resources. We investigate attacker anchoring heuristics for conjunctive events with missing information to distort attacker estimates of success for targets with equal defensive resources. We designed and conducted a behavioral experiment functioning as an analog cyber attack with multiple targets requiring three stages of attack to successfully acquire a target. Each stage is associated with a probability of successfully attacking a layer of defense, reflecting the allocation of resources for each layer. There are four types of targets that have nearly equal likelihood of being successfully attacked, including one type with equally distributed success probabilities over every layer and three types with success probabilities that are concentrated to be lowest in the first, second, or third layer. Players are incentivized by a payoff system that offers a reward for successfully attacked targets and a penalty for failed attacks. We collected data from a total of 1,600 separate target selections from 80 players and discovered that the target type with the lowest probability of success on the first layer was least preferred among attackers, providing the greatest deterrent. Targets with equally distributed success probabilities across layers were the next least preferred among attackers, indicating greater deterrence for uniform-layered defenses compared to defenses that are concentrated at the inner (second or third) levels. This finding is consistent with both attacker anchoring and ambiguity biases and an interpretation of failed attacks as near misses.

7.
Risk Analysis, 2018, 38(8): 1576-1584
Fault trees are used in reliability modeling to create logical models of fault combinations that can lead to undesirable events. The output of a fault tree analysis (the top event probability) is expressed in terms of the failure probabilities of basic events that are input to the model. Typically, the basic event probabilities are not known exactly, but are modeled as probability distributions; therefore, the top event probability is also represented as an uncertainty distribution. Monte Carlo methods are generally used for evaluating the uncertainty distribution, but such calculations are computationally intensive and do not readily reveal the dominant contributors to the uncertainty. In this article, a closed-form approximation for the fault tree top event uncertainty distribution is developed, which is applicable when the uncertainties in the basic events of the model are lognormally distributed. The results of the approximate method are compared with results from two sampling-based methods: namely, the Monte Carlo method and the Wilks method based on order statistics. It is shown that the closed-form expression can provide a reasonable approximation to results obtained by Monte Carlo sampling, without incurring the computational expense. The Wilks method is found to be a useful means of providing an upper bound for the percentiles of the uncertainty distribution while being computationally inexpensive compared with full Monte Carlo sampling. The lognormal approximation method and the Wilks method appear to be attractive, practical alternatives for the evaluation of uncertainty in the output of fault trees and similar multilinear models.
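The comparison described above can be sketched as follows: lognormal basic-event uncertainties are propagated through a small, hypothetical fault tree by Monte Carlo, and the 95th percentile is compared with a Wilks one-sided 95/95 bound obtained from just 59 samples. The article's closed-form lognormal approximation itself is not reproduced.

```python
# Sketch: propagate lognormal basic-event uncertainties through a small fault
# tree (TOP = A AND (B OR C), independent events) by Monte Carlo, and compare
# the 95th percentile with a Wilks one-sided 95/95 bound (maximum of 59
# samples). The tree and lognormal parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(3)

def sample_top(n):
    # Basic event probabilities with lognormal uncertainty, parameterized by a
    # median and an error factor EF (sigma = ln(EF) / 1.645).
    pA = 1e-3 * np.exp(rng.normal(0.0, np.log(3.0) / 1.645, n))
    pB = 2e-2 * np.exp(rng.normal(0.0, np.log(5.0) / 1.645, n))
    pC = 5e-3 * np.exp(rng.normal(0.0, np.log(5.0) / 1.645, n))
    return pA * (pB + pC - pB * pC)           # P(A) * P(B or C)

full_mc = sample_top(100_000)
wilks = sample_top(59)                        # 59 samples: one-sided 95/95 Wilks bound
print("MC 95th percentile:", np.quantile(full_mc, 0.95))
print("Wilks 95/95 upper bound:", wilks.max())
```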

8.
This article discusses the methodologies presently available for analyzing the contribution of "external initiators" to overall risks in the context of PRA (probabilistic risk assessment) of large commercial nuclear power reactors. "External initiators" include earthquakes, fires and floods inside the plant, external floods, high winds, aircraft, barge, and ship collisions, noxious or explosive gases offsite, and so on. These are in contrast to "internal initiators" such as active or passive plant equipment failures, human errors, and loss of electrical power. The ability to consider external initiators within PRA has undergone major advances in recent years. In general, uncertainties associated with the calculated risks from external initiators are much larger than those associated with internal initiators. The principal uncertainties lie with development of hazard curves (such as the frequency of occurrence of an event exceeding a given size: for example, the likelihood of a hurricane with winds exceeding 125 knots). For assessment of earthquakes, internal fires and floods, and high winds, the methodology is reasonably mature for qualitative assessment but not for quantitative application. The risks from other external initiators are generally considered to be low, either because of the very long recurrence time associated with the events or because the plants are judged to be well designed to withstand them.

9.
Building on the Ramsey–de Finetti idea of event exchangeability, we derive a characterization of probabilistic sophistication without requiring any of the various versions of monotonicity, continuity, or comparative likelihood assumptions imposed by Savage (1954), Machina and Schmeidler (1992), and Grant (1995). Our characterization identifies a unique and finitely-additive subjective probability measure over an algebra of events.

10.
Following the 2013 Chelyabinsk event, the risks posed by asteroids attracted renewed interest, from both the scientific and policy-making communities. It reminded the world that impacts from near-Earth objects (NEOs), while rare, have the potential to cause great damage to cities and populations. Point estimates of the risk (such as mean numbers of casualties) have been proposed, but because of the low-probability, high-consequence nature of asteroid impacts, these averages provide limited actionable information. While more work is needed to further refine its input distributions (e.g., NEO diameters), the probabilistic model presented in this article allows a more complete evaluation of the risk of NEO impacts because the results are distributions that cover the range of potential casualties. This model is based on a modularized simulation that uses probabilistic inputs to estimate probabilistic risk metrics, including those of rare asteroid impacts. Illustrative results of this analysis are presented for a period of 100 years. As part of this demonstration, we assess the effectiveness of civil defense measures in mitigating the risk of human casualties. We find that they are likely to be beneficial but not a panacea. We also compute the probability—but not the consequences—of an impact with global effects ("cataclysm"). We conclude that there is a continued need for NEO observation, and for analyses of the feasibility and risk-reduction effectiveness of space missions designed to deflect or destroy asteroids that threaten the Earth.

11.
Risk Analysis, 2018, 38(8): 1534-1540
An extreme space weather event has the potential to disrupt or damage infrastructure systems and technologies that many societies rely on for economic and social well-being. Space weather events occur regularly, but extreme events are less frequent, with a small number of historical examples over the last 160 years. During the past decade, published works have (1) examined the physical characteristics of the extreme historical events and (2) discussed the probability or return rate of select extreme geomagnetic disturbances, including the 1859 Carrington event. Here we present initial findings on a unified framework approach to visualize space weather event probability, using a Bayesian model average, in the context of historical extreme events. We present disturbance storm time (Dst) probability (a proxy for geomagnetic disturbance intensity) across multiple return periods and discuss parameters of interest to policymakers and planners in the context of past extreme space weather events. We discuss the current state of these analyses, their utility to policymakers and planners, the current limitations when compared to other hazards, and several gaps that need to be filled to enhance space weather risk assessments.

12.
Probability elicitation protocols are used to assess and incorporate subjective probabilities in risk and decision analysis. While most of these protocols use methods that have focused on the precision of the elicited probabilities, the speed of the elicitation process has often been neglected. However, speed is also important, particularly when experts need to examine a large number of events on a recurrent basis. Furthermore, most existing elicitation methods are numerical in nature, but there are various reasons why an expert would refuse to give such precise ratio-scale estimates, even if highly numerate. This may occur, for instance, when there is lack of sufficient hard evidence, when assessing very uncertain events (such as emergent threats), or when dealing with politicized topics (such as terrorism or disease outbreaks). In this article, we adopt an ordinal ranking approach from multicriteria decision analysis to provide a fast and nonnumerical probability elicitation process. Probabilities are subsequently approximated from the ranking by an algorithm based on the principle of maximum entropy, a rule compatible with the ordinal information provided by the expert. The method can elicit probabilities for a wide range of different event types, including new ways of eliciting probabilities for stochastically independent events and low-probability events. We use a Monte Carlo simulation to test the accuracy of the approximated probabilities and try the method in practice, applying it to a real-world risk analysis recently conducted for DEFRA (the U.K. Department for Environment, Food and Rural Affairs): the prioritization of animal health threats.

13.
Probabilistic risk analyses often construct multistage chance trees to estimate the joint probability of compound events. If random measurement error is associated with some or all of the estimates, we show that resulting estimates of joint probability may be highly skewed. Joint probability estimates based on the analysis of multistage chance trees are more likely than not to be below the true probability of adverse events, but will sometimes substantially overestimate them. In contexts such as insurance markets for environmental risks, skewed distributions of risk estimates amplify the "winner's curse," so that the estimated risk premium for low-probability events is likely to be lower than the normative value. Skewness may result even in unbiased estimators of expected value from simple lotteries, if measurement error is associated with both the probability and payoff terms. Further, skewness may occur even if the error associated with these two estimates is symmetrically distributed. Under certain circumstances, skewed estimates of expected value may result in risk-neutral decisionmakers exhibiting a tendency to choose a certainty equivalent over a lottery of equal expected value, or vice versa. We show that when distributions of estimates of expected value are positively skewed, under certain circumstances it will be optimal to choose lotteries with nominal values lower than the value of apparently superior certainty equivalents. Extending the previous work of Goodman (1960), we provide an exact formula for the skewness of products.
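A quick Monte Carlo check of the skewness claim: with unbiased, symmetrically distributed errors on both the probability and the payoff, the product (the expected-value estimate) comes out positively skewed. The error magnitudes below are hypothetical.

```python
# Sketch: even with unbiased, symmetric measurement errors on both the
# probability and the payoff, the distribution of their product (the
# expected-value estimate) is positively skewed. Error magnitudes are
# hypothetical.
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(4)
p_true, payoff_true = 0.01, 1_000.0

p_hat = p_true * (1.0 + rng.normal(0.0, 0.3, 1_000_000))        # symmetric relative error
payoff_hat = payoff_true * (1.0 + rng.normal(0.0, 0.3, 1_000_000))
ev_hat = p_hat * payoff_hat

print("mean (unbiased):", ev_hat.mean())        # ~ p_true * payoff_true = 10
print("skewness of product:", skew(ev_hat))     # > 0: positively skewed
```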

14.
Scour (localized erosion by water) is an important risk to bridges, and hence many infrastructure networks, around the world. In Britain, scour has caused the failure of railway bridges crossing rivers in more than 50 flood events. These events have been investigated in detail, providing a data set with which we develop and test a model to quantify scour risk. The risk analysis is formulated in terms of a generic, transferable infrastructure network risk model. For some bridge failures, the severity of the causative flood was recorded or can be reconstructed. These data are combined with the background failure rate, and records of bridges that have not failed, to construct fragility curves that quantify the failure probability conditional on the severity of a flood event. The fragility curves generated are to some extent sensitive to the way in which these data are incorporated into the statistical analysis. The new fragility analysis is tested using flood events simulated from a spatial joint probability model for extreme river flows for all river gauging sites in Britain. The combined models appear robust in comparison with historical observations of the expected number of bridge failures in a flood event. The analysis is used to estimate the probability of single or multiple bridge failures in Britain's rail network. Combined with a model for passenger journey disruption in the event of bridge failure, we calculate a system-wide estimate for the risk of scour failures in terms of passenger journey disruptions and associated economic costs.
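A hedged sketch of the fragility-curve idea: the probability of bridge failure conditional on flood severity is estimated by logistic regression on hypothetical failure/non-failure records; the article's own statistical formulation may differ.

```python
# Sketch: estimate a bridge-scour fragility curve, P(failure | flood severity),
# by logistic regression on hypothetical failure/non-failure records. The data,
# covariate, and model form are illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)

# Hypothetical records: flood severity as a return period (years) and a 0/1
# flag for whether the bridge failed in that event.
severity = rng.uniform(1, 200, size=400)
p_fail_true = 1.0 / (1.0 + np.exp(-(0.03 * severity - 4.0)))
failed = rng.binomial(1, p_fail_true)

model = LogisticRegression().fit(severity.reshape(-1, 1), failed)

# Fragility curve evaluated at a few return periods.
grid = np.array([10.0, 50.0, 100.0, 200.0])
frag = model.predict_proba(grid.reshape(-1, 1))[:, 1]
print(dict(zip(grid, frag.round(3))))
```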

15.
We present a solar-centric approach to estimating the probability of extreme coronal mass ejections (CMEs) using the Solar and Heliospheric Observatory (SOHO)/Large Angle and Spectrometric Coronagraph Experiment (LASCO) CME Catalog observations updated through May 2018 and an updated list of near-Earth interplanetary coronal mass ejections (ICMEs). We examine robust statistical approaches to the estimation of extreme events. We then assume a variety of time-independent distributions, fitting and comparing the different probability distributions to the relevant regions of the cumulative distributions of the observed CME speeds. Using these results, we obtain by extrapolation the probability that the velocity of a CME exceeds a particular threshold. We conclude that about 1.72% of the CMEs recorded with SOHO LASCO arrived at the Earth over the time both data sets overlap (November 1996 to September 2017). Then, assuming that 1.72% of all CMEs pass the Earth, we can obtain a first-order estimate of the probability of an extreme space weather event on Earth. To estimate the probability of a CME over the next decade, we fit a Poisson distribution to the complementary cumulative distribution function. We infer a decadal probability of between 0.01 and 0.09 for an event of at least the size of the large 2012 event, and a probability of between 0.0002 and 0.016 for an event the size of the 1859 Carrington event.
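The decadal-probability step can be illustrated with one line of Poisson arithmetic: given an annual occurrence rate for an extreme CME, the probability of at least one event in a decade is 1 - exp(-rate x 10). The rate below is a placeholder chosen only to show the calculation.

```python
# Sketch: converting an annual occurrence rate for an extreme CME into the
# probability of at least one such event in a decade, assuming a Poisson
# process. The annual rate is a hypothetical placeholder.
import math

annual_rate = 0.005                      # hypothetical: one event per ~200 years
decade = 10.0
p_at_least_one = 1.0 - math.exp(-annual_rate * decade)
print(round(p_at_least_one, 3))          # ~0.049, within the 0.01-0.09 range cited above
```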

16.
This article focuses on conceptual and methodological developments allowing the integration of physical and social dynamics leading to model forecasts of circumstance-specific human losses during a flash flood. To reach this objective, a random forest classifier is applied to assess the likelihood of fatality occurrence for a given circumstance as a function of representative indicators. Here, vehicle-related circumstance is chosen as the literature indicates that most fatalities from flash flooding fall in this category. A database of flash flood events, with and without human losses from 2001 to 2011 in the United States, is supplemented with other variables describing the storm event, the spatial distribution of the sensitive characteristics of the exposed population, and built environment at the county level. The catastrophic flash floods of May 2015 in the states of Texas and Oklahoma are used as a case study to map the dynamics of the estimated probabilistic human risk on a daily scale. The results indicate the importance of time- and space-dependent human vulnerability and risk assessment for short-fuse flood events. The need for more systematic human impact data collection is also highlighted to advance impact-based predictive models for flash flood casualties using machine-learning approaches in the future.
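A minimal sketch of the classification step, assuming hypothetical indicator names and synthetic data rather than the article's database: a random forest estimates the likelihood of a fatality for a vehicle-related flash flood circumstance.

```python
# Sketch: classify whether a vehicle-related flash flood circumstance results
# in a fatality, using a random forest on county/event indicators. Feature
# names and data are hypothetical placeholders, not the article's database.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(6)
n = 2_000
X = pd.DataFrame({
    "peak_rainfall_mm": rng.gamma(2.0, 30.0, n),
    "event_duration_hr": rng.uniform(0.5, 12.0, n),
    "pct_mobile_homes": rng.uniform(0.0, 0.3, n),
    "road_density_km_per_km2": rng.uniform(0.1, 5.0, n),
    "nighttime_event": rng.integers(0, 2, n),
})
# Hypothetical label: fatality occurrence in a vehicle-related circumstance.
logit = 0.02 * X["peak_rainfall_mm"] + 1.0 * X["nighttime_event"] - 4.0
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)

# Estimated likelihood of fatality occurrence for held-out events.
print("mean predicted likelihood:", clf.predict_proba(X_te)[:, 1].mean())
```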

17.
Uncertainty about Probability: A Decision Analysis Perspective
The issue of how to think about "uncertainty about probability" is framed and analyzed from the viewpoint of a decision analyst. The failure of nuclear power plants is used as an example. The key idea is to think of probability as describing a state of information on an uncertain event, and to pose the issue of uncertainty in this quantity as uncertainty about a number that would be definitive: it has the property that you would assign it as the probability if you knew it. Logical consistency requires that the probability to assign to a single occurrence in the absence of further information be the mean of the distribution of this definitive number, not the median as is sometimes suggested. Any decision that must be made without the benefit of further information must also be made using the mean of the definitive number's distribution. With this formulation, we find further that the probability of r occurrences in n exchangeable trials will depend on the first n moments of the definitive number's distribution. In making decisions, the expected value of clairvoyance on the occurrence of the event must be at least as great as that on the definitive number. If one of the events in question occurs, then the increase in probability of another such event is readily computed. This means, in terms of coin tossing, that unless one is absolutely sure of the fairness of a coin, seeing a head must increase the probability of heads, contrary to common intuition. A numerical example for nuclear power shows that the failure of one plant in a group with a low probability of failure can significantly increase the probability that must be assigned to failure of a second plant in the group.
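The coin-tossing claim can be checked with a two-line Beta-Binomial example: under any non-degenerate Beta prior on the coin's bias, the predictive probability of heads rises after a head is observed. The prior parameters below are illustrative.

```python
# Sketch: Beta-Binomial check of the claim above. With a non-degenerate
# Beta(a, b) prior on the coin's bias, the predictive probability of heads
# rises after a head is observed; only a point-mass ("absolutely sure") belief
# leaves it unchanged. The prior parameters are illustrative.
def p_heads(a: float, b: float) -> float:
    return a / (a + b)                  # predictive P(head) = mean of the Beta belief

a, b = 2.0, 2.0                         # uncertain belief centered on fairness
print(p_heads(a, b))                    # 0.5 before any toss
print(p_heads(a + 1.0, b))              # 0.6 after observing one head
```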

18.
We investigate the regional economic consequences of a hypothetical catastrophic event—attack via radiological dispersal device (RDD)—centered on the downtown Los Angeles area. We distinguish two routes via which such an event might affect regional economic activity: (i) reduction in effective resource supply (the resource loss effect) and (ii) shifts in the perceptions of economic agents (the behavioral effect). The resource loss effect relates to the physical destructiveness of the event, while the behavioral effect relates to changes in fear and risk perception. Both affect the size of the regional economy. RDD detonation causes little capital damage and few casualties, but generates substantial short-run resource loss via business interruption. Changes in fear and risk perception increase the supply cost of resources to the affected region, while simultaneously reducing demand for goods produced in the region. We use results from a nationwide survey, tailored to our RDD scenario, to inform our model values for behavioral effects. Survey results, supplemented by findings from previous research on stigmatized asset values, suggest that in the region affected by the RDD, households may require higher wages, investors may require higher returns, and customers may require price discounts. We show that because behavioral effects may have lingering long-term deleterious impacts on both the supply-cost of resources to a region and willingness to pay for regional output, they can generate changes in regional gross domestic product (GDP) much greater than those generated by resource loss effects. Implications for policies that have the potential to mitigate these effects are discussed.

19.
We describe a quantitative methodology to characterize the vulnerability of U.S. urban centers to terrorist attack, using a place-based vulnerability index and a database of terrorist incidents and related human casualties. Via generalized linear statistical models, we study the relationships between vulnerability and terrorist events, and find that our place-based vulnerability metric significantly describes both terrorist incidence and occurrence of human casualties from terrorist events in these urban centers. We also introduce benchmark analytic technologies from applications in toxicological risk assessment to this social risk/vulnerability paradigm, and use these to distinguish levels of high and low urban vulnerability to terrorism. It is seen that the benchmark approach translates quite flexibly from its biological roots to this social scientific archetype.
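A hedged sketch of the modeling step, using synthetic data and a Poisson regression as one concrete instance of a generalized linear model relating a vulnerability index to incident counts; the study's actual model specification may differ.

```python
# Sketch: a generalized linear model (Poisson regression) relating a
# place-based vulnerability index to counts of terrorist incidents per urban
# center. The data are synthetic placeholders, not the study's database.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n_cities = 120
vulnerability = rng.uniform(0.0, 1.0, n_cities)          # hypothetical index, 0-1
incidents = rng.poisson(np.exp(-1.0 + 2.0 * vulnerability))

X = sm.add_constant(vulnerability)
glm = sm.GLM(incidents, X, family=sm.families.Poisson()).fit()
print(glm.params)      # intercept and vulnerability coefficient (log-rate scale)
```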

20.
The Homeland Security Act mandates the development of a national, risk-based system to support planning for, response to, and recovery from emergency situations involving large-scale toxic exposures. To prepare for and manage consequences effectively, planners and responders need not only to identify zones of potentially elevated individual risk but also to predict expected casualties. Emergency response support systems now define "consequences" by mapping areas in which toxic chemical concentrations do or may exceed Acute Exposure Guideline Levels (AEGLs) or similar guidelines. However, because AEGLs do not estimate expected risks, current unqualified claims that such maps support consequence management are misleading. Intentionally protective, AEGLs incorporate various safety/uncertainty factors depending on the scope and quality of chemical-specific toxicity data. Some of these factors are irrelevant, and others need to be modified, whenever resource constraints or exposure-scenario complexities require responders to make critical trade-off (triage) decisions in order to minimize expected casualties. AEGL-exceedance zones cannot consistently be aggregated, compared, or used to calculate expected casualties and so may seriously misguide emergency response triage decisions. Methods and tools well established and readily available to support environmental health protection are not yet developed for chemically-related environmental health triage. Effective triage decisions involving chemical risks require a new assessment approach that focuses on best estimates of likely casualties, rather than on upper plausible bounds of individual risk. If risk-based consequence management is to become a reality, federal agencies tasked with supporting emergency response must actively coordinate to foster new methods that can support effective environmental health triage.
