Similar Literature
20 similar documents found
1.
Operational risk management of autonomous vehicles in extreme environments depends heavily on expert judgments and, in particular, on judgments of the likelihood that a failure mitigation action, via correction and prevention, will annul the consequences of a specific fault. However, extant research has not examined the reliability of experts in estimating the probability of failure mitigation. For systems operating in extreme environments, the probability of failure mitigation is taken as a proxy for the probability of a fault not reoccurring. Using a priori expert judgments for an autonomous underwater vehicle mission in the Arctic and a posteriori mission field data, we developed a generalized linear model that enabled us to investigate this relationship. We found that the probability of failure mitigation alone cannot be used as a proxy for the probability of a fault not reoccurring. We conclude that it is also essential to include the effort to implement the failure mitigation when estimating the probability of a fault not reoccurring. The effort is the time taken by a person (measured in person-months) to execute the task required to implement the fault correction action. We show that once a modicum of operational data is obtained, it is possible to define a generalized linear logistic model to estimate the probability of a fault not reoccurring. We discuss how our findings are important to all autonomous vehicle operations and how similar operations can benefit from revising expert judgments of risk mitigation to take account of the effort required to reduce key risks.
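A minimal sketch of the kind of model described above, with simulated data standing in for the mission records (variable names and coefficients are ours, purely hypothetical): a logistic GLM relating whether a fault reoccurred to the expert-judged mitigation probability and the implementation effort in person-months.

```python
# Minimal sketch (hypothetical data): a logistic GLM relating whether a fault
# reoccurred to the expert-judged mitigation probability and the
# implementation effort in person-months, as the abstract describes.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 40
p_mitigation = rng.uniform(0.3, 0.99, n)   # expert-judged probability of mitigation
effort = rng.gamma(2.0, 1.5, n)            # person-months spent on the corrective action
# Simulated "truth": effort matters in addition to the judged probability
logit = -1.0 + 2.0 * p_mitigation + 0.6 * effort
not_reoccurred = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([p_mitigation, effort]))
fit = sm.GLM(not_reoccurred, X, family=sm.families.Binomial()).fit()
print(fit.params)   # intercept, mitigation-probability, and effort coefficients
```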

2.
Swati Agiwal, Risk Analysis, 2012, 32(8): 1309-1325
In the aftermath of 9/11, concern over security increased dramatically in both the public and the private sector. Yet no clear algorithm exists to inform firms on the amount and the timing of security investments to mitigate the impact of catastrophic risks. The goal of this article is to devise an optimal investment strategy for firms to mitigate exposure to catastrophic risks, focusing on how much to invest and when to invest. The latter question addresses whether postponing a risk-mitigating decision is an optimal strategy. Accordingly, we develop and estimate both a one-period model and a multiperiod model within the framework of extreme value theory (EVT). We calibrate these models using probability measures for catastrophic terrorism risks associated with attacks on the food sector. We then compare our findings with the purchase of catastrophic risk insurance.
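A rough, self-contained illustration of the one-period invest-now-versus-postpone comparison under an EVT tail; every number below (losses, threshold, investment cost, mitigation effect) is a made-up placeholder rather than the article's calibration.

```python
# Rough sketch (all numbers hypothetical) of a one-period invest-vs-postpone
# rule where the catastrophic-loss probability comes from a generalized
# Pareto (EVT) fit to loss exceedances over a high threshold.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(1)
losses = rng.pareto(2.5, 5000) * 10.0          # synthetic historical losses
u = np.quantile(losses, 0.95)                  # POT threshold
exceed = losses[losses > u] - u
xi, _, beta = genpareto.fit(exceed, floc=0.0)  # tail shape and scale

catastrophic = 200.0                           # loss level deemed catastrophic
p_cat = np.mean(losses > u) * genpareto.sf(catastrophic - u, xi, scale=beta)

invest_cost, mitigation = 0.05, 0.6            # security spend; fraction of loss averted
benefit = p_cat * catastrophic * mitigation    # expected one-period loss averted
decision = "invest now" if benefit > invest_cost else "postpone"
print(f"P(loss > {catastrophic}) = {p_cat:.2e}; expected benefit = {benefit:.3f} -> {decision}")
```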

3.
The dose to human and nonhuman individuals inflicted by anthropogenic radiation is an important issue in international and domestic policy. The current paradigm for nonhuman populations asserts that if the dose to the maximally exposed individuals in a population is below a certain criterion (e.g., <10 mGy d⁻¹), then the population is adequately protected. Currently, there is no consensus in the regulatory community as to the best statistical approach. Statistics currently considered include the maximum likelihood estimator for the 95th percentile of the sample mean and the sample maximum. Recently, the investigators have proposed the use of the maximum likelihood estimate of a very high quantile as an estimate of dose to the maximally exposed individual. In this study, we compare all of the above-mentioned statistics to an estimate based on extreme value theory. To determine and compare the bias and variance of these statistics, we use Monte Carlo simulation techniques, in a procedure similar to a parametric bootstrap. Our results show that a statistic based on extreme value theory has the least bias of those considered here, but requires reliable estimates of the population size. We recommend establishing the criterion based on what would be considered acceptable if only a small percentage of the population exceeded the limit, and hence recommend using the maximum likelihood estimator of a high quantile in the case that reliable estimates of the population size are not available.
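A compact sketch of the parametric-bootstrap comparison the study describes, assuming a lognormal dose distribution and a known population size; the distribution and its parameters are illustrative choices, not the study's data.

```python
# Parametric-bootstrap sketch: compare the bias of a high-quantile MLE and the
# sample maximum as estimators of the maximally exposed individual's dose.
# Lognormal doses and N are assumptions for illustration only.
import numpy as np
from scipy.stats import lognorm, norm

rng = np.random.default_rng(2)
N = 500                                        # assumed (known) population size
mu, sigma = 0.0, 0.5                           # lognormal parameters (log scale)
# Median of the maximum of N i.i.d. doses: F^{-1}(0.5^{1/N})
target = lognorm.ppf(0.5 ** (1 / N), sigma, scale=np.exp(mu))

est_quantile, est_max = [], []
for _ in range(2000):                          # bootstrap replicates
    x = rng.lognormal(mu, sigma, N)
    m, s = np.log(x).mean(), np.log(x).std(ddof=1)        # lognormal fit (approx. MLE)
    est_quantile.append(np.exp(m + s * norm.ppf(0.999)))  # 99.9% quantile estimate
    est_max.append(x.max())                               # sample maximum

print("bias of 99.9% quantile MLE:", np.mean(est_quantile) - target)
print("bias of sample maximum:   ", np.mean(est_max) - target)
```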

4.
This paper introduces behavioral finance into extreme value theory, using a marked self-exciting point process (MSEPP) to capture the clustering and short-term dependence of extreme stock-index returns, and extends the homogeneous Poisson process underlying the traditional peaks-over-threshold model to a non-homogeneous one in order to study the impact of investor sentiment on extreme returns. Using a risk-appetite-index approach, a Chinese investor sentiment index (EMSI) is synthesized from the constituent stocks of the CSI 300, and an MSEPP-EMSI model is then built to forecast the probability of extreme-risk outbreaks in the CSI 300, Shanghai Composite, and Shenzhen Component indices and to compute their dynamic expected shortfall (ES). The empirical results show that short-run streaks of sharp index declines occur repeatedly in the Shanghai and Shenzhen markets, that extremely negative investor sentiment aggravates violent market swings, and that, once the impact of sentiment on extreme risk is taken into account, the MSEPP-EMSI model materially improves the accuracy of both extreme-risk probability forecasts and ES forecasts.
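For intuition, here is a minimal sketch of the self-exciting intensity that a marked self-exciting point process builds on; the parameter values and event times are invented for illustration and are not the paper's estimates.

```python
# Minimal sketch of a self-exciting (Hawkes-type) conditional intensity:
# a baseline rate plus exponentially decaying excitation from past extreme
# events. Parameters are illustrative, not estimated from market data.
import numpy as np

def hawkes_intensity(t, event_times, mu=0.1, alpha=0.5, beta=1.2):
    """Conditional intensity at time t given past extreme-event times."""
    past = event_times[event_times < t]
    return mu + alpha * np.sum(np.exp(-beta * (t - past)))

events = np.array([1.0, 1.3, 1.4, 5.0])   # hypothetical extreme-event times (days)
for t in (1.5, 3.0, 6.0):
    print(f"intensity at t={t}: {hawkes_intensity(t, events):.3f}")
```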

5.
We estimate the country-level risk of extreme wildfires defined by burned area (BA) for Mediterranean Europe and carry out a cross-country comparison. To this end, we use the European Forest Fire Information System (EFFIS) geospatial data from 2006 to 2019 to perform an extreme value analysis. More specifically, we apply a point process characterization of wildfire extremes using maximum likelihood estimation. By modeling covariates, we also evaluate potential trends and correlations with commonly known factors that drive or affect wildfire occurrence, such as the Fire Weather Index as a proxy for meteorological conditions, population density, land cover type, and seasonality. We find that the highest risk of extreme wildfires is in Portugal (PT), followed by Greece (GR), Spain (ES), and Italy (IT), with 10-year BA return levels of 50,338 ha, 33,242 ha, 25,165 ha, and 8,966 ha, respectively. Coupling our results with existing estimates of the monetary impact of large wildfires suggests expected losses of 162-439 million € (PT), 81-219 million € (ES), 41-290 million € (GR), and 18-78 million € (IT) for such 10-year return period events.

SUMMARY

We model the risk of extreme wildfires for Italy, Greece, Portugal, and Spain in the form of burned-area return levels, compare them, and estimate expected losses.
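As a hedged illustration of how a 10-year burned-area return level can be obtained from a threshold-exceedance fit (the paper itself uses a point-process likelihood on EFFIS records; the data and fire counts below are synthetic):

```python
# Sketch: 10-year burned-area return level from a peaks-over-threshold fit,
# using the standard GPD return-level formula. Data and the fires-per-year
# count are synthetic assumptions, not EFFIS records.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(3)
ba = rng.pareto(1.5, 2000) * 100.0            # synthetic burned areas (ha)
u = np.quantile(ba, 0.95)                     # high threshold
exc = ba[ba > u] - u
xi, _, beta = genpareto.fit(exc, floc=0.0)    # GPD shape and scale

events_per_year = 150                         # assumed mean number of fires/year
zeta_u = np.mean(ba > u)                      # exceedance rate over the threshold
T = 10                                        # return period in years
m = T * events_per_year                       # observations per return period
return_level = u + (beta / xi) * ((m * zeta_u) ** xi - 1)
print(f"{T}-year burned-area return level ~ {return_level:,.0f} ha")
```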

6.
Accurately measuring risk is the precondition for managing it effectively and the basis for investors' rational investment decisions; when extreme events occur frequently, however, traditional VaR methods struggle to measure stock-market risk accurately, a problem that extreme value theory handles well. This paper focuses on measuring the risk of China's stock market during the global financial crisis triggered by the 2007 U.S. subprime crisis. Taking into account the linkage of extreme events across global stock markets, we use the POT model from extreme value theory to fit a distribution directly to the tail data of the daily returns of the Shanghai Composite Index and then compute the risk measures VaR and CVaR. Comparing the risk measures before and after the crisis shows that, with the arrival of the financial crisis, a certain degree of risk in China's stock market was released.
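A small sketch of the POT-based VaR and CVaR computation described above, using simulated daily losses in place of Shanghai Composite returns; the threshold choice and confidence level are illustrative.

```python
# Sketch: VaR and CVaR (expected shortfall) from a POT/GPD tail fit to daily
# losses, using the standard McNeil-style formulas. Simulated data, not SSE
# returns; threshold and confidence level are illustrative choices.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(4)
losses = rng.standard_t(4, 4000) * 0.01       # synthetic daily losses
u = np.quantile(losses, 0.95)                 # POT threshold
exc = losses[losses > u] - u
xi, _, beta = genpareto.fit(exc, floc=0.0)
zeta = np.mean(losses > u)                    # threshold exceedance rate

q = 0.99
var_q = u + (beta / xi) * (((1 - q) / zeta) ** (-xi) - 1)   # POT VaR
cvar_q = var_q / (1 - xi) + (beta - xi * u) / (1 - xi)      # ES for a GPD tail (xi < 1)
print(f"VaR(99%) = {var_q:.4f}   CVaR(99%) = {cvar_q:.4f}")
```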

7.
If the food sector is attacked, the likely agents will be chemical, biological, or radionuclear (CBRN). We compiled a database of international terrorist/criminal activity involving such agents. Based on these data, we calculate the likelihood of a catastrophic event using extreme value methods. At present, the probability of an event leading to 5,000 casualties (fatalities and injuries) is between 0.1 and 0.3. However, pronounced nonstationary patterns within our data suggest that the "reoccurrence period" for such attacks is decreasing every year. Similarly disturbing trends are evident in a broader data set, which is nonspecific as to the methods or means of attack. While at present the likelihood of CBRN events is quite low, given an attack, the probability that it involves CBRN agents increases with the number of casualties. This is consistent with evidence of "heavy tails" in the distribution of casualties arising from CBRN events.

8.
A Survey of Approaches for Assessing and Managing the Risk of Extremes (cited by 8; 0 self-citations, 8 by others)
In this paper, we review methods for assessing and managing the risk of extreme events, where extreme events are defined to be rare, severe, and outside the normal range of experience of the system in question. First, we discuss several systematic approaches for identifying possible extreme events. We then discuss some issues related to risk assessment of extreme events, including what type of output is needed (e.g., a single probability vs. a probability distribution), and alternatives to the probabilistic approach. Next, we present a number of probabilistic methods. These include: guidelines for eliciting informative probability distributions from experts; maximum entropy distributions; extreme value theory; other approaches for constructing prior distributions (such as reference or noninformative priors); the use of modeling and decomposition to estimate the probability (or distribution) of interest; and bounding methods. Finally, we briefly discuss several approaches for managing the risk of extreme events, and conclude with recommendations and directions for future research.

9.
Catastrophe bonds, which emerged in the 1990s, are a new class of non-traditional risk-transfer instruments designed to hedge catastrophic property losses, and they have good development prospects in China. Observing that catastrophic risk events exhibit cyclical and irregular upward trends, this paper constructs a BDT process to describe the arrival of catastrophe risk and, using risk-neutral measure techniques under a stochastic interest-rate environment with doubly stochastic compound Poisson losses, derives a catastrophe bond pricing formula. Model parameters are then estimated and calibrated using LIBOR data and the PCS loss index published by the Insurance Services Office (ISO). Finally, numerical simulations examine how interest-rate risk and catastrophe risk affect catastrophe bond prices and verify the feasibility of the pricing model.
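The following Monte Carlo sketch conveys the pricing idea in simplified form: compound Poisson losses with a write-down trigger, discounted under a Vasicek short rate. It is a stand-in for, not a reproduction of, the paper's BDT arrival process and doubly stochastic loss specification; all parameters are invented.

```python
# Simplified Monte Carlo CAT-bond pricing sketch: compound Poisson losses with
# a principal write-down trigger, discounted under a Vasicek short rate.
# Dynamics and parameters are illustrative stand-ins, not the paper's model.
import numpy as np

rng = np.random.default_rng(6)
n_paths, T, dt = 5000, 1.0, 1 / 252
steps = int(T / dt)
kappa, theta, sigma_r, r0 = 0.5, 0.03, 0.01, 0.02   # Vasicek parameters
lam, mean_loss = 2.0, 60.0                          # Poisson rate, mean loss size
face, trigger = 100.0, 120.0                        # face value, aggregate-loss trigger

prices = np.empty(n_paths)
for i in range(n_paths):
    r, integral = r0, 0.0
    for _ in range(steps):                          # Euler scheme for the short rate
        integral += r * dt
        r += kappa * (theta - r) * dt + sigma_r * np.sqrt(dt) * rng.normal()
    n_events = rng.poisson(lam * T)                 # number of catastrophes in [0, T]
    total_loss = rng.exponential(mean_loss, n_events).sum()
    payoff = face if total_loss < trigger else 0.5 * face   # 50% write-down if triggered
    prices[i] = np.exp(-integral) * payoff          # discount along the rate path
print("CAT bond price ~", prices.mean())
```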

10.
Risk Analysis, 2018, 38(8): 1534-1540
An extreme space weather event has the potential to disrupt or damage infrastructure systems and technologies that many societies rely on for economic and social well-being. Space weather events occur regularly, but extreme events are less frequent, with a small number of historical examples over the last 160 years. During the past decade, published works have (1) examined the physical characteristics of the extreme historical events and (2) discussed the probability or return rate of select extreme geomagnetic disturbances, including the 1859 Carrington event. Here we present initial findings on a unified framework approach to visualize space weather event probability, using a Bayesian model average, in the context of historical extreme events. We present disturbance storm time (Dst) probability (a proxy for geomagnetic disturbance intensity) across multiple return periods and discuss parameters of interest to policymakers and planners in the context of past extreme space weather events. We discuss the current state of these analyses, their utility to policymakers and planners, the current limitations when compared to other hazards, and several gaps that need to be filled to enhance space weather risk assessments.

11.
Traditional EVT methods study threshold exceedances from a static perspective and ignore the information carried by the times at which extreme observations occur. This paper proposes, for the first time in the domestic literature, a non-homogeneous spatial dynamic extreme value theory (TD-EVT) that overcomes this shortcoming: time is taken into account on top of the extreme observations, and multiple explanatory variables are introduced so that the three parameters of the extreme value distribution become time-varying; building a dynamic spatial model with a two-dimensional Poisson process is a distinctive feature of the paper. Applying TD-EVT to the estimation of risk values under extreme conditions has considerable theoretical and practical significance for financial risk management and asset pricing.
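To make the time-varying idea concrete, here is a small sketch that fits a GEV whose location parameter depends linearly on a covariate by maximum likelihood; the setup and parameter names are ours, not the paper's TD-EVT specification.

```python
# Illustrative sketch of time-varying extreme value modeling: a GEV whose
# location parameter is linear in a covariate, fit by maximum likelihood.
# Setup and parameters are assumptions, not the paper's specification.
# Note: scipy's genextreme uses shape c = -xi.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import genextreme

rng = np.random.default_rng(5)
t = np.linspace(0, 1, 300)                    # covariate (e.g., rescaled time)
x = genextreme.rvs(-0.1, loc=1.0 + 0.8 * t, scale=0.5, random_state=rng)

def negloglik(theta):
    b0, b1, log_sigma, xi = theta
    mu = b0 + b1 * t                          # time-varying location
    return -genextreme.logpdf(x, -xi, loc=mu, scale=np.exp(log_sigma)).sum()

fit = minimize(negloglik, x0=[1.0, 0.0, np.log(0.5), 0.1], method="Nelder-Mead")
b0, b1, log_sigma, xi = fit.x
print(f"b0={b0:.3f}, b1={b1:.3f}, sigma={np.exp(log_sigma):.3f}, xi={xi:.3f}")
```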

12.
Lu Jing, Journal of Industrial Engineering and Engineering Management, 2012, 26(3): 136-145
Although the Advanced Measurement Approach is favored by most commercial banks for its computational precision and its savings in regulatory capital, there is no consensus on which method best characterizes the low-frequency, high-severity tail of operational-risk data. Following the Basel Committee's principles for measuring operational risk, this paper conducts an empirical study of Chinese commercial banks' operational-risk data from 1990 to 2009, using the block maxima method with probability-weighted-moment parameter estimation. Both graphical and numerical tests show that the estimated parameters achieve a good fit to the tail distribution of operational-risk extremes, offering commercial banks a valuable reference for measuring operational-risk capital.
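A sketch of block maxima with probability-weighted-moment (PWM) estimation of the GEV parameters, using Hosking's closed-form estimators on simulated losses rather than the bank data analyzed in the paper:

```python
# Sketch: block maxima + probability-weighted-moment (PWM) estimation of GEV
# parameters via Hosking's estimators. Losses are simulated placeholders.
import numpy as np
from scipy.special import gamma as gamma_fn

rng = np.random.default_rng(7)
losses = rng.pareto(3.0, 20 * 250) * 1e6
maxima = np.sort(losses.reshape(20, 250).max(axis=1))   # 20 block (annual) maxima

n = len(maxima)
j = np.arange(1, n + 1)                                  # ranks of sorted maxima
b0 = maxima.mean()                                       # sample PWMs
b1 = np.sum((j - 1) / (n - 1) * maxima) / n
b2 = np.sum((j - 1) * (j - 2) / ((n - 1) * (n - 2)) * maxima) / n

# Hosking's approximation (k = -xi convention converted to xi)
c = (2 * b1 - b0) / (3 * b2 - b0) - np.log(2) / np.log(3)
xi = -(7.8590 * c + 2.9554 * c ** 2)
sigma = (2 * b1 - b0) * (-xi) / (gamma_fn(1 - xi) * (1 - 2 ** xi))
mu = b0 - sigma * (gamma_fn(1 - xi) - 1) / xi
print(f"GEV PWM estimates: xi={xi:.3f}, sigma={sigma:.3g}, mu={mu:.3g}")
```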

13.
Researchers and commissions contend that the risk of human extinction is high, but none of these estimates have been based upon a rigorous methodology suitable for estimating existential risks. This article evaluates several methods that could be used to estimate the probability of human extinction. Traditional methods evaluated include: simple elicitation; whole evidence Bayesian; evidential reasoning using imprecise probabilities; and Bayesian networks. Three innovative methods are also considered: influence modeling based on environmental scans; simple elicitation using extinction scenarios as anchors; and computationally intensive possible-worlds modeling. Evaluation criteria include: level of effort required by the probability assessors; level of effort needed to implement the method; ability of each method to model the human extinction event; ability to incorporate scientific estimates of contributory events; transparency of the inputs and outputs; acceptability to the academic community (e.g., with respect to intellectual soundness, familiarity, verisimilitude); credibility and utility of the outputs of the method to the policy community; difficulty of communicating the method's processes and outputs to nonexperts; and accuracy in other contexts. The article concludes by recommending that researchers assess the risks of human extinction by combining these methods.

14.
The analysis of risk-return tradeoffs and their practical applications to portfolio analysis paved the way for Modern Portfolio Theory (MPT), which won Harry Markowitz the 1990 Nobel Prize in Economics. A typical approach in measuring a portfolio's expected return is based on the historical returns of the assets included in a portfolio. On the other hand, portfolio risk is usually measured using volatility, which is derived from the historical variance-covariance relationships among the portfolio assets. This article focuses on assessing portfolio risk, with emphasis on extreme risks. To date, volatility is a major measure of risk owing to its simplicity and validity for relatively small asset price fluctuations. Volatility is a justified measure for stable market performance, but it is weak in addressing portfolio risk under aberrant market fluctuations. Extreme market crashes such as that on October 19, 1987 ("Black Monday") and catastrophic events such as the terrorist attack of September 11, 2001, which led to a four-day suspension of trading on the New York Stock Exchange (NYSE), are a few examples where measuring risk via volatility can lead to inaccurate predictions. Thus, there is a need for a more robust metric of risk. By invoking the principles of the extreme-risk-analysis method through the partitioned multiobjective risk method (PMRM), this article contributes to the modeling of extreme risks in portfolio performance. A measure of extreme portfolio risk, denoted by f(4), is defined as the conditional expectation for a lower-tail region of the distribution of the possible portfolio returns. This article presents a multiobjective problem formulation consisting of optimizing expected return and f(4), whose solution is determined using Evolver, a software package that implements a genetic algorithm. Under business-as-usual market scenarios, the results of the proposed PMRM portfolio selection model are found to be compatible with those of the volatility-based model. However, under extremely unfavorable market conditions, results indicate that f(4) can be a more valid measure of risk than volatility.
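A minimal numerical sketch of the extreme-risk metric: the conditional expectation of returns below a lower-tail partitioning quantile, computed on simulated portfolio returns (the 5% partition point is an illustrative choice):

```python
# Sketch of a PMRM-style extreme-risk metric: the conditional expected return
# in the lower tail below a partitioning quantile (denoted f(4) in the
# article). Returns are simulated; the 5% partition is an illustrative choice.
import numpy as np

rng = np.random.default_rng(8)
returns = rng.normal(0.0005, 0.01, 10_000) + rng.standard_t(3, 10_000) * 0.002

alpha = 0.05                                   # lower-tail partition
threshold = np.quantile(returns, alpha)
f4 = returns[returns <= threshold].mean()      # conditional lower-tail expectation
print(f"5% partition threshold = {threshold:.4f}, f(4) = {f4:.4f}")
print(f"volatility, for comparison = {returns.std():.4f}")
```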

15.
We study inference in structural models with a jump in the conditional density, where location and size of the jump are described by regression curves. Two prominent examples are auction models, where the bid density jumps from zero to a positive value at the lowest cost, and equilibrium job-search models, where the wage density jumps from one positive level to another at the reservation wage. General inference in such models remained a long-standing, unresolved problem, primarily due to nonregularities and computational difficulties caused by discontinuous likelihood functions. This paper develops likelihood-based estimation and inference methods for these models, focusing on optimal (Bayes) and maximum likelihood procedures. We derive convergence rates and distribution theory, and develop Bayes and Wald inference. We show that Bayes estimators and confidence intervals are attractive both theoretically and computationally, and that Bayes confidence intervals, based on posterior quantiles, provide a valid large sample inference method.

16.
The Constrained Extremal Distribution Selection Method (cited by 5; 0 self-citations, 5 by others)
Engineering design and policy formulation often involve the assessment of the likelihood of future events, commonly expressed through a probability distribution. Determination of these distributions is based, when possible, on observational data. Unfortunately, these data are often incomplete, biased, and/or incorrect. These problems are exacerbated when policy formulation involves the risk of extreme events (situations of low likelihood and high consequences), for which observational data usually do not exist. Therefore, determination of probabilities which characterize extreme events must utilize all available knowledge, be it subjective or observational, so as to most accurately reflect the likelihood of such events. Extending previous work on the statistics of extremes, the Constrained Extremal Distribution Selection Method is a methodology that assists in the selection of probability distributions that characterize the risk of extreme events, using expert opinion to constrain the feasible values of the parameters which explicitly define a distribution. An extremal distribution is then "fit" to observational data, conditional on the selected parameters not violating any constraints. Using a random search technique (genetic algorithms), parameters that minimize a measure of fit between a hypothesized distribution and observational data are estimated. The Constrained Extremal Distribution Selection Method is applied to a real-world policy problem faced by the U.S. Environmental Protection Agency. Selected distributions characterize the likelihood of extreme, fatal hazardous-material accidents in the United States. These distributions are used to characterize the risk of large-scale accidents with numerous fatalities.
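A hedged sketch of the idea: fit a GEV by minimizing a goodness-of-fit statistic subject to expert-imposed parameter bounds, with scipy's differential evolution standing in for the genetic algorithm used in the article.

```python
# Sketch of constrained extremal-distribution selection: minimize a distance
# measure between a hypothesized GEV and data, with expert-imposed parameter
# bounds. differential_evolution is a stand-in for the article's genetic
# algorithm; data and bounds are invented. (scipy's genextreme uses c = -xi.)
import numpy as np
from scipy.optimize import differential_evolution
from scipy.stats import genextreme, cramervonmises

rng = np.random.default_rng(11)
data = genextreme.rvs(-0.2, loc=10.0, scale=2.0, size=80, random_state=rng)

def misfit(theta):
    xi, mu, sigma = theta
    # Cramer-von Mises statistic as the measure of fit
    return cramervonmises(data, genextreme.cdf, args=(-xi, mu, sigma)).statistic

# Hypothetical expert constraints: heavy tail, plausible location and scale
bounds = [(0.0, 0.5), (5.0, 15.0), (0.5, 5.0)]
result = differential_evolution(misfit, bounds, seed=0, maxiter=50)
print("constrained fit (xi, mu, sigma):", result.x)
```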

17.
The U.S. federal government regulates the reliability of bulk power systems, while the reliability of power distribution systems is regulated at a state level. In this article, we review the history of regulating electric service reliability and study the existing reliability metrics, indices, and standards for power transmission and distribution networks. We assess the foundations of the reliability standards and metrics, discuss how they are applied to outages caused by large exogenous disturbances such as natural disasters, and investigate whether the standards adequately internalize the impacts of these events. Our reflections shed light on how existing standards conceptualize reliability, question the basis for treating large-scale hazard-induced outages differently from normal daily outages, and discuss whether this conceptualization maps well onto customer expectations. We show that the risk indices for transmission systems used in regulating power system reliability do not adequately capture the risks that transmission systems are prone to, particularly when it comes to low-probability high-impact events. We also point out several shortcomings associated with the way in which regulators require utilities to calculate and report distribution system reliability indices. We offer several recommendations for improving the conceptualization of reliability metrics and standards. We conclude that while the approaches taken in reliability standards have made considerable advances in enhancing the reliability of power systems and may be logical from a utility perspective during normal operation, existing standards do not provide a sufficient incentive structure for the utilities to adequately ensure high levels of reliability for end-users, particularly during large-scale events.

18.
This paper shows the existence of extreme types of zombie firm, i.e., companies with negative equity that continue to do business despite having lost their entire equity. We explain how these firms are measured and how the riskier ones are identified using different determinants. Using a Spanish sample from 2010 to 2014, an index called the EZIndex is developed that includes four dimensions of the extreme zombie problem: extension, contagion, recovery signs, and immediacy. The paper contributes to zombie theory, on the one hand, by developing a method for ranking zombie firms based on risks and changes over time and, on the other hand, by using a log-linear model to detect the riskiest corporate profiles among these risky firms. It demonstrates significant implications that need to be considered by the competent authorities, not only in terms of their impact as a whole but also in regard to the particular profile of extreme zombie firms: they are less regulated, large, and located in regions with a large business fabric.

19.
Given the heavy-tailed distribution of operational risk, and following the requirements of the Basel Accord, this paper uses POT extreme value models to estimate the marginal distributions of several operational-risk cells and then uses multivariate copula functions to capture the dependence among these cells and compute value at risk. An empirical analysis of Chinese commercial banks' operational-risk data from 1990 to 2010 shows that the Clayton copula best reflects the dependence structure among the operational-risk cells, and that the VaR computed with copula-based dependence is about 32.3% lower than the VaR obtained by simple summation. Applying copula functions to measure operational-risk dependence therefore not only improves estimation accuracy but also achieves portfolio diversification, reduces operational-risk capital requirements, and creates conditions for commercial banks to improve profitability.
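As an illustration of the copula-based aggregation, the sketch below samples two operational-risk cells joined by a Clayton copula and compares the diversified VaR with the simple sum of marginal VaRs; the marginals, copula parameter, and confidence level are invented, so the printed reduction will not match the paper's 32.3%.

```python
# Sketch: aggregate two operational-risk cells under a Clayton copula (sampled
# via the Marshall-Olkin gamma-frailty construction) and compare diversified
# 99.9% VaR with the simple sum of marginal VaRs. All parameters are invented.
import numpy as np

rng = np.random.default_rng(10)
theta, n = 2.0, 200_000                        # Clayton dependence, sample size

# Marshall-Olkin: V ~ Gamma(1/theta), U_i = (1 + E_i/V)^(-1/theta)
v = rng.gamma(1 / theta, 1.0, n)
e = rng.exponential(1.0, (2, n))
u = (1 + e / v) ** (-1 / theta)                # Clayton-dependent uniforms

# Heavy-tailed marginal losses via inverse-CDF (Pareto-type) transforms
loss1 = 1e5 * ((1 - u[0]) ** (-1 / 1.8) - 1)
loss2 = 5e4 * ((1 - u[1]) ** (-1 / 2.2) - 1)

q = 0.999
var_joint = np.quantile(loss1 + loss2, q)      # diversified (joint) VaR
var_sum = np.quantile(loss1, q) + np.quantile(loss2, q)
print(f"diversified VaR = {var_joint:,.0f}   simple-sum VaR = {var_sum:,.0f}")
print(f"reduction: {1 - var_joint / var_sum:.1%}")
```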

20.
Risk Analysis, 2018, 38(10): 2208-2221
Emergency risk communication (ERC) programs that activate when the ambient temperature is expected to cross certain extreme thresholds are widely used to manage relevant public health risks. In practice, however, the effectiveness of these thresholds has rarely been examined. The goal of this study is to test if the activation criteria based on extreme temperature thresholds, both cold and heat, capture elevated health risks for all-cause and cause-specific mortality and morbidity in the Minneapolis-St. Paul Metropolitan Area. A distributed lag nonlinear model (DLNM) combined with a quasi-Poisson generalized linear model is used to derive the exposure-response functions between daily maximum heat index and mortality (1998-2014) and morbidity (emergency department visits; 2007-2014). Specific causes considered include cardiovascular, respiratory, renal diseases, and diabetes. Six extreme temperature thresholds, corresponding to the 1st-3rd and 97th-99th percentiles of local exposure history, are examined. All six extreme temperature thresholds capture significantly increased relative risks for all-cause mortality and morbidity. However, the cause-specific analyses reveal heterogeneity. Extreme cold thresholds capture increased mortality and morbidity risks for cardiovascular and respiratory diseases, and extreme heat thresholds for renal disease. Percentile-based extreme temperature thresholds are appropriate for initiating ERC targeting the general population. Tailoring ERC by specific causes may protect some but not all individuals with health conditions exacerbated by hazardous ambient temperature exposure.
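As a simple illustration of percentile-based activation thresholds (the exposure series below is synthetic, not the Minneapolis-St. Paul heat-index record):

```python
# Sketch: percentile-based extreme temperature thresholds of the kind the
# study evaluates, computed from a synthetic daily maximum heat-index series.
import numpy as np

rng = np.random.default_rng(9)
days = 6200                                     # roughly 17 years of daily values
heat_index = (15 + 12 * np.sin(np.linspace(0, 2 * np.pi * 17, days))
              + rng.normal(0, 4, days))         # seasonal cycle plus noise

cold = {f"p{p}": round(np.percentile(heat_index, p), 1) for p in (1, 2, 3)}
heat = {f"p{p}": round(np.percentile(heat_index, p), 1) for p in (97, 98, 99)}
print("cold-activation thresholds:", cold)
print("heat-activation thresholds:", heat)
```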
