Similar Articles
 20 similar articles found (search time: 171 ms)
1.
Efficient Frontier Analysis of Investment Portfolios Based on a Copula-APD-GARCH Model   Cited by: 1 (self-citations: 0, others: 1)
Extending the Markowitz mean-variance portfolio model with the ES (expected shortfall) risk measure, this paper studies portfolio selection under a mean-ES criterion. An APD-GARCH model describes the return series of each risky asset, and a multivariate copula function captures the dependence structure among the assets, yielding a flexible Copula-APD-GARCH model. Using this model and Monte Carlo simulation, the mean-ES efficient frontiers of portfolios whose dependence structure is a multivariate normal copula, a multivariate t-copula, or a multivariate Clayton copula are derived and compared. The empirical results show that, over the range of efficient portfolios, the normal copula clearly overestimates portfolio risk; when the expected return is small, the t-copula yields the lowest risk, but as the expected return increases, the efficient frontier under the multivariate Clayton copula performs best.
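A minimal sketch of the Monte Carlo step, with a bivariate Gaussian copula and Student-t marginals standing in for the paper's fitted APD-GARCH marginals; the correlation, means, scales, and degrees of freedom below are illustrative assumptions, not values from the study:

```python
# Minimal sketch of the Monte Carlo step (illustrative only): a Gaussian copula
# with Student-t marginals stands in for the fitted APD-GARCH marginals.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_sim, alpha = 100_000, 0.95

# Step 1: dependent uniforms from a bivariate Gaussian copula (rho assumed).
rho = 0.6
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n_sim)
u = stats.norm.cdf(z)

# Step 2: map to marginal returns (hypothetical means, scales, and df).
mu, scale, df = np.array([0.05, 0.08]), np.array([0.15, 0.25]), 5
r = mu + scale * stats.t.ppf(u, df)

# Step 3: sweep two-asset weights and trace the mean-ES frontier.
for w in np.linspace(0.0, 1.0, 11):
    port = w * r[:, 0] + (1 - w) * r[:, 1]
    q = np.quantile(port, 1 - alpha)          # 5% return quantile (VaR level)
    es = -port[port <= q].mean()              # expected shortfall beyond VaR
    print(f"w={w:.1f}  mean={port.mean():.4f}  ES={es:.4f}")
```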

2.
This article presents a flood risk analysis model that considers the spatially heterogeneous nature of flood events. The basic concept of this approach is to generate a large sample of flood events that can be regarded as a temporal extrapolation of observed flood events. These are combined with cumulative flood impact indicators, such as building damages, to finally derive time series of damages for risk estimation. Therefore, a multivariate modeling procedure that is able to take into account the spatial characteristics of flooding, the regionalization method top-kriging, and three different impact indicators are combined in a model chain. Eventually, the expected annual flood impact (e.g., expected annual damages) and the flood impact associated with a low probability of occurrence are determined for a study area. The risk model has the potential to augment the understanding of flood risk in a region and thereby contribute to enhanced risk management by, for example, risk analysts, policymakers, or insurance companies. The modeling framework was successfully applied in a proof-of-concept exercise in Vorarlberg (Austria). The results of the case study show that risk analysis has to be based on spatially heterogeneous flood events in order to estimate flood risk adequately.
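The last step of the model chain reduces to estimating risk from a damage time series. A minimal sketch of that final step on invented data (the Pareto damage model below is purely an assumption for illustration):

```python
# Minimal sketch: risk estimation from a damage time series (invented data).
import numpy as np

rng = np.random.default_rng(1)
annual_damage = rng.pareto(2.5, size=10_000) * 1e6   # hypothetical damages (EUR)

ead = annual_damage.mean()                           # expected annual damage
q100 = np.quantile(annual_damage, 1 - 1 / 100)       # ~1-in-100-year damage level
print(f"EAD = {ead:,.0f} EUR; 100-year damage = {q100:,.0f} EUR")
```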

3.
In this article, we propose an integrated direct and indirect flood risk model for small- and large-scale flood events, allowing for dynamic modeling of total economic losses from a flood event to a full economic recovery. A novel approach is taken that translates direct losses of both capital and labor into production losses using the Cobb-Douglas production function, aiming at improved consistency in loss accounting. The recovery of the economy is modeled using a hybrid input-output model and applied to the port region of Rotterdam, using six different flood events (1/10 up to 1/10,000). This procedure gives better insight into the consequences of both high- and low-probability floods. The results show that in terms of expected annual damage, direct losses remain more substantial than the indirect losses (approximately 50% larger), but for low-probability events the indirect losses outweigh the direct losses. Furthermore, we explored parameter uncertainty using a global sensitivity analysis, and varied critical assumptions in the modeling framework related to, among others, flood duration and labor recovery, using a scenario approach. Our findings have two important implications for disaster modelers and practitioners. First, high-probability events are qualitatively different from low-probability events in terms of the scale of damages and the full recovery period. Second, there are substantial differences in parameter influence between high-probability and low-probability flood modeling. These findings suggest that a detailed approach is required when assessing the flood risk for a specific region.
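A minimal sketch of the Cobb-Douglas translation of direct capital and labor losses into production losses; the productivity, output elasticity, and loss fractions are illustrative assumptions, not values from the article:

```python
# Minimal sketch: direct capital/labor losses -> production loss via
# Cobb-Douglas, Y = A * K**alpha * L**(1 - alpha). All values are assumptions.
A, alpha = 1.0, 0.3                  # hypothetical productivity and elasticity

def output(K, L):
    return A * K**alpha * L**(1 - alpha)

K0, L0 = 100.0, 200.0                # pre-flood capital and labor stocks
k_loss, l_loss = 0.10, 0.05          # hypothetical direct loss fractions

production_loss = output(K0, L0) - output(K0 * (1 - k_loss), L0 * (1 - l_loss))
print(f"production loss per period: {production_loss:.3f}")
```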

4.
Scour (localized erosion by water) is an important risk to bridges, and hence many infrastructure networks, around the world. In Britain, scour has caused the failure of railway bridges crossing rivers in more than 50 flood events. These events have been investigated in detail, providing a data set with which we develop and test a model to quantify scour risk. The risk analysis is formulated in terms of a generic, transferable infrastructure network risk model. For some bridge failures, the severity of the causative flood was recorded or can be reconstructed. These data are combined with the background failure rate, and records of bridges that have not failed, to construct fragility curves that quantify the failure probability conditional on the severity of a flood event. The fragility curves generated are to some extent sensitive to the way in which these data are incorporated into the statistical analysis. The new fragility analysis is tested using flood events simulated from a spatial joint probability model for extreme river flows for all river gauging sites in Britain. The combined models appear robust in comparison with historical observations of the expected number of bridge failures in a flood event. The analysis is used to estimate the probability of single or multiple bridge failures in Britain's rail network. Combined with a model for passenger journey disruption in the event of bridge failure, we calculate a system-wide estimate for the risk of scour failures in terms of passenger journey disruptions and associated economic costs.
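A minimal sketch of one common way to build such a fragility curve, fitting a logistic model of failure probability against flood severity on synthetic failure/non-failure records; the paper's actual statistical treatment (including the background failure rate) is more involved:

```python
# Minimal sketch: a fragility curve P(failure | flood severity) fitted by
# logistic regression on synthetic failure/non-failure records.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
severity = rng.uniform(0.5, 3.0, size=500)      # e.g., flow relative to median flood
p_true = 1 / (1 + np.exp(-3.0 * (severity - 2.0)))   # hypothetical latent fragility
failed = rng.random(500) < p_true               # simulated failure records

curve = LogisticRegression().fit(severity.reshape(-1, 1), failed)
print(curve.predict_proba([[2.5]])[:, 1])       # estimated P(failure) at severity 2.5
```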

5.
Floods are a natural hazard evolving in space and time according to meteorological and river basin dynamics, so that a single flood event can affect different regions over the event duration. This physical mechanism introduces spatio-temporal relationships between flood records and losses at different locations over a given time window that should be taken into account for an effective assessment of the collective flood risk. However, since extreme floods are rare events, the limited number of historical records usually prevents a reliable frequency analysis. To overcome this limit, we move from the analysis of extreme events to the modeling of continuous stream flow records, preserving the spatio-temporal correlation structures of the entire process and making more efficient use of the information provided by continuous flow records. The approach is based on the dynamic copula framework, which allows the modeling of spatio-temporal properties to be split by coupling suitable time series models accounting for temporal dynamics with multivariate distributions describing spatial dependence. The model is applied to 490 stream flow sequences recorded across 10 of the largest river basins in central and eastern Europe (Danube, Rhine, Elbe, Oder, Weser, Meuse, Rhone, Seine, Loire, and Garonne). Using available proxy data to quantify local flood exposure and vulnerability, we show that the temporal dependence exerts a key role in reproducing interannual persistence, and thus the magnitude and frequency of annual proxy flood losses aggregated at a basin-wide scale, while copulas allow the preservation of the spatial dependence of losses at weekly and annual time scales.
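A minimal sketch of the split between temporal and spatial modeling, using an AR(1) model per site coupled through a Gaussian copula; two sites, the AR coefficient, the spatial correlation, and the lognormal flow marginal are all illustrative assumptions:

```python
# Minimal sketch: AR(1) temporal dynamics per site, coupled through a Gaussian
# copula for spatial dependence; two sites only, all parameters assumed.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
T, phi, rho = 520, 0.7, 0.8                     # weeks, AR(1) coeff., spatial corr.

eps = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=T)
x = np.zeros((T, 2))
for t in range(1, T):
    x[t] = phi * x[t - 1] + eps[t]              # temporal persistence at each site

# Standardize (stationary sd is 1/sqrt(1 - phi**2)), then map to lognormal flows.
u = stats.norm.cdf(x * np.sqrt(1 - phi**2))
flows = stats.lognorm.ppf(u, s=1.0)
print(np.corrcoef(flows, rowvar=False)[0, 1])   # spatial dependence is preserved
```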

6.
A Polynomial Copula Approach to the Dependence Structure of Markets   Cited by: 1 (self-citations: 1, others: 0)
Copula functions solve the problem of describing dependence between markets and provide a feasible way to construct multivariate joint distributions. Because fitting high-dimensional copula functions is relatively difficult, however, this paper expands a high-dimensional copula in Bernstein polynomials, for the first time building a multi-parameter linear model of the market dependence structure. The Shanghai and Shenzhen stock index series, after GARCH filtering, are then used as empirical data in the polynomial regression. The results verify that polynomial approximation is a feasible way to describe the dependence structure of markets.
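A minimal sketch of a Bernstein-polynomial expansion of a bivariate copula, with the empirical copula on a grid supplying coefficients that enter linearly, in the spirit of the paper's multi-parameter linear model; the grid order m and the sample data are illustrative assumptions:

```python
# Minimal sketch of a Bernstein expansion of a bivariate copula: the empirical
# copula on an (m+1) x (m+1) grid supplies coefficients that enter linearly.
import numpy as np
from scipy.special import comb
from scipy.stats import rankdata

def bernstein(k, m, u):
    return comb(m, k) * u**k * (1 - u)**(m - k)

def bernstein_copula(u, v, U, V, m=10):
    """Approximate C(u, v) from pseudo-observations U, V."""
    total = 0.0
    for i in range(m + 1):
        for j in range(m + 1):
            c_emp = np.mean((U <= i / m) & (V <= j / m))   # empirical copula
            total += c_emp * bernstein(i, m, u) * bernstein(j, m, v)
    return total

rng = np.random.default_rng(4)
x = rng.standard_normal(1000)
y = 0.6 * x + 0.8 * rng.standard_normal(1000)              # correlated sample
U, V = rankdata(x) / 1001, rankdata(y) / 1001              # pseudo-observations
print(bernstein_copula(0.5, 0.5, U, V))                    # vs. 0.25 if independent
```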

7.
The catastrophic nature of seismic risk is attributed to spatiotemporal correlation of seismic losses of buildings and infrastructure. For seismic risk management, such correlated seismic effects must be adequately taken into account, since they affect the probability distribution of aggregate seismic losses of spatially distributed structures significantly, and its upper tail behavior can be of particular importance. To investigate seismic loss dependence for two closely located portfolios of buildings, simulated seismic loss samples, which are obtained from a seismic risk model of spatially distributed buildings by taking spatiotemporally correlated ground motions into account, are employed. The characterization considers a loss frequency model that incorporates one dependent random component acting as a common shock to all buildings, and a copula‐based loss severity model, which facilitates the separate construction of marginal loss distribution functions and nonlinear copula function with upper tail dependence. The proposed method is applied to groups of wood‐frame buildings located in southwestern British Columbia. Analysis results indicate that the dependence structure of aggregate seismic losses can be adequately modeled by the right heavy tail copula or Gumbel copula, and that for the considered example, overall accuracy of the proposed method is satisfactory at probability levels of practical interest (at most 10% estimation error of fractiles of aggregate seismic loss). The developed statistical seismic loss model may be adopted in dynamic financial analysis for achieving faster evaluation with reasonable accuracy.
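A minimal sketch of the common-shock idea in the loss frequency model: each portfolio's annual event count is its own Poisson draw plus one shared Poisson shock, which induces positive dependence; all rates below are illustrative assumptions:

```python
# Minimal sketch of the common-shock frequency idea: each portfolio's annual
# event count is its own Poisson draw plus one shared Poisson shock.
import numpy as np

rng = np.random.default_rng(5)
n_years, lam_own, lam_shock = 100_000, 0.02, 0.01   # hypothetical rates

shock = rng.poisson(lam_shock, n_years)             # common to both portfolios
n1 = rng.poisson(lam_own, n_years) + shock
n2 = rng.poisson(lam_own, n_years) + shock
print(np.corrcoef(n1, n2)[0, 1])                    # positive frequency dependence
```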

8.
Risk of Extreme Events Under Nonstationary Conditions   Cited by: 5 (self-citations: 0, others: 5)
The concept of the return period is widely used in the analysis of the risk of extreme events and in engineering design. For example, a levee can be designed to protect against the 100-year flood, the flood which on average occurs once in 100 years. Use of the return period typically assumes that the probability of occurrence of an extreme event in the current or any future year is the same. However, there is evidence that potential climate change may affect the probabilities of some extreme events such as floods and droughts. In turn, this would affect the level of protection provided by the current infrastructure. For an engineering project, the risk of an extreme event in a future year could greatly exceed the average annual risk over the design life of the project. An equivalent definition of the return period under stationary conditions is the expected waiting time before failure. This paper examines how this definition can be adapted to nonstationary conditions. Designers of flood control projects should be aware that alternative definitions of the return period imply different risk under nonstationary conditions. The statistics of extremes and extreme value distributions are useful to examine extreme event risk. This paper uses a Gumbel Type I distribution to model the probability of failure under nonstationary conditions. The probability of an extreme event under nonstationary conditions depends on the rate of change of the parameters of the underlying distribution.
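A minimal sketch of the expected-waiting-time definition under nonstationarity, with a Gumbel Type I distribution whose location parameter drifts over time; the trend, scale, and design level below are illustrative assumptions:

```python
# Minimal sketch: expected waiting time to first exceedance when the Gumbel
# location parameter drifts upward; trend, scale, and design level assumed.
import numpy as np
from scipy import stats

years = np.arange(1, 1001)
loc0, trend, scale = 100.0, 0.5, 20.0
design = loc0 - scale * np.log(-np.log(1 - 1 / 100))   # stationary 100-yr level

p = 1 - stats.gumbel_r.cdf(design, loc=loc0 + trend * years, scale=scale)
surv = np.concatenate(([1.0], np.cumprod(1 - p)[:-1])) # P(no failure before year t)
expected_wait = np.sum(years * p * surv)
print(f"expected waiting time: {expected_wait:.1f} years (stationary: 100)")
```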

9.
Flooding is increasing worldwide, and with climate change, people need help understanding these changing conditions and the possibility that their own flood risk may change. This study extends the planned risk information seeking model (PRISM) into the flood risk domain and examines the antecedents that explain flood risk information seeking behavior. Using a survey reflective of the population in the state of Texas (N = 1079), this study includes an operationalization of risk perception specific to the complexity of floods and explores two key moderators in the PRISM model. Findings suggest that using PRISM to elaborate flood risk information seeking behaviors explains 48% of the variance in information seeking intent and 37% of the variance in affective risk perception. Using multigroup modeling, the findings also reveal that simply living in an area at high risk for floods does not significantly impact any relationships in the model. However, having experience with flooding increases the strength of risk perception paths, in particular perceived probability of flood risk, and better explains flood risk information seeking. Suggestions for how to use communication to influence risk perceptions and information seeking, as well as future directions for research, are also discussed.

10.
In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decisionmaker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location‐scale families (including the log‐normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications.
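A minimal sketch of the effect for a log-normal risk factor: the threshold is set at the plug-in 99% quantile estimated from n observations, and the realized exceedance frequency is measured by simulation; by the location-scale argument the result does not depend on the true parameters, which are chosen arbitrarily here:

```python
# Minimal sketch: threshold set at the plug-in 99% quantile of a log-normal
# risk factor estimated from n observations; realized exceedance frequency is
# parameter-free by the location-scale argument (true parameters arbitrary).
import numpy as np

rng = np.random.default_rng(6)
n, trials = 30, 200_000
z99 = 2.3263478740408408                        # standard normal 99% quantile

fails = 0
for _ in range(trials):
    logs = rng.normal(5.0, 2.0, n)              # logs of observed risk factors
    threshold = logs.mean() + z99 * logs.std(ddof=1)   # plug-in quantile estimate
    fails += rng.normal(5.0, 2.0) > threshold   # next-period exceedance?
print(f"realized failure prob: {fails / trials:.4f} vs nominal 0.0100")
```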

11.
Yifan Zhang, Risk Analysis, 2013, 33(1): 109-120
Expert judgment (or expert elicitation) is a formal process for eliciting judgments from subject‐matter experts about the value of a decision‐relevant quantity. Judgments in the form of subjective probability distributions are obtained from several experts, raising the question how best to combine information from multiple experts. A number of algorithmic approaches have been proposed, of which the most commonly employed is the equal‐weight combination (the average of the experts’ distributions). We evaluate the properties of five combination methods (equal‐weight, best‐expert, performance, frequentist, and copula) using simulated expert‐judgment data for which we know the process generating the experts’ distributions. We examine cases in which two well‐calibrated experts are of equal or unequal quality and their judgments are independent, positively or negatively dependent. In this setting, the copula, frequentist, and best‐expert approaches perform better and the equal‐weight combination method performs worse than the alternative approaches.
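A minimal sketch of the equal-weight combination, which averages the experts' distributions; the two normal expert judgments below are illustrative assumptions:

```python
# Minimal sketch of the equal-weight combination: the pooled distribution is an
# equal-weight mixture of the experts' distributions (two normals assumed).
import numpy as np
from scipy import stats

x = np.linspace(-5, 15, 2000)
expert1 = stats.norm(loc=2.0, scale=1.0)        # hypothetical expert judgments
expert2 = stats.norm(loc=6.0, scale=2.0)

pooled_cdf = 0.5 * expert1.cdf(x) + 0.5 * expert2.cdf(x)
median = x[np.searchsorted(pooled_cdf, 0.5)]    # pooled median by CDF inversion
print(f"pooled median ~ {median:.2f}")
```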

12.
This paper tests whether (and to what extent) default risk dependencies changed during the subprime crisis in 2007 and 2008. This is done by applying a goodness-of-fit test, based on the Rosenblatt transformation, to test various null hypotheses with respect to the copula function that describes the stochastic dependence between daily returns of six sector-specific subindices of the Dow Jones iTraxx Credit Default Swap index for Europe. Overall, the results suggest that in the bivariate case, the t-copula is a better approximation to the true copula of returns of DJ iTraxx subindices than the normal copula or the generalized Clayton copula. On average, the number of degrees of freedom of the bivariate t-copula tends to decrease during the crisis. As expected, the correlation between the returns of the subindices increases significantly during the crisis. However, the multivariate analysis reveals that it is only before the crisis that the null hypothesis of a six-dimensional t-copula is not rejected. During the crisis, the multivariate stochastic dependence between the sector-specific DJ iTraxx subindices seems to change in such a complex way that it is no longer sufficiently described by a multivariate t-copula.
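A minimal sketch of the Rosenblatt-transformation idea for a bivariate Gaussian copula (the paper's null hypotheses also cover t- and generalized Clayton copulas): under the null, the transformed coordinates are i.i.d. uniform, which a Kolmogorov-Smirnov test can check; the data and the parameter rho are illustrative assumptions:

```python
# Minimal sketch of the Rosenblatt-transformation GOF idea for a bivariate
# Gaussian copula: under H0 the transformed coordinates are i.i.d. uniform.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
rho = 0.5
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=2000)
u = stats.norm.cdf(z)                           # sample from a Gaussian copula

rho0 = 0.5                                      # hypothesized copula parameter
z1, z2 = stats.norm.ppf(u[:, 0]), stats.norm.ppf(u[:, 1])
v2 = stats.norm.cdf((z2 - rho0 * z1) / np.sqrt(1 - rho0**2))  # Rosenblatt step

print(stats.kstest(u[:, 0], "uniform"))         # first coordinate: uniform
print(stats.kstest(v2, "uniform"))              # should not reject under H0
```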

13.
The development of catastrophe models in recent years allows for assessment of the flood hazard much more effectively than when the federally run National Flood Insurance Program (NFIP) was created in 1968. We propose and then demonstrate a methodological approach to determine pure premiums based on the entire distribution of possible flood events. We apply hazard, exposure, and vulnerability analyses to a sample of 300,000 single-family residences in two counties in Texas (Travis and Galveston) using state-of-the-art flood catastrophe models. Even in zones of similar flood risk classification by FEMA there is substantial variation in exposure between coastal and inland flood risk. For instance, homes in the designated moderate-risk X500/B zones in Galveston are exposed to a flood risk on average 2.5 times greater than residences in X500/B zones in Travis. The results also show very similar average annual loss (corrected for exposure) for a number of residences despite their being in different FEMA flood zones. We also find significant storm-surge exposure outside of the FEMA designated storm-surge risk zones. Taken together, these findings highlight the importance of a microanalysis of flood exposure. The process of aggregating risk at a flood zone level, as currently undertaken by FEMA, provides a false sense of uniformity. As our analysis indicates, the technology to delineate the flood risks exists today.

14.
Modeling the dependence between uncertainties in decision and risk analyses is an important part of the problem structuring process. We focus on situations where correlated uncertainties are discrete, and extend the concept of the copula‐based approach for modeling correlated continuous uncertainties to the representation of correlated discrete uncertainties. This approach reduces the required number of probability assessments significantly compared to approaches requiring direct estimates of conditional probabilities. It also allows the use of multiple dependence measures, including product moment correlation, rank order correlation and tail dependence, and parametric families of copulas such as normal copulas, t‐copulas, and Archimedean copulas. This approach can be extended to model the dependence between discrete and continuous uncertainties in the same event tree.
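A minimal sketch of the copula-based construction for two correlated discrete uncertainties: draw correlated normals, then invert each discrete marginal at the implied uniforms; the three-point marginals and the correlation are illustrative assumptions:

```python
# Minimal sketch: two correlated discrete uncertainties via a normal copula;
# draw correlated normals, then invert each discrete marginal CDF.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
rho = 0.7
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=50_000)
u = stats.norm.cdf(z)

outcomes = np.array([10, 20, 30])               # hypothetical outcome levels
cdf1 = np.cumsum([0.2, 0.5, 0.3])               # marginal of uncertainty 1
cdf2 = np.cumsum([0.4, 0.4, 0.2])               # marginal of uncertainty 2

x1 = outcomes[np.searchsorted(cdf1, u[:, 0])]   # inverse CDF at the uniforms
x2 = outcomes[np.searchsorted(cdf2, u[:, 1])]
print(np.corrcoef(x1, x2)[0, 1])                # induced discrete correlation
```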

15.
Evolving geopolitical relationships between countries (especially between China and the United States) in recent years have highlighted dynamically changing trade patterns across the globe, all of which elevate risk and uncertainty for transport service providers. In order to mitigate risks, shipowners and operators must be able to estimate risks appropriately; one potentially promising method of doing so is through the value-at-risk (VaR) method. VaR describes the worst loss a portfolio is likely to sustain, which will not be exceeded over a target time horizon at a given level of confidence. This article proposes a copula-based GARCH model to estimate the joint multivariate distribution, which is a key component in VaR estimation. We show that the copula model can capture the VaR more successfully, as compared with the traditional method of calculation. As an empirical study, the expected portfolio VaR is examined when a shipowner chooses among Panamax soybean trading routes under a condition of reduced trade volumes between the United States and China due to the ongoing trade turmoil. This study serves as one of the very few papers in the literature on shipping portfolio VaR analysis. The results have significant implications for shipowners regarding fleet repositioning, decision making, and risk management.
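A minimal sketch of the simulation step behind such a VaR estimate, drawing route returns from a t-copula with t marginals; the GARCH filtering is omitted here, and all parameters are illustrative assumptions rather than values from the article:

```python
# Minimal sketch of the simulation behind a copula-based portfolio VaR: route
# returns drawn from a t-copula with t marginals (GARCH filtering omitted).
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
n_sim, nu, rho, alpha = 100_000, 4, 0.6, 0.99

z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n_sim)
w = rng.chisquare(nu, size=n_sim) / nu
u = stats.t.cdf(z / np.sqrt(w)[:, None], df=nu)  # t-copula sample

r = 0.02 * stats.t.ppf(u, df=5)                  # hypothetical daily route returns
portfolio = 0.5 * r[:, 0] + 0.5 * r[:, 1]        # equal-weight two-route portfolio
var99 = -np.quantile(portfolio, 1 - alpha)
print(f"1-day 99% VaR: {var99:.4f}")
```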

16.
An ARFIMA-FIAPARCH-skst model is used to build return-volatility models for the CSI 300 index and the Hong Kong Hang Seng index; the estimated parameters are then used to refine the models, removing the influence of the stylized facts of financial markets on the dependence analysis. A mixture copula composed of Clayton, Frank, and Gumbel copulas is then used to model the dependence structure. The results show that no significant leverage effect is observed in either the mainland or the Hong Kong market; the Clayton-Frank-Gumbel mixture copula accurately describes the dependence structure between the two markets, with lower-tail dependence stronger than upper-tail dependence, an asymmetry also confirmed by a dynamic mixture copula.
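A minimal sketch of the Clayton-Frank-Gumbel mixture copula as a convex combination of the three Archimedean CDFs; the weights and copula parameters are illustrative assumptions, not estimates from the paper:

```python
# Minimal sketch of the Clayton-Frank-Gumbel mixture copula: a convex
# combination of the three Archimedean CDFs (weights and parameters assumed).
import numpy as np

def clayton(u, v, theta):
    return (u**-theta + v**-theta - 1) ** (-1 / theta)

def frank(u, v, theta):
    num = (np.exp(-theta * u) - 1) * (np.exp(-theta * v) - 1)
    return -np.log(1 + num / (np.exp(-theta) - 1)) / theta

def gumbel(u, v, theta):
    t = (-np.log(u)) ** theta + (-np.log(v)) ** theta
    return np.exp(-t ** (1 / theta))

def mixture(u, v, w=(0.4, 0.3, 0.3), thetas=(2.0, 5.0, 1.5)):
    return (w[0] * clayton(u, v, thetas[0])
            + w[1] * frank(u, v, thetas[1])
            + w[2] * gumbel(u, v, thetas[2]))

print(mixture(0.1, 0.1), mixture(0.9, 0.9))   # lower- vs. upper-tail mass
```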

17.
The aging domestic oil production infrastructure represents a high risk to the environment because of the type of fluids being handled (oil and brine) and the potential for accidental release of these fluids into sensitive ecosystems. Currently, there is no quantitative risk model directly applicable to onshore oil exploration and production (E&P) facilities. We report on a probabilistic reliability model created for such facilities. Reliability theory, failure modes and effects analysis (FMEA), and event trees were used to develop the model estimates of the failure probability of typical oil production equipment. Monte Carlo simulation was used to translate uncertainty in input parameter values to uncertainty in the model output. The predicted failure rates were calibrated to available failure rate information by adjusting probability density function parameters used as random variates in the Monte Carlo simulations. The mean and standard deviation of normal variate distributions from which the Weibull distribution characteristic life was chosen were used as adjustable parameters in the model calibration. The model was applied to oil production leases in the Tallgrass Prairie Preserve, Oklahoma. We present the estimated failure probability due to the combination of the most significant failure modes associated with each type of equipment (pumps, tanks, and pipes). The results show that the estimated probability of failure for tanks is about the same as that for pipes, but that pumps have a much lower failure probability. The model can provide necessary equipment reliability information for proactive risk management at the lease level, supplying quantitative grounds for allocating maintenance resources to the high-risk equipment that will minimize both lost production and ecosystem damage.
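A minimal sketch of the Monte Carlo step with the calibration device described above: Weibull lifetimes whose characteristic life is itself drawn from a normal distribution; the shape, the normal parameters, and the 20-year horizon are illustrative assumptions:

```python
# Minimal sketch of the Monte Carlo step: Weibull equipment lifetimes whose
# characteristic life is itself drawn from a normal distribution; all numbers
# below are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(10)
n_sim, shape, horizon = 100_000, 1.5, 20.0               # horizon in years

char_life = rng.normal(40.0, 8.0, n_sim).clip(min=1.0)   # uncertain eta (years)
lifetimes = char_life * rng.weibull(shape, n_sim)        # Weibull(shape, eta) draws
print(f"P(failure within {horizon:.0f} yr) = {(lifetimes < horizon).mean():.3f}")
```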

18.
Flood loss modeling is an important component for risk analyses and decision support in flood risk management. Commonly, flood loss models describe complex damaging processes by simple, deterministic approaches like depth-damage functions and are associated with large uncertainty. To improve flood loss estimation and to provide quantitative information about the uncertainty associated with loss modeling, a probabilistic, multivariable Bagging decision Tree Flood Loss Estimation MOdel (BT-FLEMO) for residential buildings was developed. The application of BT-FLEMO provides a probability distribution of estimated losses to residential buildings per municipality. BT-FLEMO was applied and validated at the mesoscale in 19 municipalities that were affected during the 2002 flood by the River Mulde in Saxony, Germany. Validation was undertaken on the one hand via a comparison with six deterministic loss models, including both depth-damage functions and multivariable models. On the other hand, the results were compared with official loss data. BT-FLEMO outperforms deterministic, univariable, and multivariable models with regard to model accuracy, although the prediction uncertainty remains high. An important advantage of BT-FLEMO is the quantification of prediction uncertainty. The probability distribution of loss estimates by BT-FLEMO well represents the variation range of loss estimates of the other models in the case study.
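A minimal sketch of the bagging-decision-tree idea using scikit-learn, in which the per-tree predictions form a distribution of loss estimates rather than a single value; the synthetic flood characteristics and losses are invented for illustration:

```python
# Minimal sketch of the bagging-decision-tree idea with scikit-learn: per-tree
# predictions form a distribution of loss estimates (synthetic data invented).
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(11)
X = rng.uniform(0.0, 1.0, size=(500, 3))        # e.g., depth, duration, contamination
y = 50 * X[:, 0] + 10 * X[:, 1] + rng.normal(0, 5, 500)   # hypothetical loss (kEUR)

model = BaggingRegressor(DecisionTreeRegressor(), n_estimators=200).fit(X, y)
per_tree = np.array([t.predict(X[:1])[0] for t in model.estimators_])
lo, hi = np.quantile(per_tree, [0.05, 0.95])
print(f"loss: median {np.median(per_tree):.1f} kEUR, 90% band [{lo:.1f}, {hi:.1f}]")
```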

19.
A time series analysis method based on wavelet-domain hidden Markov models is proposed. The discrete wavelet transform is first introduced, and statistical models are built for the wavelet coefficients, covering the Gaussian mixture model for an individual coefficient, the hidden Markov tree structure linking coefficients across scales, model training, and likelihood computation. A unified mathematical model for time series interpolation, smoothing, and prediction is then proposed; using maximum a posteriori estimation and Bayes' rule, with the wavelet-domain hidden Markov model serving as prior knowledge, a new method of time series analysis is obtained. The Euler-Lagrange equation of the reconstruction problem and the derivative of the log-likelihood are then derived in detail, reducing interpolation, smoothing, and prediction to the solution of a simple linear system. Finally, the expectation-maximization (EM) algorithm and the conjugate gradient algorithm are iterated alternately to estimate the wavelet-domain hidden Markov model parameters and reconstruct the time series. Experimental results demonstrate the effectiveness of the method for time series analysis in economics.
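A minimal sketch of the first two ingredients, a discrete wavelet transform of the series followed by a two-component Gaussian mixture per scale, assuming the PyWavelets and scikit-learn packages; the hidden Markov tree coupling across scales and the EM/conjugate-gradient reconstruction are omitted:

```python
# Minimal sketch of the first two ingredients: a discrete wavelet transform of
# the series, then a two-component Gaussian mixture per scale (PyWavelets and
# scikit-learn assumed; the hidden Markov tree across scales is omitted).
import numpy as np
import pywt
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(12)
series = np.cumsum(rng.standard_normal(512))   # synthetic economic time series

coeffs = pywt.wavedec(series, "db4", level=4)  # [approx, detail_4, ..., detail_1]
for lvl, d in enumerate(coeffs[1:], start=1):
    gm = GaussianMixture(n_components=2).fit(d.reshape(-1, 1))
    print(f"scale {lvl}: mixture variances {gm.covariances_.ravel().round(3)}")
```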

20.
As flood risks grow worldwide, a well-designed insurance program engaging various stakeholders becomes a vital instrument in flood risk management. The main challenge concerns the applicability of standard approaches for calculating insurance premiums of rare catastrophic losses. This article focuses on the design of a flood-loss-sharing program involving private insurance based on location-specific exposures. The analysis is guided by a developed integrated catastrophe risk management (ICRM) model consisting of a GIS-based flood model and a stochastic optimization procedure with respect to location-specific risk exposures. To achieve the stability and robustness of the program towards floods with various recurrences, the ICRM uses a stochastic optimization procedure that relies on quantile-related risk functions of a systemic insolvency involving overpayments and underpayments of the stakeholders. Two alternative ways of calculating insurance premiums are compared: the robust premiums derived with the ICRM and the traditional average annual loss approach. The applicability of the proposed model is illustrated in a case study of a Rotterdam area outside the main flood protection system in the Netherlands. Our numerical experiments demonstrate essential advantages of the robust premiums, namely, that they: (1) guarantee the program's solvency under all relevant flood scenarios rather than one average event; (2) establish a tradeoff between the security of the program and the welfare of locations; and (3) decrease the need for other risk transfer and risk reduction measures.
