Similar Articles
 Found 19 similar articles (search time: 56 ms)
1.
A Modification of the KMV Model and Its Application   Cited: 1 (self: 0, others: 1)
The traditional KMV model approximates volatility by the historical volatility of firm value. The empirical analysis in this paper shows that this approximation is not suitable for the current Chinese market environment. Instead, the paper forecasts the volatility of firm value with a GARCH(1,1) model, computes the distance to default from that forecast, and compares it with the distance to default obtained from historical volatility, concluding that the GARCH(1,1)-based results agree more closely with actual conditions.
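The approach can be sketched in Python. This is a minimal illustration, not the paper's implementation: the GARCH(1,1) parameters are assumed to be already estimated, the asset drift in the distance-to-default formula is set to zero for simplicity, and all function names are hypothetical.

```python
import math

def garch11_forecast_vol(returns, omega, alpha, beta):
    """One-step-ahead GARCH(1,1) volatility forecast:
    sigma2_{t+1} = omega + alpha * r_t**2 + beta * sigma2_t,
    with the recursion initialized at the sample second moment."""
    sigma2 = sum(r * r for r in returns) / len(returns)
    for r in returns:
        sigma2 = omega + alpha * r * r + beta * sigma2
    return math.sqrt(sigma2)

def distance_to_default(asset_value, default_point, asset_vol, horizon=1.0):
    """KMV distance to default with zero drift:
    DD = (ln(V / DP) - 0.5 * sigma**2 * T) / (sigma * sqrt(T))."""
    return ((math.log(asset_value / default_point)
             - 0.5 * asset_vol ** 2 * horizon)
            / (asset_vol * math.sqrt(horizon)))
```

Swapping the historical-volatility input for the GARCH forecast is then a one-line change in the distance-to-default call.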

2.
This paper presents the basic idea of the KMV model, which measures corporate credit risk using option-pricing theory, and modifies the model for the case where the growth rate of asset value is non-zero. A sensitivity analysis of the firm's default probability is then performed, and the strengths and weaknesses of the KMV model for measuring the credit risk of Chinese companies are summarized.
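The non-zero growth-rate correction amounts to keeping the drift term mu in the distance to default, with the expected default frequency taken as N(-DD). A minimal sketch (names and parameter values are hypothetical):

```python
import math
from statistics import NormalDist

def default_probability(V, DP, mu, sigma, T=1.0):
    """Expected default frequency with non-zero asset growth rate mu:
    DD = (ln(V / DP) + (mu - 0.5 * sigma**2) * T) / (sigma * sqrt(T)),
    EDF = N(-DD) under the normality assumption."""
    dd = ((math.log(V / DP) + (mu - 0.5 * sigma ** 2) * T)
          / (sigma * math.sqrt(T)))
    return NormalDist().cdf(-dd)
```

Sensitivity analysis is then a matter of sweeping mu, sigma, or the leverage V/DP and recording the change in EDF; in particular, a positive growth rate lowers the default probability relative to the zero-drift case.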

3.
The Beck model from statistical physics describes long memory and heavy tails in a variable well. Using the Beck model and the maximization of Tsallis entropy, this paper studies the Shanghai stock index. It first derives the distribution function that maximizes Tsallis entropy, then carries out an empirical analysis of Shanghai index data, estimating the parameters by maximum likelihood, and finally uses the fitted heavy-tailed distribution to compute the VaR of the Shanghai Composite Index.
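The maximum-Tsallis-entropy distribution under the usual constraints is the q-Gaussian, whose density for 1 < q < 3 is proportional to [1 + (q - 1) * beta * x^2]^(-1/(q - 1)). The sketch below normalizes this density numerically and inverts its left tail by bisection to obtain a VaR; all names, the integration grid, and the cutoff are illustrative choices, not the paper's.

```python
def q_gaussian_pdf(x, q, beta):
    """Unnormalized q-Gaussian density for 1 < q < 3."""
    return (1.0 + (q - 1.0) * beta * x * x) ** (-1.0 / (q - 1.0))

def _integrate(f, a, b, n=4000):
    """Composite trapezoid rule."""
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        s += f(a + i * h)
    return s * h

def q_gaussian_var(q, beta, alpha=0.05, cutoff=200.0):
    """VaR at level alpha: the loss v with P(X <= -v) = alpha,
    found by bisection on the numerically normalized left-tail CDF."""
    norm = _integrate(lambda x: q_gaussian_pdf(x, q, beta), -cutoff, cutoff)
    def left_tail(v):
        return _integrate(lambda x: q_gaussian_pdf(x, q, beta),
                          -cutoff, -v) / norm
    lo, hi = 0.0, cutoff
    for _ in range(40):
        mid = 0.5 * (lo + hi)
        if left_tail(mid) > alpha:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For q = 2 the q-Gaussian reduces to a Cauchy distribution, which gives a quick sanity check on the numerics.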

4.
An Empirical Study of Credit Risk Management at the Agricultural Bank of China Based on the KMV Model   Cited: 1 (self: 0, others: 1)
Taking full account of the Agricultural Bank of China's lending clients, this paper selects thirty companies listed in Shanghai and Shenzhen as a research sample and uses the KMV model to measure their expected default frequencies. The results show that the KMV model can, on the whole, assess the credit risk of listed companies reasonably well and is therefore suitable for credit risk management at the Agricultural Bank of China.

5.
To overcome the limitation that the traditional KMV model applies only to a single market, a KMV model under market segmentation is built that incorporates equity values across markets, the correlation of their fluctuations, and exchange rates. An empirical analysis of 24 A+H dual-listed companies shows that the model discriminates well among the default distances of different firms.
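The aggregation step can be sketched as follows, assuming H-share values are first converted into the A-share currency at the spot rate; FX volatility is ignored for brevity, and all names are hypothetical:

```python
import math

def combined_equity(value_a, value_h, fx_h_to_a):
    """Total equity value in the A-share currency: A-share market cap
    plus H-share market cap converted at the current exchange rate."""
    return value_a + value_h * fx_h_to_a

def combined_equity_vol(value_a, value_h_cny, sigma_a, sigma_h, rho):
    """Volatility of the combined equity value as a value-weighted sum
    of two correlated components."""
    total = value_a + value_h_cny
    w_a, w_h = value_a / total, value_h_cny / total
    var = ((w_a * sigma_a) ** 2 + (w_h * sigma_h) ** 2
           + 2 * w_a * w_h * rho * sigma_a * sigma_h)
    return math.sqrt(var)
```

With rho = 1 and equal component volatilities the combined volatility collapses to the common value, a useful degenerate check.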

6.
A KMV Model Based on the Tompkins Method   Cited: 1 (self: 0, others: 1)
The traditional KMV model approximates volatility by the historical volatility of firm value. Volatility in the Chinese stock market is unstable: information disclosures around major economic or political events, as well as seasonal or cyclical factors that affect asset-price volatility in financial markets, can strongly influence the underlying market. This paper therefore combines estimates based on historical data with knowledge of how volatility evolves, applying a range of quantitative and qualitative tools; that is, it replaces the traditional historical-volatility calculation with a forecast-volatility estimate. The empirical analysis shows that this approach makes the model's credit risk predictions more accurate and forward-looking.

7.
Borrowing the KMV credit risk measurement model and using time-series data on the stock prices of Chinese listed companies from 2000 to 2003, this paper empirically analyzes the default frequencies of Chinese listed companies.

8.
To identify the impact of the financial crisis on the credit risk of logistics firms and to improve corporate decision-making, this paper selects 13 Chinese listed logistics companies as a sample and uses the KMV model to compute their distances to default, and the trends therein, from the third quarter of 2008 to the first quarter of 2009. The results show that the financial crisis had a significant effect on the credit risk of listed logistics companies, and that their credit standing responded rapidly to the crisis.

9.
The credit risk of listed companies affects the healthy development of both firms and banks, and in developed countries credit risk prediction models have become increasingly rigorous and precise. Based on the principles of the KMV model and actual data on Chinese listed companies, this paper uses a minimum-misjudgment approach to determine the default point and distance to default for Chinese listed companies, and then tests both empirically.
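The minimum-misjudgment idea can be sketched as a scan over candidate cutoffs on the distance to default: count defaulted firms classified as safe plus healthy firms classified as risky, and keep the cutoff with the fewest errors. A toy version (function names hypothetical):

```python
def optimal_cutoff(dd_default, dd_healthy):
    """Minimum-misjudgment cutoff on distance to default: scan candidate
    thresholds and pick the one minimizing total misclassifications
    (defaulted firms above the cutoff plus healthy firms at or below it)."""
    candidates = sorted(dd_default + dd_healthy)
    best_c, best_err = None, float("inf")
    for c in candidates:
        err = (sum(1 for d in dd_default if d > c)
               + sum(1 for d in dd_healthy if d <= c))
        if err < best_err:
            best_c, best_err = c, err
    return best_c, best_err
```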

10.
An Improved KMV Model and the Measurement of Listed Companies' Credit Risk   Cited: 9 (self: 1, others: 8)
In light of the characteristics of China's capital market and its listed companies, the author adjusts the parameters of the KMV model, taking full account of the separate pricing of tradable and non-tradable shares and of how the default point of a listed company should be set. The parameter-adjusted KMV model therefore fits Chinese conditions better. Tests on the credit risk assessment of companies listed in Shanghai show that the adjusted model can identify trends in the credit quality of Chinese listed companies promptly and accurately.
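Two of the adjusted inputs can be sketched directly: the equity value under the split-share structure (tradable shares at market price, non-tradable shares priced at book value per share, a common convention in this literature) and the default point as short-term debt plus a fraction of long-term debt. The names and the k = 0.5 convention are illustrative assumptions:

```python
def equity_value_split_share(tradable_shares, price,
                             nontradable_shares, book_value_per_share):
    """Equity value under a split-share structure: tradable shares at
    market price, non-tradable shares at book value per share."""
    return tradable_shares * price + nontradable_shares * book_value_per_share

def default_point(short_term_debt, long_term_debt, k=0.5):
    """Default point DP = STD + k * LTD; KMV's conventional k is 0.5."""
    return short_term_debt + k * long_term_debt
```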

11.
Using a basic mortgage pricing model and actual data on the relevant variables during the US financial crisis, this paper performs an elasticity analysis of the sensitivity to three direct initial factors of the crisis, namely house prices, house-price volatility, and interest rates, and examines how these factors affect the value and credit risk of residential mortgages.
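The elasticity analysis can be sketched generically: for any pricing function V(x), the point elasticity (dV/V)/(dx/x) can be approximated with a central finite difference. A minimal helper (hypothetical names; the paper's actual pricing model is not reproduced here):

```python
def elasticity(pricing_fn, x, h=1e-5):
    """Point elasticity of a pricing function with respect to input x:
    e = (dV/V) / (dx/x), via a central finite difference."""
    v = pricing_fn(x)
    dv = (pricing_fn(x + h) - pricing_fn(x - h)) / (2 * h)
    return dv * x / v
```

For V(x) = x^2 the elasticity is exactly 2 at any x, which gives a convenient check.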

12.
Tsallis entropy is a generalized form of entropy that tends to Shannon entropy as q → 1. Using Tsallis entropy, an alternative estimation methodology (generalized maximum Tsallis entropy) is introduced and used to estimate the parameters of a linear regression model when the underlying data are ill-conditioned. We describe the generalized maximum Tsallis entropy estimator and, for q = 2, call it the GMET2 estimator. We apply the GMET2 estimator to the linear regression model Y = Xβ + e when the design matrix X is subject to severe multicollinearity, and compare the GMET2, generalized maximum entropy (GME), ordinary least-squares (OLS), and inequality-restricted least-squares (IRLS) estimators on a dataset on Portland cement.
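The limiting claim in the first sentence is easy to verify numerically: S_q = (1 - Σ p_i^q) / (q - 1) approaches the Shannon entropy -Σ p_i ln p_i as q → 1. A quick check (standard library only; function names are illustrative):

```python
import math

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum p_i**q) / (q - 1) for q != 1."""
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

def shannon_entropy(p):
    """Shannon entropy -sum p_i * ln p_i, the q -> 1 limit of S_q."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)
```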

13.
Approximate Bayesian computation (ABC) is a popular technique for analysing data from complex models with intractable likelihood functions. It uses simulation from the model to approximate the likelihood, and this approximate likelihood is then used to construct an approximate posterior. In this paper, we consider methods that estimate the parameters by maximizing the approximate likelihood used in ABC. We give a theoretical analysis of the asymptotic properties of the resulting estimator. In particular, we derive results analogous to consistency and asymptotic normality for standard maximum likelihood estimation. We also discuss how sequential Monte Carlo methods provide a natural way to implement our likelihood-based ABC procedures.
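A toy version of maximizing the ABC likelihood, for estimating the mean of a normal with known unit variance: the approximate likelihood at each candidate theta is estimated by the acceptance rate of simulated sample means falling within eps of the observed one, and the maximizer over a grid is returned. Everything here (names, uniform kernel, grid search) is an illustrative assumption, not the paper's procedure:

```python
import random
import statistics

def abc_likelihood(theta, obs_mean, n, eps, sims, rng):
    """Monte Carlo estimate of the ABC likelihood: the probability that
    the mean of n draws from N(theta, 1) lies within eps of obs_mean."""
    hits = 0
    for _ in range(sims):
        sim_mean = statistics.fmean(rng.gauss(theta, 1.0) for _ in range(n))
        if abs(sim_mean - obs_mean) < eps:
            hits += 1
    return hits / sims

def abc_mle(obs_mean, n, grid, eps=0.1, sims=400, seed=1):
    """Maximize the approximate likelihood over a grid of candidate thetas."""
    rng = random.Random(seed)
    return max(grid, key=lambda t: abc_likelihood(t, obs_mean, n, eps, sims, rng))
```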

14.
Properties of a specification test for the parametric form of the variance function in diffusion processes are discussed. The test is based on the estimation of certain integrals of the volatility function. If the volatility function does not depend on the state variable x, the corresponding statistics are known to have an asymptotic normal distribution. Most models in mathematical finance, however, use a volatility function that depends on the state x. In this paper we prove that in the general case, where σ also depends on x, the estimates of the integrated volatility converge stably in law to random variables with a non-standard limit distribution. The limit distribution depends on the diffusion process X_t itself, and we use this result to develop a bootstrap test for the parametric form of the volatility function that is consistent in the general diffusion model.
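The integrals of the volatility function at the heart of such tests are estimated from squared increments of the observed path; in the simplest case the realized quadratic variation Σ (X_{t_{i+1}} - X_{t_i})^2 approximates ∫ σ^2(X_t) dt as the sampling interval shrinks. A minimal sketch (the test statistic itself is not reproduced; the function name is hypothetical):

```python
def realized_quadratic_variation(path):
    """Sum of squared increments of a discretely observed path; for a
    diffusion dX = mu dt + sigma(X) dW this approximates the integrated
    volatility over the observation window as the step size shrinks."""
    return sum((b - a) ** 2 for a, b in zip(path, path[1:]))
```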

15.
A Study of Risk Spillover between Financial Markets Based on the Wishart Test   Cited: 1 (self: 0, others: 1)
Assuming that two financial markets are both efficient, this paper applies a Wishart test, based on the Wishart distribution, to correlation coefficients at different lags in order to determine when risk spillover occurs between the two markets and how strong it is. The empirical results show that risk spillover between the Shanghai and Shenzhen markets occurs within roughly 3 minutes, and that within this window the spillover intensity from Shanghai to Shenzhen decays more slowly than that from Shenzhen to Shanghai, reflecting Shanghai's greater influence. These findings accord with actual conditions in the markets.

16.

The global financial crisis of 2007–2009 revealed the great extent to which systemic risk can jeopardize the stability of the entire financial system. An effective methodology to quantify systemic risk is at the heart of identifying the so-called systemically important financial institutions for regulatory purposes, as well as of investigating the key drivers of systemic contagion. The article proposes a method for dynamic forecasting of CoVaR, a popular measure of systemic risk. As a first step, we develop a semi-parametric framework using asymptotic results in the spirit of extreme value theory (EVT) to model the conditional probability distribution of a bivariate random vector given that one of the components takes on a large value, taking into account important features of financial data such as asymmetry and heavy tails. In the second step, we embed the proposed EVT method into a dynamic framework via a bivariate GARCH process. An empirical analysis is conducted to demonstrate and compare the performance of the proposed methodology relative to a very flexible fully parametric alternative.
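A purely empirical (historical-simulation) version of CoVaR helps fix ideas before the EVT/GARCH machinery: take the system's empirical loss quantile over the days on which the institution's loss is at or beyond its own VaR. The sketch below is a naive stand-in for the article's semi-parametric estimator; all names are hypothetical:

```python
def var_quantile(losses, alpha):
    """Empirical VaR: the alpha-quantile of the loss distribution."""
    s = sorted(losses)
    idx = min(int(alpha * len(s)), len(s) - 1)
    return s[idx]

def covar(system_losses, inst_losses, alpha=0.95):
    """Empirical CoVaR: VaR of the system's loss conditional on the
    institution's loss reaching or exceeding its own VaR at level alpha."""
    var_inst = var_quantile(inst_losses, alpha)
    cond = [s for s, i in zip(system_losses, inst_losses) if i >= var_inst]
    return var_quantile(cond, alpha)
```

With perfectly dependent losses, CoVaR exceeds the unconditional VaR, the defining feature of systemic co-movement.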

17.
We consider the filtering model of Frey and Schmidt (2012), stated under the real probability measure, and develop a method for estimating the parameters in this framework using time-series data on CDS index spreads and classical maximum-likelihood algorithms. The estimation approach incorporates the Kushner-Stratonovich SDE for the dynamics of the filtering probabilities. A convenient formula for the survival probability is a prerequisite for our estimation algorithm. We apply the developed maximum-likelihood algorithms to market data on historical CDS index spreads (iTraxx Europe Main Series) in order to estimate the parameters of the nonlinear filtering model for an exchangeable credit portfolio. Several such estimations are performed, along with accompanying statistical and numerical computations.

18.
The efficiency of penalized methods (Fan and Li, 2001) depends strongly on a tuning parameter, since it controls the extent of penalization, so it is important to select it appropriately. In general, tuning parameters are chosen by data-driven approaches, such as the commonly used generalized cross-validation. In this article, we propose an alternative method for deriving the tuning parameter selector in the penalized least-squares framework, which can lead to an improved estimate. Simulation studies are presented to support the theoretical findings, and a comparison of the Type I and Type II error rates is performed for the L1, hard-thresholding, and smoothly clipped absolute deviation (SCAD) penalty functions. The results are given in tables, and a discussion follows.
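For intuition, generalized cross-validation can be written down in closed form for a one-predictor ridge problem, where the effective degrees of freedom are Sxx / (Sxx + lambda) and GCV(lambda) = n * RSS(lambda) / (n - df)^2. This is a toy stand-in for the article's selector, with hypothetical names:

```python
def ridge_gcv_1d(x, y, lambdas):
    """Select the ridge penalty by generalized cross-validation for a
    one-predictor model y = beta * x + e (no intercept):
    GCV(lam) = n * RSS(lam) / (n - df(lam))**2, df = Sxx / (Sxx + lam)."""
    n = len(x)
    sxx = sum(xi * xi for xi in x)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    def gcv(lam):
        beta = sxy / (sxx + lam)
        rss = sum((yi - beta * xi) ** 2 for xi, yi in zip(x, y))
        df = sxx / (sxx + lam)
        return n * rss / (n - df) ** 2
    return min(lambdas, key=gcv)
```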

19.
This article studies dynamic panel data models in which the long-run outcome for a particular cross-section is affected by a weighted average of the outcomes in the other cross-sections. We show that imposing such a structure implies a model with several cointegrating relationships that, unlike in the standard case, are nonlinear in the coefficients to be estimated. Assuming that the weights are exogenously given, we extend the dynamic ordinary least-squares methodology and provide a dynamic two-stage least-squares estimator. We derive the large-sample properties of our proposed estimator under a set of low-level assumptions. The methodology is then applied to US financial market data consisting of credit default swap spreads together with firm-specific and industry data. We construct the economic space using a "closeness" measure for firms based on input-output matrices. Our estimates show that this particular form of spatial correlation of credit default swap spreads is substantial and highly significant.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号