Similar Articles
20 similar articles found (search time: 0 ms)
1.
Abstract

Although stochastic volatility and GARCH (generalized autoregressive conditional heteroscedasticity) models have successfully described the volatility dynamics of univariate asset returns, extending them to multivariate models with dynamic correlations has been difficult for several reasons. First, there are too many parameters to estimate if the available data are only daily returns, which results in unstable estimates. One solution to this problem is to incorporate additional observations based on intraday asset returns, such as realized covariances. Second, since multivariate asset returns are not traded synchronously, the realized covariance matrices must be computed over the largest time intervals in which all asset returns are observed; this approach fails to make full use of the available intraday information when some assets are traded less frequently. Third, it is not straightforward to guarantee that the estimated (and the realized) covariance matrices are positive definite.

Our contributions are the following: (1) we obtain stable parameter estimates for the dynamic correlation models using the realized measures, (2) we make full use of intraday information by using pairwise realized correlations, (3) the covariance matrices are guaranteed to be positive definite, (4) we avoid any arbitrariness in the ordering of asset returns, (5) we propose a flexible correlation structure model (e.g., setting some correlations to zero where necessary), and (6) we propose a parsimonious specification for the leverage effect. The proposed models are applied to the daily returns of nine U.S. stocks, together with their realized volatilities and pairwise realized correlations, and are shown to outperform existing models in terms of portfolio performance.
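
The second and third difficulties above can be illustrated with a minimal sketch: a realized covariance matrix built from synchronized intraday returns is positive semidefinite by construction, and pairwise realized correlations can be read off it. The data are simulated and the function names are ours, not the paper's:

```python
import numpy as np

def realized_covariance(intraday_returns):
    """Realized covariance from a (T x k) array of synchronized intraday
    returns: the sum over t of the outer products r_t r_t'."""
    r = np.asarray(intraday_returns, dtype=float)
    return r.T @ r  # equivalent to summing the outer products

rng = np.random.default_rng(0)
r = rng.standard_normal((78, 3)) * 0.001  # 78 five-minute returns, 3 assets
rc = realized_covariance(r)

# Pairwise realized correlations come from the realized covariance of
# each asset pair, standardized by the realized volatilities.
d = np.sqrt(np.diag(rc))
corr = rc / np.outer(d, d)
```

Because `rc` is a sum of outer products, its eigenvalues are nonnegative, which is exactly the positive-definiteness property that is hard to preserve in a freely parameterized dynamic model.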

2.
This article presents a new way of modeling time-varying volatility. We generalize the usual stochastic volatility models to encompass regime-switching properties. The unobserved state variables are governed by a first-order Markov process. Bayesian estimators are constructed by Gibbs sampling. High-, medium-, and low-volatility states are identified for the Standard & Poor's 500 weekly return data. Persistence in volatility is explained by persistence in the low- and medium-volatility states. The high-volatility regime captures the 1987 crash and overlaps considerably with four U.S. economic recession periods.
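
As a rough illustration of this model class (the data-generating process only, not the authors' Gibbs sampler), a volatility regime driven by a first-order Markov chain can be simulated as follows; the three regime volatilities and the transition matrix are made-up values:

```python
import numpy as np

def simulate_ms_sv(T, P, sigmas, seed=0):
    """Simulate returns whose volatility is selected by a hidden state
    following a first-order Markov chain with transition matrix P."""
    rng = np.random.default_rng(seed)
    P = np.asarray(P, dtype=float)
    sigmas = np.asarray(sigmas, dtype=float)
    states = np.empty(T, dtype=int)
    states[0] = 0
    for t in range(1, T):
        states[t] = rng.choice(len(sigmas), p=P[states[t - 1]])
    returns = sigmas[states] * rng.standard_normal(T)
    return returns, states

# Low-, medium-, and high-volatility regimes with persistent transitions.
P = [[0.95, 0.04, 0.01],
     [0.03, 0.95, 0.02],
     [0.02, 0.08, 0.90]]
returns, states = simulate_ms_sv(1000, P, sigmas=[0.5, 1.0, 3.0])
```

The large diagonal entries of `P` generate the persistence within states that, in the article, accounts for the observed persistence in volatility.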

3.
We introduce the realized exponential GARCH model that can use multiple realized volatility measures for the modeling of a return series. The model specifies the dynamic properties of both returns and realized measures, and is characterized by a flexible modeling of the dependence between returns and volatility. We apply the model to 27 stocks and an exchange traded fund that tracks the S&P 500 index and find specifications with multiple realized measures that dominate those that rely on a single realized measure. The empirical analysis suggests some convenient simplifications and highlights the advantages of the new specification.

4.
In this paper Bayesian methods are applied to a stochastic volatility model using both the prices of the asset and the prices of options written on the asset. Posterior densities for all model parameters, latent volatilities and the market price of volatility risk are produced via a Markov Chain Monte Carlo (MCMC) sampling algorithm. Candidate draws for the unobserved volatilities are obtained in blocks by applying the Kalman filter and simulation smoother to a linearization of a nonlinear state space representation of the model. Crucially, information from both the spot and option prices affects the draws via the specification of a bivariate measurement equation, with implied Black–Scholes volatilities used to proxy observed option prices in the candidate model. Alternative models nested within the Heston (1993) framework are ranked via posterior odds ratios, as well as via fit, predictive and hedging performance. The method is illustrated using Australian News Corporation spot and option price data.

6.
We investigate simulation methodology for Bayesian inference in Lévy-driven stochastic volatility (SV) models. Typically, Bayesian inference for such models is performed using Markov chain Monte Carlo (MCMC); this is often a challenging task. Sequential Monte Carlo (SMC) samplers are methods that can improve over MCMC; however, there are many user-set parameters to specify. We develop a fully automated SMC algorithm, which substantially improves over the standard MCMC methods in the literature. To illustrate our methodology, we consider a Heston model with an independent, additive variance gamma process in the returns equation. The driving gamma process can capture the stylized behaviour of many financial time series, and a discretized version, fit in a Bayesian manner, has been found to be very useful for modelling equity data. We demonstrate that it is possible to draw exact inference, in the sense of no time-discretization error, from the Bayesian SV model.
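
For intuition, a variance gamma path can be generated by the textbook construction as Brownian motion with drift run on a gamma clock; this is a generic simulation, not the paper's automated SMC sampler, and the parameter values are illustrative:

```python
import numpy as np

def simulate_vg(T=1.0, n=1000, theta=-0.1, sigma=0.2, nu=0.2, seed=0):
    """Variance gamma path: X_t = theta*G_t + sigma*W(G_t), where G is a
    gamma subordinator with unit mean rate and variance rate nu."""
    rng = np.random.default_rng(seed)
    dt = T / n
    # Gamma time increments with mean dt and variance nu * dt.
    g = rng.gamma(shape=dt / nu, scale=nu, size=n)
    dx = theta * g + sigma * np.sqrt(g) * rng.standard_normal(n)
    return np.concatenate([[0.0], np.cumsum(dx)])

path = simulate_vg()
```

The gamma time change is what produces the heavy tails and skewness relative to a plain Brownian motion with the same drift and scale.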

7.
This paper proposes and analyses two types of asymmetric multivariate stochastic volatility (SV) models, namely, (i) the SV with leverage (SV-L) model, which is based on the negative correlation between the innovations in the returns and volatility, and (ii) the SV with leverage and size effect (SV-LSE) model, which is based on the sign and magnitude of the returns. The paper derives the state space form for the logarithm of the squared returns, which follow the multivariate SV-L model, and develops estimation methods for the multivariate SV-L and SV-LSE models based on the Monte Carlo likelihood (MCL) approach. The empirical results show that, with respect to AIC and BIC, the multivariate SV-LSE model fits the bivariate and trivariate returns of the S&P 500, Nikkei 225, and Hang Seng indexes more accurately than does the multivariate SV-L model. Moreover, the empirical results suggest that the univariate models should be rejected in favor of their bivariate and trivariate counterparts.

9.
We propose a Bayesian stochastic search approach to selecting restrictions on multivariate regression models where the errors exhibit deterministic or stochastic conditional volatilities. We develop a Markov chain Monte Carlo (MCMC) algorithm that generates posterior restrictions on the regression coefficients and Cholesky decompositions of the covariance matrix of the errors. Numerical simulations with artificially generated data show that the proposed method is effective in selecting the data-generating model restrictions and improving the forecasting performance of the model. Applying the method to daily foreign exchange rate data, we conduct stochastic search on a VAR model with stochastic conditional volatilities.

10.
We develop a discrete-time affine stochastic volatility model with time-varying conditional skewness (SVS). Importantly, we disentangle the dynamics of conditional volatility and conditional skewness in a coherent way. Our approach allows current asset returns to be asymmetric conditional on current factors and past information, which we term contemporaneous asymmetry. Conditional skewness is an explicit combination of the conditional leverage effect and contemporaneous asymmetry. We derive analytical formulas for various return moments that are used for generalized method of moments (GMM) estimation. Applying our approach to S&P 500 index daily returns and option data, we show that one- and two-factor SVS models provide a better fit for both the historical and the risk-neutral distribution of returns, compared to existing affine generalized autoregressive conditional heteroscedasticity (GARCH) and stochastic volatility with jumps (SVJ) models. Our results are not due to an overparameterization of the model: the one-factor SVS models have the same number of parameters as their one-factor GARCH competitors and fewer than the SVJ benchmark.

11.
This paper deals with the pricing of derivatives written on several underlying assets or factors satisfying a multivariate model with Wishart stochastic volatility matrix. This multivariate stochastic volatility model leads to a closed-form solution for the conditional Laplace transform, and quasi-explicit solutions for derivative prices written on more than one asset or underlying factor. Two examples are presented: (i) a multiasset extension of the stochastic volatility model introduced by Heston (1993), and (ii) a model for credit risk analysis that extends the model of Merton (1974) to a framework with stochastic firm liability, stochastic volatility, and several firms. A bivariate version of the stochastic volatility model is estimated using stock prices and moment conditions derived from the joint unconditional Laplace transform of the stock returns.

12.
This article discusses some topics relevant to financial modeling. The kurtosis of a distribution plays an important role in controlling tail behavior and is used in Edgeworth expansions of call prices. We present derivations of the kurtosis for a number of popular volatility models useful in financial applications, including the class of random coefficient GARCH models. Option pricing formulas for various classes of volatility models are also derived, and a simple proof of the option pricing formula under the Black–Scholes model is given.
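
Two of the quantities mentioned can be written down directly: the Black–Scholes call price and the well-known unconditional kurtosis of a Gaussian GARCH(1,1). Both are textbook formulas; the article's derivations for the other volatility model classes are not reproduced here:

```python
from math import erf, exp, log, sqrt

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, r, sigma, T):
    """Black–Scholes price of a European call option."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

def garch11_kurtosis(alpha, beta):
    """Unconditional kurtosis of a Gaussian GARCH(1,1); the fourth moment
    exists only when 1 - (alpha + beta)**2 - 2*alpha**2 > 0."""
    s2 = (alpha + beta) ** 2
    den = 1.0 - s2 - 2.0 * alpha**2
    if den <= 0:
        raise ValueError("fourth moment does not exist")
    return 3.0 * (1.0 - s2) / den

price = bs_call(S=100, K=100, r=0.05, sigma=0.2, T=1.0)  # about 10.45
```

Note that `garch11_kurtosis` exceeds 3 whenever `alpha > 0`, which is the sense in which conditional heteroscedasticity fattens the tails of the return distribution.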

13.
New techniques are developed for the analysis of stochastic volatility models in which the logarithm of conditional variance follows an autoregressive model. A cyclic Metropolis algorithm is used to construct a Markov-chain simulation tool. Simulations from this Markov chain converge in distribution to draws from the posterior distribution, enabling exact finite-sample inference. The exact solution to the filtering/smoothing problem of inference about the unobserved variance states is a by-product of our Markov-chain method. In addition, multistep-ahead predictive densities can be constructed that reflect both inherent model variability and parameter uncertainty. We illustrate our method by analyzing both daily and weekly data on stock returns and exchange rates. Sampling experiments are conducted to compare the performance of Bayes estimators to the method of moments and quasi-maximum likelihood estimators proposed in the literature. In both parameter estimation and filtering, the Bayes estimators outperform these other approaches.
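
The model being sampled here is the standard log-AR(1) stochastic volatility specification. A short forward simulation of it (with illustrative parameter values, not the paper's posterior draws) looks like:

```python
import numpy as np

def simulate_sv(T, mu=-1.0, phi=0.95, tau=0.2, seed=0):
    """Simulate y_t = exp(h_t / 2) * eps_t, where the log-variance follows
    h_t = mu + phi*(h_{t-1} - mu) + tau*eta_t, with iid N(0, 1) shocks."""
    rng = np.random.default_rng(seed)
    h = np.empty(T)
    # Draw h_0 from the stationary distribution of the AR(1) process.
    h[0] = mu + tau / np.sqrt(1.0 - phi**2) * rng.standard_normal()
    for t in range(1, T):
        h[t] = mu + phi * (h[t - 1] - mu) + tau * rng.standard_normal()
    y = np.exp(h / 2.0) * rng.standard_normal(T)
    return y, h

y, h = simulate_sv(500)
```

The states `h` are latent in practice; recovering them from `y` is the filtering/smoothing problem the cyclic Metropolis algorithm solves as a by-product.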

14.
We develop a Bayesian approach for parsimoniously estimating the correlation structure of the errors in a multivariate stochastic volatility model. Since the number of parameters in the joint correlation matrix of the return and volatility errors is potentially very large, we impose a prior that allows the off-diagonal elements of the inverse of the correlation matrix to be identically zero. The model is estimated using a Markov chain simulation method that samples from the posterior distribution of the volatilities and parameters. We illustrate the approach using both simulated and real examples. In the real examples, the method is applied to equities at three levels of aggregation: returns for firms within the same industry, returns for different industries, and returns aggregated at the index level. We find pronounced correlation effects only at the highest level of aggregation.

15.
Econometric Reviews, 2008, 27(1), 139–162
The quality of the asymptotic normal approximation to realized volatility can be poor if sampling does not occur at very high frequencies. In this article we consider an alternative approximation to the finite-sample distribution of realized volatility based on Edgeworth expansions. In particular, we show how confidence intervals for integrated volatility can be constructed using these Edgeworth expansions. Our Monte Carlo study shows that intervals based on the Edgeworth corrections have improved properties relative to the conventional intervals based on the normal approximation. Unlike the bootstrap, the Edgeworth approach is analytical and easily implemented, requiring no resampling of one's data. A comparison between the two shows that the bootstrap outperforms the Edgeworth-corrected intervals. Thus, if we are willing to incur the additional computational cost of computing bootstrap intervals, these are preferred over the Edgeworth intervals. Nevertheless, if we are not willing to incur this cost, our results suggest that Edgeworth-corrected intervals should replace the conventional intervals based on the first-order normal approximation.
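
The first-order interval that both corrections aim to improve can be sketched as follows; the feasible standard error uses the usual realized-quarticity estimate, and the simulated returns are purely illustrative:

```python
import numpy as np

def rv_normal_ci(intraday_returns, z=1.96):
    """Realized volatility with its conventional first-order normal
    confidence interval. The asymptotic variance of realized volatility
    is estimated feasibly by (2/3) * sum(r**4)."""
    r = np.asarray(intraday_returns, dtype=float)
    rv = float(np.sum(r**2))
    se = float(np.sqrt((2.0 / 3.0) * np.sum(r**4)))
    return rv, rv - z * se, rv + z * se

rng = np.random.default_rng(0)
r = rng.standard_normal(390) * 0.0005  # one-minute returns over a trading day
rv, lo, hi = rv_normal_ci(r)
```

It is the finite-sample inaccuracy of this normal interval at moderate sampling frequencies that motivates both the Edgeworth correction and the bootstrap.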

18.
In this paper we model the Gaussian errors in the standard Gaussian linear state space model as stochastic volatility processes. We show that conventional MCMC algorithms for this class of models are ineffective, but that the problem can be alleviated by reparameterizing the model. Instead of sampling the unobserved variance series directly, we sample in the space of the disturbances, which lowers correlation in the sampler and thus improves the quality of the Markov chain.

Using our reparameterized MCMC sampler, it is possible to estimate an unobserved factor model for exchange rates between a group of n countries. The underlying n + 1 country-specific currency strength factors and the n + 1 currency volatility factors can be extracted using the new methodology. With these factors, a more detailed picture of the events around the 1992 EMS crisis is obtained.

We assess the fit of competing models on the panels of exchange rates with an effective particle filter and find that the factor model is indeed strongly preferred by the data.

19.
Common loss functions used for the restoration of grey-scale images include the zero–one loss and the sum of squared errors. The corresponding estimators, the posterior mode and the posterior marginal mean, are optimal Bayes estimators with respect to their way of measuring the loss for different error configurations. However, both these loss functions have a fundamental weakness: the loss does not depend on the spatial structure of the errors. This is important because a systematic structure in the errors can lead to misinterpretation of the estimated image. We propose a new loss function that also penalizes strong local sample covariance in the error, and we discuss how the optimal Bayes estimator can be approximated using a two-step Markov chain Monte Carlo and simulated annealing algorithm. We present simulation results for some artificial data which show improvement with respect to small structures in the image.
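
One crude way to make a loss sensitive to spatially structured errors, sketched here as an illustration of the idea rather than the authors' exact loss function, is to add a penalty on the sample covariance of neighbouring errors:

```python
import numpy as np

def structured_loss(truth, estimate, lam=10.0):
    """Sum-of-squared-errors loss plus a penalty on the average product of
    horizontally and vertically adjacent errors, which is large when the
    errors form a systematic spatial structure."""
    e = np.asarray(estimate, dtype=float) - np.asarray(truth, dtype=float)
    sse = float(np.sum(e**2))
    local_cov = float(np.mean(e[:, :-1] * e[:, 1:]) +
                      np.mean(e[:-1, :] * e[1:, :]))
    return sse + lam * max(local_cov, 0.0)

truth = np.zeros((8, 8))
blocky = truth.copy()
blocky[:4, :4] = 1.0  # a structured error patch, penalized beyond its SSE
```

Under a loss of this shape, two estimates with identical squared error are ranked differently if one concentrates its errors into a coherent patch, which is the behaviour plain zero–one or squared-error loss cannot express.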

20.
The authors present theoretical results that show how one can simulate a mixture distribution whose components live in subspaces of different dimension by reformulating the problem in such a way that observations may be drawn from an auxiliary continuous distribution on the largest subspace and then transformed in an appropriate fashion. Motivated by the importance of enlarging the set of available Markov chain Monte Carlo (MCMC) techniques, the authors show how their results can be fruitfully employed in problems such as model selection (or averaging) of nested models, or regeneration of Markov chains for evaluating standard deviations of estimated expectations derived from MCMC simulations.
