Similar Documents
20 similar documents found (search time: 15 ms)
1.
Recently there has been much work on developing models suitable for analysing the volatility of a continuous-time process. One general approach defines a volatility process as the convolution of a kernel with a non-decreasing Lévy process, which is non-negative whenever the kernel is non-negative. Within the framework of continuous-time autoregressive moving average (CARMA) processes, we derive a necessary and sufficient condition for the kernel to be non-negative. This condition is expressed in terms of the Laplace transform of the CARMA kernel, which has a simple form. We discuss some useful consequences of this result and delineate the parametric region of stationarity and kernel non-negativity for some lower-order CARMA models.
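For a causal CARMA process with distinct autoregressive roots, the kernel admits the expansion g(t) = Σ_k b(λ_k) e^{λ_k t} / a'(λ_k), so non-negativity can be probed numerically. A minimal sketch of that expansion (the roots and moving-average coefficients below are illustrative, not taken from the article):

```python
import numpy as np

def carma_kernel(ar_roots, b_coeffs, t):
    """Kernel g(t) = sum_k b(lam_k) e^{lam_k t} / a'(lam_k) for distinct AR roots.
    a(z) = prod_k (z - lam_k); b(z) = b0 + b1 z + ... with deg(b) < deg(a)."""
    lam = np.asarray(ar_roots, dtype=float)
    t = np.asarray(t, dtype=float)
    g = np.zeros_like(t)
    for k, l in enumerate(lam):
        bl = sum(b * l**j for j, b in enumerate(b_coeffs))          # b(lam_k)
        ap = np.prod([l - m for i, m in enumerate(lam) if i != k])  # a'(lam_k)
        g = g + bl / ap * np.exp(l * t)
    return g

t = np.linspace(0.0, 10.0, 2001)
g = carma_kernel([-0.5, -1.5], [1.0, 1.0], t)   # CARMA(2,1) with b(z) = 1 + z
nonneg = g.min() >= -1e-12                       # numerical non-negativity check
```

For this parameter choice g(t) = 0.5(e^{-0.5t} + e^{-1.5t}), so the kernel is non-negative; the article's Laplace-transform condition characterizes exactly when this holds.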

2.
Continuous-time autoregressive moving average (CARMA) processes with a nonnegative kernel and driven by a nondecreasing Lévy process constitute a useful and very general class of stationary, nonnegative continuous-time processes that have been used, in particular, for the modeling of stochastic volatility. Brockwell, Davis, and Yang (2007) derived efficient estimates of the parameters of a nonnegative Lévy-driven CAR(1) process and showed how the realization of the underlying Lévy process can be estimated from closely spaced observations of the process itself. In this article we show how the ideas of that article can be generalized to higher order CARMA processes with nonnegative kernel, the key idea being the decomposition of the CARMA process into a sum of dependent Ornstein–Uhlenbeck processes.
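The decomposition can be mimicked on a grid: each Ornstein–Uhlenbeck component is driven by the same nondecreasing Lévy increments, so the sum stays nonnegative when the kernel weights are nonnegative. A hedged sketch with an illustrative Gamma subordinator and made-up roots and weights:

```python
import numpy as np

rng = np.random.default_rng(0)
h, n = 0.01, 20000
dL = rng.gamma(shape=2.0 * h, scale=1.0, size=n)  # increments of a Gamma subordinator

lam = np.array([-0.5, -1.5])    # distinct AR roots (illustrative)
alpha = np.array([0.5, 0.5])    # weights b(lam_k)/a'(lam_k) from the kernel expansion

Y = np.zeros((2, n + 1))
for i in range(n):
    # each OU component decays at its own rate but sees the same jumps
    Y[:, i + 1] = np.exp(lam * h) * Y[:, i] + alpha * dL[i]
X = Y.sum(axis=0)               # CARMA path as a sum of dependent OU processes
```

Because alpha and dL are nonnegative, every component, and hence X, remains nonnegative along the whole path.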

3.
Multivariate stochastic volatility models with skew distributions are proposed. Exploiting Cholesky stochastic volatility modeling, univariate stochastic volatility processes with leverage effects and generalized hyperbolic skew t-distributions are embedded in a multivariate analysis with time-varying correlations. Bayesian modeling allows this approach to provide a parsimonious skew structure and to scale easily to high-dimensional problems. Analyses of daily stock returns illustrate the approach. Empirical results show that the time-varying correlations and the sparse skew structure contribute to improved prediction performance and Value-at-Risk forecasts.

4.
We develop Markov chain Monte Carlo methodology for Bayesian inference in non-Gaussian Ornstein–Uhlenbeck stochastic volatility processes. The approach introduced involves expressing the unobserved stochastic volatility process in terms of a suitable marked Poisson process. We introduce two specific classes of Metropolis–Hastings algorithms which correspond to different ways of jointly parameterizing the marked point process and the model parameters. The performance of the methods is investigated for different types of simulated data. The approach is extended to the case where the volatility process is expressed as a superposition of Ornstein–Uhlenbeck processes. We apply our methodology to the US dollar–Deutschmark exchange rate.
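The marked-Poisson-process view can be illustrated for a Gamma-OU volatility process: the volatility at time t is an exponentially discounted sum of jump marks arriving at Poisson times. A rough sketch (all parameter values are invented for the illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
lam, a, b, T = 1.0, 2.0, 1.0, 10.0     # decay rate; jump intensity a*lam; Exp(b) marks

n_jumps = rng.poisson(a * lam * T)     # marked Poisson process on [0, T]
times = np.sort(rng.uniform(0.0, T, n_jumps))
marks = rng.exponential(1.0 / b, n_jumps)

def sigma2(t):
    """Volatility at time t: exponentially discounted sum of past jump marks."""
    past = times <= t
    return np.sum(marks[past] * np.exp(-lam * (t - times[past])))

path = np.array([sigma2(t) for t in np.linspace(0.0, T, 101)])
```

Conditioning on the marked point process (times, marks) is what makes the MCMC parameterizations described in the abstract tractable.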

5.
Linear vector autoregressive (VAR) models whose innovations may be unconditionally heteroscedastic are considered. The volatility structure is deterministic and quite general, including breaks or trending variances as special cases. In this framework we propose ordinary least squares (OLS), generalized least squares (GLS) and adaptive least squares (ALS) procedures. The GLS estimator requires knowledge of the time-varying variance structure, while in the ALS approach the unknown variance is estimated by kernel smoothing of the outer products of the OLS residual vectors. Different bandwidths for the different cells of the time-varying variance matrix are also allowed. We derive the asymptotic distribution of the proposed estimators for the VAR model coefficients and compare their properties. In particular we show that the ALS estimator is asymptotically equivalent to the infeasible GLS estimator. This asymptotic equivalence holds uniformly with respect to the bandwidth(s) in a given range and hence justifies data-driven bandwidth rules. Using these results we build Wald tests for linear Granger causality in mean which are adapted to VAR processes driven by errors with nonstationary volatility. It is also shown that the commonly used standard Wald test for linear Granger causality in mean is potentially unreliable in our framework (incorrect level and lower asymptotic power). Monte Carlo experiments illustrate the use of the different estimation approaches for the analysis of VAR models with innovations of time-varying variance.
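The ALS idea, replacing the unknown time-varying variance by a kernel smooth of OLS residual outer products, can be sketched as follows (a single Gaussian bandwidth for all cells; the article also allows cell-specific bandwidths, and the residuals here are synthetic):

```python
import numpy as np

def smoothed_variance(resid, bandwidth):
    """Nadaraya-Watson estimate of a time-varying innovation variance matrix
    from residuals (T x d), using a Gaussian kernel over rescaled time."""
    T = resid.shape[0]
    u = (np.arange(T) + 0.5) / T                        # time rescaled to (0, 1)
    outer = np.einsum('ti,tj->tij', resid, resid)       # residual outer products
    Sigma = np.empty_like(outer)
    for t in range(T):
        w = np.exp(-0.5 * ((u - u[t]) / bandwidth) ** 2)
        Sigma[t] = np.tensordot(w / w.sum(), outer, axes=1)
    return Sigma

rng = np.random.default_rng(2)
# innovations with a smoothly trending variance, as allowed by the framework
e = rng.standard_normal((200, 2)) * np.linspace(0.5, 2.0, 200)[:, None]
S = smoothed_variance(e, bandwidth=0.1)
```

Plugging S[t] into a weighted least-squares criterion gives the feasible ALS estimator that the article shows to be asymptotically equivalent to infeasible GLS.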

6.
Risks are usually represented and measured by volatility–covolatility matrices. Wishart processes are models for a dynamic analysis of multivariate risk and describe the evolution of stochastic volatility–covolatility matrices, constrained to be symmetric positive definite. The autoregressive Wishart process (WAR) is the multivariate extension of the Cox, Ingersoll, Ross (CIR) process introduced for scalar stochastic volatility. As a CIR process it allows for closed-form solutions for a number of financial problems, such as term structure of T-bonds and corporate bonds, derivative pricing in a multivariate stochastic volatility model, and the structural model for credit risk. Moreover, the Wishart dynamics are very flexible and are serious competitors for less structural multivariate ARCH models.

7.
In this article I present a new approach to model more realistically the variability of financial time series. I develop a Markov-ARCH model that incorporates the features of both Hamilton's switching-regime model and Engle's autoregressive conditional heteroscedasticity (ARCH) model to examine the issue of volatility persistence in the monthly excess returns of the three-month treasury bill. The issue can be resolved by taking into account occasional shifts in the asymptotic variance of the Markov-ARCH process that cause the spurious persistence of the volatility process. I identify two periods during which there is a regime shift, the 1974:2–1974:8 period associated with the oil shock and the 1979:9–1982:8 period associated with the Federal Reserve's policy change. The variance approached asymptotically in these two episodes is more than 10 times as high as the asymptotic variance for the remainder of the sample. I conclude that regime shifts have a greater impact on the properties of the data, and I cannot reject the null hypothesis of no ARCH effects within the regimes. As a consequence of the striking findings in this article, previous empirical results that adopt an ARCH approach in modeling monthly or lower frequency interest-rate dynamics are rendered questionable.

8.
Integro-difference equations (IDEs) provide a flexible framework for dynamic modeling of spatio-temporal data. The choice of kernel in an IDE model relates directly to the underlying physical process modeled, and it can affect model fit and predictive accuracy. We introduce Bayesian non-parametric methods to the IDE literature as a means to allow flexibility in modeling the kernel. We propose a mixture of normal distributions for the IDE kernel, built from a spatial Dirichlet process for the mixing distribution, which can model kernels with shapes that change with location. This allows the IDE model to capture non-stationarity with respect to location and to reflect a changing physical process across the domain. We address computational concerns for inference that leverage the use of Hermite polynomials as a basis for the representation of the process and the IDE kernel, and incorporate Hamiltonian Markov chain Monte Carlo steps in the posterior simulation method. An example with synthetic data demonstrates that the model can successfully capture location-dependent dynamics. Moreover, using a data set of ozone pressure, we show that the spatial Dirichlet process mixture model outperforms several alternative models for the IDE kernel, including the state of the art in the IDE literature, that is, a Gaussian kernel with location-dependent parameters.
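The basic IDE recursion, u_{t+1}(s) = ∫ k(s, r) u_t(r) dr, discretizes to a matrix-vector product. The kernel below is a stand-in Gaussian whose shift depends on location, mimicking the location-dependent dynamics the article targets, not its Dirichlet-process mixture:

```python
import numpy as np

def ide_step(u, grid, kernel, dx):
    """One integro-difference step on a regular grid, via a Riemann sum."""
    K = kernel(grid[:, None], grid[None, :])     # K[i, j] = k(s_i, r_j)
    return K @ u * dx

grid = np.linspace(-10.0, 10.0, 401)
dx = grid[1] - grid[0]
# illustrative Gaussian redistribution kernel with a location-dependent shift
kern = lambda s, r: np.exp(-0.5 * (s - r - 0.1 * np.tanh(r)) ** 2) / np.sqrt(2 * np.pi)

u = np.exp(-0.5 * grid ** 2)                     # initial field
for _ in range(5):
    u = ide_step(u, grid, kern, dx)
```

Since the kernel integrates to one in s for each r, total mass is (approximately) conserved on a sufficiently wide grid; the Bayesian non-parametric layer of the article places a prior over the kernel itself.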

9.
We consider stochastic volatility models that are defined by an Ornstein–Uhlenbeck (OU)-Gamma time change. These models are most suitable for modeling financial time series and follow the general framework of the popular non-Gaussian OU models of Barndorff-Nielsen and Shephard. One current problem of these otherwise attractive nontrivial models is, in general, the unavailability of a tractable likelihood-based statistical analysis for the returns of financial assets, which requires the ability to sample from a nontrivial joint distribution. We show that an OU process driven by an infinite activity Gamma process, which is an OU-Gamma process, exhibits unique features that allow one to explicitly describe and exactly sample from relevant joint distributions. This is a consequence of the OU structure and the calculus of Gamma and Dirichlet processes. We develop a particle marginal Metropolis–Hastings algorithm for this type of continuous-time stochastic volatility model and check its performance using simulated data. For illustration we finally fit the model to S&P500 index data.

10.
Most applications in spatial statistics involve modeling of complex spatial–temporal dependency structures, and many of the problems of space and time modeling can be overcome by using separable processes. This subclass of spatial–temporal processes has several advantages, including rapid fitting and simple extensions of many techniques developed and successfully used in time series and classical geostatistics. In particular, a major advantage of these processes is that the covariance matrix for a realization can be expressed as the Kronecker product of two smaller matrices that arise separately from the temporal and purely spatial processes, and hence its determinant and inverse are easily determinable. However, these separable models are not always realistic, and there are no formal tests for separability of general spatial–temporal processes. We present here a formal method to test for separability. Our approach can also be used to test for lack of stationarity of the process. The beauty of our approach is that by using spectral methods the mechanics of the test can be reduced to a simple two-factor analysis of variance (ANOVA) procedure. The approach we propose is based on only one realization of the spatial–temporal process. We apply the statistical methods proposed here to test for separability and stationarity of spatial–temporal ozone fields using data provided by the US Environmental Protection Agency (EPA).
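The computational advantage of separability can be checked directly: for Σ = Σ_T ⊗ Σ_S, the determinant and inverse factor through the two small matrices. A sketch with illustrative AR(1)-type correlation matrices (sizes and parameters invented):

```python
import numpy as np

def exp_cov(n, rho):
    """AR(1)-type correlation matrix: rho^|i-j| (illustrative)."""
    idx = np.arange(n)
    return rho ** np.abs(idx[:, None] - idx[None, :])

St, Ss = exp_cov(4, 0.6), exp_cov(3, 0.3)   # 4 time points, 3 spatial sites
Sigma = np.kron(St, Ss)                     # 12 x 12 separable covariance

# det(A (x) B) = det(A)^m * det(B)^n for A n x n, B m x m
lhs = np.linalg.det(Sigma)
rhs = np.linalg.det(St) ** 3 * np.linalg.det(Ss) ** 4
```

The same factorization gives the inverse as the Kronecker product of the two small inverses, which is what makes likelihood evaluation for separable models so fast.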

11.
The class of affine LIBOR models is appealing since it satisfies three central requirements of interest rate modeling: it is arbitrage-free, interest rates are nonnegative, and caplet and swaption prices can be calculated analytically. In order to guarantee nonnegative interest rates, affine LIBOR models are driven by nonnegative affine processes, a restriction that makes it hard to produce volatility smiles. We modify the affine LIBOR models in such a way that real-valued affine processes can be used without destroying the nonnegativity of interest rates. Numerical examples show that in this class of models pronounced volatility smiles are possible.

12.
We discuss a general approach to dynamic sparsity modeling in multivariate time series analysis. Time-varying parameters are linked to latent processes that are thresholded to induce zero values adaptively, providing natural mechanisms for dynamic variable inclusion/selection. We discuss Bayesian model specification, analysis and prediction in dynamic regressions, time-varying vector autoregressions, and multivariate volatility models using latent thresholding. Application to a topical macroeconomic time series problem illustrates some of the benefits of the approach in terms of statistical and economic interpretations as well as improved predictions. Supplementary materials for this article are available online.
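The latent-thresholding mechanism can be sketched in a few lines: a latent autoregressive state is mapped to an effective coefficient that is zeroed whenever the state falls inside a threshold band, yielding dynamic inclusion/exclusion (all values below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
T, d = 300, 0.4                       # series length, threshold (illustrative)

# latent AR(1) state for a single time-varying coefficient
b = np.empty(T)
b[0] = 0.0
for t in range(1, T):
    b[t] = 0.98 * b[t - 1] + 0.1 * rng.standard_normal()

beta = b * (np.abs(b) > d)            # effective coefficient: zeroed inside the band
sparsity = np.mean(beta == 0.0)       # fraction of time the variable is excluded
```

In the article's Bayesian treatment both the latent states and the thresholds get priors, so the inclusion pattern is inferred jointly with the rest of the model.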

13.
The main goal of this work is to generalize the autoregressive conditional duration (ACD) model, applied to times between trades, to the case of time-varying parameters. The use of wavelets allows the parameters to vary through time and makes it possible to model non-stationary processes without preliminary data transformations. The time-varying ACD model was estimated by maximum likelihood with standard exponentially distributed errors. The properties of the estimators were assessed via bootstrap. We present a simulation exercise for a non-stationary process and an empirical application to a real series, namely the TELEMAR stock. Diagnostics and goodness-of-fit analysis suggest that the time-varying ACD model simultaneously captures the dependence between durations, intra-day seasonality, and volatility.
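The constant-parameter exponential ACD(1,1) core that the article generalizes can be sketched as follows; the wavelet-based time variation of (ω, α, β) is not reproduced here, and all parameter values are invented:

```python
import numpy as np

def acd_loglik(params, x):
    """Exponential-ACD(1,1) log-likelihood: x_i = psi_i * eps_i with eps ~ Exp(1),
    psi_i = omega + alpha * x_{i-1} + beta * psi_{i-1}."""
    omega, alpha, beta = params
    psi = np.empty_like(x)
    psi[0] = x.mean()                       # common initialization choice
    for i in range(1, len(x)):
        psi[i] = omega + alpha * x[i - 1] + beta * psi[i - 1]
    return -np.sum(np.log(psi) + x / psi)

rng = np.random.default_rng(4)
omega, alpha, beta = 0.1, 0.2, 0.7          # alpha + beta < 1: stationary durations
n, psi_i = 1000, 1.0
x = np.empty(n)
for i in range(n):                          # simulate durations from the recursion
    x[i] = psi_i * rng.exponential()
    psi_i = omega + alpha * x[i] + beta * psi_i

ll = acd_loglik((omega, alpha, beta), x)
```

Maximizing this log-likelihood over (ω, α, β) gives the standard estimator; the article instead lets these parameters evolve through a wavelet expansion.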

15.
We discuss the development of dynamic factor models for multivariate financial time series, and the incorporation of stochastic volatility components for latent factor processes. Bayesian inference and computation are developed and explored in a study of the dynamic factor structure of daily spot exchange rates for a selection of international currencies. The models are direct generalizations of univariate stochastic volatility models and represent specific varieties of models recently discussed in the growing multivariate stochastic volatility literature. We discuss model fitting based on retrospective data and sequential analysis for forward filtering and short-term forecasting. Analyses are compared with results from the much simpler method of dynamic variance-matrix discounting that, for over a decade, has been a standard approach in applied financial econometrics. We study these models in analysis, forecasting, and sequential portfolio allocation for a selected set of international exchange-rate-return time series. Our goals are to understand a range of modeling questions arising in using these factor models and to explore empirical performance in portfolio construction relative to discount approaches. We report on our experiences and conclude with comments about the practical utility of structured factor models and on future potential model extensions.

16.
A measure is the formal representation of the non-negative additive functions that abound in science. We review and develop the art of assigning Bayesian priors to measures. Where necessary, spatial correlation is delegated to correlating kernels imposed on otherwise uncorrelated priors. The latter must be infinitely divisible (ID) and hence described by the Lévy–Khinchin representation. Thus the fundamental object is the Lévy measure, the choice of which corresponds to different ID process priors. The general case of a Lévy measure comprising a mixture of assigned base measures leads to a prior process comprising a convolution of corresponding processes. Examples involving a single base measure are the gamma process, the Dirichlet process (for the normalized case) and the Poisson process. We also discuss processes that we call the supergamma and super-Dirichlet processes, which are double base measure generalizations of the gamma and Dirichlet processes. Examples of multiple and continuum base measures are also discussed. We conclude with numerical examples of density estimation.
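The normalization step linking the gamma process to the Dirichlet process can be sketched on a finite partition: independent gamma increments with shapes given by the base-measure mass of each cell normalize to a Dirichlet draw (the base measure below is illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)

# gamma-process prior on a 4-cell partition: independent gamma increments
# with shapes equal to the base-measure mass of each cell
base = np.array([0.5, 1.5, 2.0, 1.0])            # illustrative base measure
G = rng.gamma(shape=base, scale=1.0)             # gamma-process increments
D = G / G.sum()                                  # Dirichlet(base) by normalization
```

The same construction underlies the "normalized case" mentioned in the abstract: normalizing a gamma process with base measure α yields a Dirichlet process with parameter α.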

17.
In recent years, with the availability of high-frequency financial market data, modeling realized volatility has become a new and innovative research direction. The construction of "observable" or realized volatility series from intra-day transaction data and the use of standard time-series techniques has led to promising strategies for modeling and predicting (daily) volatility. In this article, we show that the residuals of commonly used time-series models for realized volatility and logarithmic realized variance exhibit non-Gaussianity and volatility clustering. We propose extensions to explicitly account for these properties and assess their relevance for modeling and forecasting realized volatility. In an empirical application for S&P 500 index futures we show that allowing for time-varying volatility of realized volatility and logarithmic realized variance substantially improves the fit as well as predictive performance. Furthermore, the distributional assumption for residuals plays a crucial role in density forecasting.
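Constructing an "observable" realized-volatility series is straightforward: daily realized variance is the sum of squared intra-day log returns. A minimal sketch on one synthetic trading day (the price path below is simulated, not S&P 500 futures data):

```python
import numpy as np

def realized_variance(intraday_prices):
    """Daily realized variance: sum of squared intra-day log returns."""
    r = np.diff(np.log(intraday_prices))
    return np.sum(r ** 2)

rng = np.random.default_rng(6)
# one synthetic trading day: 390 one-minute log-price increments
p = 100.0 * np.exp(np.cumsum(0.0005 * rng.standard_normal(391)))

rv = realized_variance(p)
log_rv = np.log(rv)     # logarithmic realized variance, as modeled in the article
```

Collecting rv (or log_rv) across days yields the series whose residual non-Gaussianity and volatility clustering the article documents and models.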

18.
Ordinary, modified, and equilibrium alternating renewal processes are defined in such a way that the distribution of the sojourn time in one state of a two-state system may be considered conditional on the starting state. The double Laplace transform of a non-negative stochastic process is also defined, and used to express a result of Takacs (1957) on sojourn time distributions.

19.
This paper provides a semiparametric framework for modeling multivariate conditional heteroskedasticity. We put forward latent stochastic volatility (SV) factors as capturing the commonality in the joint conditional variance matrix of asset returns. This approach is in line with common features as studied by Engle and Kozicki (1993), and it allows us to focus on identification of factors and factor loadings through first- and second-order conditional moments only. We assume that the time-varying part of risk premiums is based on constant prices of factor risks, and we consider a factor SV in mean model. Additional specification of both expectations and volatility of future volatility of factors provides conditional moment restrictions, through which the parameters of the model are all identified. These conditional moment restrictions pave the way for instrumental variables estimation and GMM inference.

20.
The Volatility of Realized Volatility (cited 4 times: 1 self-citation, 3 by others)


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号