Similar Articles
20 similar articles found (search time: 15 ms)
1.
We carry out finite sample size parameter estimation for the long-memory parameters of the class of seasonal fractional ARIMA models with stable innovations. In particular, we consider the semiparametric method studied in Reisen et al. (2006) [27] and two Whittle approaches: the classical Whittle method and a method based on a Markov chain Monte Carlo (MCMC) procedure. The performance of the methods is discussed using a Monte Carlo simulation.

2.
Editorial     

3.
This paper uses the empirical characteristic function (ECF) procedure to estimate the parameters of mixtures of normal distributions. Since the characteristic function is uniformly bounded, the procedure gives estimates that are numerically stable. It is shown, using Monte Carlo simulation, that the finite sample properties of the ECF estimator are very good, even in cases where the popular maximum likelihood estimator fails to exist. An empirical application is illustrated using the monthly excess return of the NYSE value-weighted index.
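The core of the ECF procedure in this entry can be sketched in a few lines: match the empirical characteristic function to the mixture's closed-form characteristic function over a grid of argument values, and minimize the squared distance. This is only an illustrative sketch, not the paper's implementation; the grid, starting values, and the logit/log reparameterisations are our own assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def ecf(t, x):
    # empirical characteristic function of sample x evaluated at points t
    return np.exp(1j * np.outer(t, x)).mean(axis=1)

def mix_cf(t, w, m1, s1, m2, s2):
    # characteristic function of a two-component normal mixture
    return (w * np.exp(1j * t * m1 - 0.5 * (s1 * t) ** 2)
            + (1 - w) * np.exp(1j * t * m2 - 0.5 * (s2 * t) ** 2))

def fit_ecf(x, t=None):
    # minimise the squared distance between the empirical and model
    # characteristic functions over a fixed grid of t values (assumed here)
    if t is None:
        t = np.linspace(0.1, 2.0, 20)
    emp = ecf(t, x)

    def loss(p):
        w = 1.0 / (1.0 + np.exp(-p[0]))      # logit keeps the weight in (0, 1)
        model = mix_cf(t, w, p[1], np.exp(p[2]), p[3], np.exp(p[4]))
        return np.sum(np.abs(emp - model) ** 2)

    start = [0.0, x.mean() - x.std(), 0.0, x.mean() + x.std(), 0.0]
    p = minimize(loss, start, method="Nelder-Mead",
                 options={"maxiter": 10000, "xatol": 1e-8, "fatol": 1e-12}).x
    return 1.0 / (1.0 + np.exp(-p[0])), p[1], np.exp(p[2]), p[3], np.exp(p[4])
```

Because every term in the loss is bounded (the characteristic function has modulus at most one), the objective stays finite even for heavy-tailed data, which is the numerical-stability point the abstract makes.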

4.
We consider a generalized exponential (GEXP) model in the frequency domain for modeling seasonal long-memory time series. This model generalizes the fractional exponential (FEXP) model [Beran, J., 1993. Fitting long-memory models by generalized linear regression. Biometrika 80, 817–822] to allow the singularity in the spectral density occurring at an arbitrary frequency for modeling persistent seasonality and business cycles. Moreover, the short-memory structure of this model is characterized by the Bloomfield [1973. An exponential model for the spectrum of a scalar time series. Biometrika 60, 217–226] model, which has a fairly flexible semiparametric form. The proposed model includes fractionally integrated processes, Bloomfield models, FEXP models as well as GARMA models [Gray, H.L., Zhang, N.-F., Woodward, W.A., 1989. On generalized fractional processes. J. Time Ser. Anal. 10, 233–257] as special cases. We develop a simple regression method for estimating the seasonal long-memory parameter. The asymptotic bias and variance of the corresponding long-memory estimator are derived. Our methodology is applied to a sunspot data set and an Internet traffic data set for illustration.

5.
For estimation of time-varying coefficient longitudinal models, the widely used local least-squares (LS) or covariance-weighted local LS smoothing uses information from the local sample average. Motivated by the fact that a combination of multiple quantiles provides a more complete picture of the distribution, we investigate quantile regression-based methods that improve efficiency by optimally combining information across quantiles. Under the working independence scenario, the asymptotic variance of the proposed estimator approaches the Cramér–Rao lower bound. In the presence of dependence among within-subject measurements, we adopt a prewhitening technique to transform the regression errors into independent innovations and show that the prewhitened optimally weighted quantile average estimator asymptotically achieves the Cramér–Rao bound for the independent innovations. Fully data-driven bandwidth selection and optimal weight estimation are implemented through a two-step procedure. Monte Carlo studies show that the proposed method delivers more robust and overall superior performance compared with existing methods.

6.
We consider the estimation of a change point or discontinuity in a regression function under a random-design model with long-memory errors. We provide several change-point estimators and investigate their consistency. Using the fractional ARIMA process as an example of a long-memory process, we report a small Monte Carlo experiment comparing the performance of the estimators in finite samples. We finish by applying the method to a climatological data example.

7.
This paper considers semiparametric estimation of the memory parameter in a cyclical long-memory time series, which exhibits strong dependence in its cyclical behaviour, using the Whittle likelihood based on generalised exponential (GEXP) models. The proposed estimator belongs to the so-called broadband or global class and uses information from the spectral density at all frequencies. We establish the consistency and asymptotic normality of the estimated memory parameter for a linear process, and thus do not require Gaussianity. A simulation study based on Monte Carlo experiments shows that the proposed estimator works well compared to other existing semiparametric estimators. Moreover, we provide an empirical application, applying the estimator to the growth rate of Japan's industrial production index and detecting its cyclical persistence.
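The entry's Whittle likelihood is built on the GEXP spectral density; as a hedged illustration of the general Whittle idea it rests on, here is the standard profile Whittle estimator for the simpler ARFIMA(0, d, 0) spectral shape (not the GEXP model itself):

```python
import numpy as np
from scipy.optimize import minimize_scalar

def whittle_d(x):
    # profile Whittle estimate of the memory parameter d of an
    # ARFIMA(0, d, 0), using the periodogram at the Fourier frequencies
    n = len(x)
    m = (n - 1) // 2
    lam = 2.0 * np.pi * np.arange(1, m + 1) / n
    I = np.abs(np.fft.fft(x - x.mean())[1:m + 1]) ** 2 / (2.0 * np.pi * n)

    def profile(d):
        # spectral shape |2 sin(lam/2)|^{-2d}; the scale is profiled out
        g = (2.0 * np.sin(lam / 2.0)) ** (-2.0 * d)
        return np.log(np.mean(I / g)) + np.mean(np.log(g))

    return minimize_scalar(profile, bounds=(-0.49, 0.49), method="bounded").x
```

A "broadband" method in the abstract's sense is one like this that uses the periodogram at all Fourier frequencies, in contrast to local (narrowband) estimators that keep only frequencies near the singularity.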

8.
We consider the problem of modelling a long-memory time series using piecewise fractional autoregressive integrated moving average processes. The number as well as the locations of structural break points (BPs), and the parameters of each regime, are assumed to be unknown. A four-step procedure is proposed to locate the BPs and to estimate the parameters of each regime. Its effectiveness is shown by Monte Carlo simulations, and an application to real traffic data modelling is considered.

9.
In this article, we discuss the parameter estimation for a k-factor generalized long-memory process with conditionally heteroskedastic noise. Two estimation methods are proposed. The first method is based on the conditional distribution of the process and the second is obtained as an extension of Whittle's estimation approach. For comparison purposes, Monte Carlo simulations are used to evaluate the finite sample performance of these estimation techniques, using four different conditional distribution functions.

10.
We consider a k-GARMA generalization of the long-memory stochastic volatility model, discuss the properties of the model, and propose a wavelet-based Whittle estimator for its parameters, whose consistency is shown. Monte Carlo experiments show that its small sample properties are essentially indistinguishable from those of the Whittle estimator, and compare favourably with those of a wavelet-based approximate maximum likelihood estimator. An application is given for Microsoft Corporation stock, modeling the intraday seasonal patterns of its realized volatility.

11.
This paper develops Bayesian estimation for the Birnbaum–Saunders distribution based on Type-II censoring in the simple step-stress accelerated life test with a power-law acceleration form. Maximum likelihood estimates are obtained, and a Gibbs sampling procedure is used to obtain the Bayesian estimates for the shape parameter of the Birnbaum–Saunders distribution and the parameters of the power-law accelerated model. The asymptotic normality method and the Markov chain Monte Carlo method are employed to construct the corresponding confidence intervals and highest posterior density intervals at different confidence levels, respectively. Finally, the results are compared using Monte Carlo simulations, and a numerical example is analyzed for illustration.

12.
Heston's model and Bates' model are very important in option pricing. As mentioned in Mendoza's paper [Bayesian estimation and option mispricing (job market paper). Cambridge, MA: Massachusetts Institute of Technology; 2011], the Mexican Stock Exchange introduced options on its main index (the Índice de Precios y Cotizaciones) in 2004 and used Heston's model to price options on days when there was no trading. The estimation of the parameters in both models is not easy; one approach is the Markov chain Monte Carlo (MCMC) algorithm. In this paper, we adopt the MCMC algorithm of Li, Wells and Yu [A Bayesian analysis of return dynamics with Lévy jumps. Rev Financ Stud. 2008;21(5):2345–2377]. We provide the necessary derivations involving the prior distributions, since these are otherwise unavailable in the literature. As Li et al. used their model to analyse S&P 500 data from 2 January 1980 to 29 December 2000, we recreate their analysis using data from 1987 to 2012, so as to include the financial crisis and to analyse how stable the method is when applied over that period. Unlike Li et al., we find that the estimation is very sensitive to the prior distribution assumption. In addition, R code is available on request. We hope to offer tools for people doing empirical research in financial mathematics or quantitative finance.

13.
14.
In this article, we develop the theory of the k-factor Gegenbauer autoregressive moving average (GARMA) process with infinite variance innovations, which is a generalization of the stable seasonal fractional autoregressive integrated moving average (ARIMA) model introduced by Diongue et al. (2008) [Diongue, A.K., Guégan, D. (2008). Estimation of k-factor GIGARCH process: a Monte Carlo study. Communications in Statistics - Simulation and Computation 37:2037–2049]. Stationarity and invertibility conditions of this new model are derived. Conditional sum of squares (CSS) and Markov chain Monte Carlo (MCMC) Whittle methods are investigated for parameter estimation. Monte Carlo simulations are also used to evaluate the finite sample performance of these estimation techniques. Finally, the usefulness of the model is corroborated with an application to streamflow data for the Senegal River at Bakel.

15.
In this paper, we introduce the class of beta seasonal autoregressive moving average (βSARMA) models for modelling and forecasting time series data that assume values in the standard unit interval. It generalizes the class of beta autoregressive moving average models [Rocha AV and Cribari-Neto F. Beta autoregressive moving average models. Test. 2009;18(3):529–545] by incorporating seasonal dynamics into the model's dynamic structure. Besides introducing the new class of models, we develop parameter estimation, hypothesis testing inference, and diagnostic analysis tools. We also discuss out-of-sample forecasting. In particular, we provide closed-form expressions for the conditional score vector and for the conditional Fisher information matrix. We also evaluate the finite sample performance of the conditional maximum likelihood estimators and white noise tests using Monte Carlo simulations. An empirical application is presented and discussed.

16.
This paper introduces a multivariate long-memory model with structural breaks. In the proposed framework, the time series exhibit possibly fractional orders of integration, which are allowed to differ in each subsample. The break date is endogenously determined using a procedure that minimizes the residual sum of squares (RSS). Monte Carlo experiments show that this method for detecting breaks performs well in large samples. As an illustration, we estimate a trivariate VAR including prices, employment and GDP in both the US and Mexico. For the subsample preceding the break, our findings are similar to those of earlier studies based on a standard VAR approach, with the variables exhibiting integer degrees of integration in both countries. By contrast, the series are found to be fractionally integrated after the break, with the fractional differencing parameters being higher than one in the case of Mexico.
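The RSS-minimising break-date search described in this entry can be sketched for a single equation. The paper works with a trivariate VAR; this univariate mean-shift version, with an assumed 15% trimming fraction, is only an illustration of the search itself:

```python
import numpy as np

def find_break(y, X, trim=0.15):
    # grid search over candidate break dates; pick the date minimising
    # the combined residual sum of squares (RSS) of the two subsample fits
    n = len(y)

    def rss(yy, XX):
        beta, *_ = np.linalg.lstsq(XX, yy, rcond=None)
        r = yy - XX @ beta
        return float(r @ r)

    # trim the edges so both subsamples keep enough observations
    lo, hi = int(n * trim), int(n * (1.0 - trim))
    total = [rss(y[:k], X[:k]) + rss(y[k:], X[k:]) for k in range(lo, hi)]
    return lo + int(np.argmin(total))
```

Because the break date enters the model discretely, a grid search over admissible dates is the standard way to make the date "endogenously determined" by the data.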

17.
This paper demonstrates that cross-validation (CV) and Bayesian adaptive bandwidth selection can be applied to the estimation of associated kernel discrete functions. This idea was originally proposed by Brewer [A Bayesian model for local smoothing in kernel density estimation, Stat. Comput. 10 (2000), pp. 299–309] to derive variable bandwidths in adaptive kernel density estimation. Our approach considers the adaptive binomial kernel estimator and treats the variable bandwidths as parameters with a beta prior distribution. The best variable bandwidth selector is estimated by the posterior mean in the Bayesian sense under squared error loss. Monte Carlo simulations are conducted to examine the performance of the proposed Bayesian adaptive approach in comparison with the asymptotic mean integrated squared error estimator and the CV technique for selecting a global (fixed) bandwidth proposed in Kokonendji and Senga Kiessé [Discrete associated kernels method and extensions, Stat. Methodol. 8 (2011), pp. 497–516]. The Bayesian adaptive bandwidth estimator performs better than the global bandwidth, in particular for small and moderate sample sizes.

18.
There has recently been growing interest in modeling and estimating alternative continuous time multivariate stochastic volatility models. We propose a continuous time fractionally integrated Wishart stochastic volatility (FIWSV) process, and derive the conditional Laplace transform of the FIWSV model in order to obtain a closed form expression of moments. A two-step procedure is used, namely estimating the parameter of fractional integration via the local Whittle estimator in the first step, and estimating the remaining parameters via the generalized method of moments in the second step. Monte Carlo results for the procedure show a reasonable performance in finite samples. The empirical results for the S&P 500 and FTSE 100 indexes show that the data favor the new FIWSV process rather than the one-factor and two-factor models of the Wishart autoregressive process for the covariance structure.

19.
Approximate normality and unbiasedness of the maximum likelihood estimate (MLE) of the long-memory parameter H of a fractional Brownian motion hold reasonably well for sample sizes as small as 20 if the mean and scale parameter are known. We show in a Monte Carlo study that if the latter two parameters are unknown the bias and variance of the MLE of H both increase substantially. We also show that the bias can be reduced by using a parametric bootstrap procedure. In very large samples, maximum likelihood estimation becomes problematic because of the large dimension of the covariance matrix that must be inverted. To overcome this difficulty, we propose a maximum likelihood method based upon first differences of the data. These first differences form a short-memory process. We split the data into a number of contiguous blocks consisting of a relatively small number of observations. Computation of the likelihood function in a block then presents no computational problem. We form a pseudo-likelihood function consisting of the product of the likelihood functions in each of the blocks and provide a formula for the standard error of the resulting estimator of H. This formula is shown in a Monte Carlo study to provide a good approximation to the true standard error. The computation time required to obtain the estimate and its standard error from large data sets is an order of magnitude less than that required to obtain the widely used Whittle estimator. Application of the methodology is illustrated on two data sets.

20.
This paper introduces a new shrinkage estimator for the negative binomial regression model that generalizes the estimator proposed for the linear regression model by Liu [A new class of biased estimate in linear regression, Comm. Stat. Theor. Meth. 22 (1993), pp. 393–402]. This shrinkage estimator is proposed in order to solve the problem of an inflated mean squared error of the classical maximum likelihood (ML) method in the presence of multicollinearity. Furthermore, the paper presents some methods of estimating the shrinkage parameter. By means of Monte Carlo simulations, it is shown that the Liu estimator applied with these shrinkage parameters always outperforms ML. The benefit of the new estimation method is also illustrated in an empirical application. Finally, based on the results from the simulation study and the empirical application, a recommendation is given regarding which estimator of the shrinkage parameter should be used.
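For reference, the original Liu estimator for the linear model has a simple closed form; the negative binomial version discussed in this entry replaces X'X with the weighted information matrix from the ML fit, so this linear sketch only shows the base idea:

```python
import numpy as np

def liu_estimator(X, y, d):
    # Liu (1993) estimator for the linear model:
    #   beta_d = (X'X + I)^{-1} (X'y + d * beta_OLS)
    # d = 1 recovers OLS; smaller d shrinks towards a ridge-type estimate
    XtX = X.T @ X
    beta_ols = np.linalg.solve(XtX, X.T @ y)
    I = np.eye(X.shape[1])
    return np.linalg.solve(XtX + I, X.T @ y + d * beta_ols)
```

The shrinkage trades a small bias for a large variance reduction when X'X is near-singular, which is how the inflated mean squared error under multicollinearity is addressed.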


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号