Similar Documents
20 similar documents found.
1.
This paper deals with hypothesis testing for independent time series of unequal lengths. It proposes a spectral test based on the distance between the periodogram ordinates and a parametric test based on the distance between the parameter estimates of fitted autoregressive moving average models. Both tests are compared with a likelihood ratio test based on the pooled spectra. In all cases, the null hypothesis is that the two series under consideration are generated by the same stochastic process. The performance of the three tests is investigated by a Monte Carlo simulation study.
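As a rough illustration of the spectral test's main ingredient, here is a minimal numpy sketch that evaluates two periodograms on a common frequency grid and computes a mean squared distance between their log ordinates. The grid choice, the log transform, and the Monte Carlo calibration of the null distribution are illustrative assumptions, not the paper's exact statistic.

```python
import numpy as np

def periodogram_at(x, freqs):
    """Periodogram of a mean-corrected series evaluated at arbitrary frequencies."""
    n, t = len(x), np.arange(len(x))
    x = x - x.mean()
    return np.array([np.abs(np.sum(x * np.exp(-1j * w * t))) ** 2 for w in freqs]) / (2 * np.pi * n)

def log_periodogram_distance(x, y):
    """Mean squared distance between log-periodogram ordinates of two series of
    unequal lengths, evaluated on the Fourier grid of the shorter series."""
    n_min = min(len(x), len(y))
    freqs = 2 * np.pi * np.arange(1, n_min // 2 + 1) / n_min
    return np.mean((np.log(periodogram_at(x, freqs)) - np.log(periodogram_at(y, freqs))) ** 2)

# toy example: two independent white-noise series of unequal lengths
rng = np.random.default_rng(1)
print(log_periodogram_distance(rng.standard_normal(300), rng.standard_normal(200)))
```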

2.
A frequency domain bootstrap (FDB) is a common technique for applying Efron's independent and identically distributed resampling technique (Efron, 1979) to periodogram ordinates – especially normalized periodogram ordinates – by using spectral density estimates. The FDB method is applicable to several classes of statistics, such as estimators of the normalized spectral mean, the autocorrelation (but not the autocovariance), the normalized spectral density function, and Whittle parameters. While the FDB method has been extensively studied for short-range dependent processes, there is a dearth of research on its use with long-range dependent processes. Therefore, we propose an FDB methodology for ratio statistics under long-range dependence, using semi- and nonparametric spectral density estimates as a normalizing factor. It is shown that the FDB approximation allows for valid distribution estimation for a broad class of stationary, long-range (or short-range) dependent linear processes, without any stringent assumptions on the distribution of the underlying process. The results of a large simulation study show that the FDB approximation using a semi- or nonparametric spectral density estimator is often robust for various values of a long-memory parameter reflecting the magnitude of dependence. We apply the proposed procedure to two data examples.
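The generic FDB recipe behind this is: estimate the spectrum, divide the periodogram by the estimate to obtain approximately i.i.d. residual ordinates, resample them, and rebuild bootstrap periodograms. A minimal numpy sketch, assuming a simple moving-average periodogram smoother and using the lag-one autocorrelation, written as a ratio of spectral means, as the illustrative ratio statistic (neither choice is the semiparametric estimator studied in the paper):

```python
import numpy as np

def periodogram(x):
    """Periodogram at the Fourier frequencies in (0, pi)."""
    n = len(x)
    x = x - x.mean()
    d = np.fft.rfft(x)[1:(n - 1) // 2 + 1]
    return np.abs(d) ** 2 / (2 * np.pi * n)

def smoothed_spectrum(I, m=5):
    """Nonparametric spectral estimate: simple moving average of the periodogram."""
    kernel = np.ones(2 * m + 1) / (2 * m + 1)
    padded = np.r_[I[m:0:-1], I, I[-2:-m - 2:-1]]      # reflect the ends
    return np.convolve(padded, kernel, mode="valid")

def fdb(x, stat, B=500, m=5, seed=0):
    """Frequency domain bootstrap for a ratio statistic stat(I, freqs)."""
    rng = np.random.default_rng(seed)
    n = len(x)
    freqs = 2 * np.pi * np.arange(1, (n - 1) // 2 + 1) / n
    I = periodogram(x)
    f_hat = smoothed_spectrum(I, m)
    eps = I / f_hat                                    # approximately i.i.d. ordinates
    boot = np.array([stat(f_hat * rng.choice(eps, size=len(eps)), freqs) for _ in range(B)])
    return stat(I, freqs), boot                        # point estimate and bootstrap draws

# illustrative ratio statistic: lag-one autocorrelation as a ratio of spectral means
rho1 = lambda I, w: np.sum(I * np.cos(w)) / np.sum(I)
```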

3.
Growth hormone plasma concentrations vary rhythmically between high and low values. Radioimmunoassay measurements of low values are often indistinguishable from low controls, and are reported as a censored value, the 'minimum detectable dose'. This paper reports such a dataset from a designed experiment with about 60% of the values censored but large distinct signals for the remainder of the data. The ordinates of the average periodogram for each treatment group are independently gamma distributed, with distribution depending on the underlying spectrum and the replication for that group. This situation can lead to an analysis for common spectral shape using a gamma generalized linear model with log link, and the hypothesis of common spectral shape is rejected here. Since such a level of censoring reduces the variance of each profile, the periodogram, which is a partition of the variance, is also reduced in overall magnitude. A simulation study shows that this reduction is not necessarily uniform over the frequency domain, but may be more pronounced at lower or higher ordinates depending on the underlying model. Therefore it is possible that the rejection of common spectral shape is an artefact of the censoring.
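The modelling step described here, gamma-distributed average periodogram ordinates analysed with a log-link GLM, can be sketched as below. This is a hypothetical layout assuming each group's average periodogram is available on a common Fourier grid; comparing this common-shape (main-effects) fit with one that adds frequency-by-group interaction columns gives the kind of test of common spectral shape the abstract refers to.

```python
import numpy as np
import statsmodels.api as sm

def common_shape_fit(avg_pgrams):
    """Gamma GLM with log link for average periodogram ordinates.

    avg_pgrams: list of 1-D arrays, one per treatment group, each holding the
    group-average periodogram ordinates on the same Fourier grid (all positive).
    The main-effects model encodes a common spectral shape plus a group level
    shift; adding frequency-by-group interactions and comparing deviances
    yields a test of common spectral shape."""
    groups, n_freq = len(avg_pgrams), len(avg_pgrams[0])
    y = np.concatenate(avg_pgrams)
    freq_id = np.tile(np.arange(n_freq), groups)
    group_id = np.repeat(np.arange(groups), n_freq)
    X = np.column_stack([
        np.eye(n_freq)[freq_id],                 # one column per frequency (the shape)
        np.eye(groups)[group_id][:, 1:],         # group effects, first group as baseline
    ])
    family = sm.families.Gamma(link=sm.families.links.Log())
    return sm.GLM(y, X, family=family).fit()
```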

4.
The usual covariance estimates for data X_0, …, X_{n-1} from a stationary zero-mean stochastic process {X_t} are the sample covariances c_k = n^{-1} Σ_{t=0}^{n-1-k} X_t X_{t+k}, k = 0, 1, …, n-1. Both direct and resampling approaches are used to estimate the variance of the sample covariances. This paper compares the performance of these variance estimates. Using a direct approach, we show that a consistent windowed periodogram estimate for the spectrum is more effective than using the periodogram itself. A frequency domain bootstrap for time series is proposed and analyzed, and we introduce a frequency domain version of the jackknife that is shown to be asymptotically unbiased and consistent for Gaussian processes. Monte Carlo techniques show that the time domain jackknife and subseries method cannot be recommended. For a Gaussian underlying series a direct approach using a smoothed periodogram is best; for a non-Gaussian series the frequency domain bootstrap appears preferable. For small samples, the bootstraps are dangerous: both the direct approach and the frequency domain jackknife are better.
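For concreteness, here is a small numpy sketch of the two ingredients the abstract contrasts: the sample covariances and a consistent windowed-periodogram (lag-window) spectral estimate, together with a direct plug-in variance of c_k valid for Gaussian series. The Bartlett window, the truncation point, and the Riemann-sum integration are illustrative choices, not the paper's recommendations.

```python
import numpy as np

def sample_autocov(x, max_lag):
    """Sample covariances c_k = n^{-1} sum_t x_t x_{t+k} for a zero-mean series."""
    n = len(x)
    return np.array([np.sum(x[:n - k] * x[k:]) / n for k in range(max_lag + 1)])

def lag_window_spectrum(x, freqs, trunc):
    """Windowed periodogram (Bartlett lag-window) estimate of the spectral density."""
    c = sample_autocov(x, trunc)
    lags = np.arange(1, trunc + 1)
    weights = 1.0 - lags / (trunc + 1)                          # Bartlett window
    return np.array([(c[0] + 2.0 * np.sum(weights * c[1:] * np.cos(lags * w)))
                     / (2 * np.pi) for w in freqs])

def direct_var_autocov(x, k, trunc=None, n_grid=512):
    """Direct plug-in variance of c_k for a Gaussian series, using
    n * Var(c_k) ~ 2*pi * integral over (-pi, pi] of f(w)^2 * (1 + cos(2*k*w))."""
    n = len(x)
    trunc = trunc or int(n ** (1 / 3)) + 1
    w = np.linspace(0.0, np.pi, n_grid)
    f_hat = lag_window_spectrum(x, w, trunc)
    integrand = f_hat ** 2 * (1.0 + np.cos(2 * k * w))
    integral = 2.0 * np.sum(integrand) * (w[1] - w[0])          # symmetry in w
    return 2.0 * np.pi * integral / n
```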

5.
The paper makes two contributions. First, we provide a formula for the exact distribution of the periodogram evaluated at any arbitrary frequency, when the sample is taken from any zero-mean stationary Gaussian process. The inadequacy of the asymptotic distribution is demonstrated through an example in which the observations are generated by a fractional Gaussian noise process. The results are then applied in deriving the exact bias of the log-periodogram regression estimator (Geweke and Porter-Hudak, 1983; Robinson, 1995). The formula is computable. Practical bounds on this bias are developed and their arithmetic mean is shown to be accurate and useful.
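The log-periodogram regression estimator whose exact bias is studied here can be sketched in a few lines of numpy; the bandwidth m = sqrt(n) is a common illustrative choice, not one the paper prescribes.

```python
import numpy as np

def gph_estimate(x, m=None):
    """Log-periodogram (GPH) regression estimate of the memory parameter d:
    regress log I(w_j) on -2*log(2*sin(w_j/2)) over the first m Fourier frequencies."""
    n = len(x)
    if m is None:
        m = int(n ** 0.5)                      # common bandwidth choice
    x = x - x.mean()
    I = np.abs(np.fft.rfft(x)[1:m + 1]) ** 2 / (2 * np.pi * n)
    w = 2 * np.pi * np.arange(1, m + 1) / n
    regressor = -2.0 * np.log(2.0 * np.sin(w / 2.0))
    X = np.column_stack([np.ones(m), regressor])
    beta, *_ = np.linalg.lstsq(X, np.log(I), rcond=None)
    return beta[1]                             # slope = estimate of d
```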

6.
《Econometric Reviews》2013,32(3):369-383
The paper makes two contributions. First, we provide a formula for the exact distribution of the periodogram evaluated at any arbitrary frequency, when the sample is taken from any zero-mean stationary Gaussian process. The inadequacy of the asymptotic distribution is demonstrated through an example in which the observations are generated by a fractional Gaussian noise process. The results are then applied in deriving the exact bias of the log-periodogram regression estimator (Geweke and Porter-Hudak, 1983; Robinson, 1995). The formula is computable. Practical bounds on this bias are developed and their arithmetic mean is shown to be accurate and useful.

7.
Maximum likelihood estimates (MLEs) for logistic regression coefficients are known to be biased in finite samples and consequently may produce misleading inferences. Bias-adjusted estimates can be calculated using the first-order asymptotic bias derived from a Taylor series expansion of the log-likelihood. Jackknifing can also be used to obtain bias-corrected estimates, but the approach is computationally intensive, requiring an additional series of iterations (steps) for each observation in the dataset. Although the one-step jackknife has been shown to be useful in logistic regression diagnostics and in the estimation of classification error rates, it does not effectively reduce bias. The two-step jackknife, however, can reduce computation in moderate-sized samples, provide estimates of dispersion and classification error, and appears to be effective in bias reduction. Another alternative, a two-step closed-form approximation, is found to be similar to the Taylor series method in certain circumstances. Monte Carlo simulations indicate that all the procedures, but particularly the multi-step jackknife, may tend to over-correct in very small samples. Comparison of the various bias correction procedures in an example from the medical literature illustrates that bias correction can have a considerable impact on inference.
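For reference, the sketch below shows the fully iterated delete-one jackknife bias correction that the abstract describes as computationally intensive (one refit per observation); it is not the one- or two-step shortcut studied in the paper.

```python
import numpy as np
from scipy.optimize import minimize

def logit_mle(X, y):
    """Logistic regression MLE by minimizing the negative log-likelihood."""
    def negloglik(beta):
        eta = X @ beta
        return np.sum(np.logaddexp(0.0, eta) - y * eta)
    return minimize(negloglik, np.zeros(X.shape[1]), method="BFGS").x

def jackknife_bias_corrected(X, y):
    """Delete-one jackknife correction: beta_J = n*beta_hat - (n-1)*mean_i(beta_(-i))."""
    n = len(y)
    beta_hat = logit_mle(X, y)
    loo = np.array([logit_mle(np.delete(X, i, axis=0), np.delete(y, i)) for i in range(n)])
    return n * beta_hat - (n - 1) * loo.mean(axis=0)
```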

8.
This paper deals with the speed of convergence to normality of the distribution of the parameter estimates considered by Whittle and Walker for stationary Gaussian random sequences. The result obtained is based on an estimate of the speed of convergence for the distribution of an integrated periodogram.

9.
10.
A common method for estimating the time-domain parameters of an autoregressive process is to use the Yule–Walker equations. Tapering has been shown intuitively and proven theoretically to reduce the bias of the periodogram in the frequency domain, but the intuition for the similar bias reduction in the time-domain estimates has been lacking. We provide insight into why tapering reduces the bias in the Yule–Walker estimates by showing them to be equivalent to a weighted least-squares problem. This leads to the derivation of an optimal taper which behaves similarly to commonly used tapers.
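A minimal sketch of tapered Yule–Walker estimation: taper the series, form the autocovariances of the tapered data, and solve the resulting Toeplitz system. The Hanning-type taper and the normalization by the sum of squared taper weights are common illustrative choices, not the optimal taper derived in the paper.

```python
import numpy as np
from scipy.linalg import solve_toeplitz

def tapered_yule_walker(x, p, taper=None):
    """Yule-Walker AR(p) estimates computed from the autocovariances of a tapered series."""
    n = len(x)
    if taper is None:
        t = np.arange(n)
        taper = 0.5 * (1 - np.cos(2 * np.pi * (t + 0.5) / n))   # Hanning-type taper
    xt = (x - x.mean()) * taper
    norm = np.sum(taper ** 2)
    c = np.array([np.sum(xt[:n - k] * xt[k:]) / norm for k in range(p + 1)])
    return solve_toeplitz(c[:p], c[1:p + 1])    # solve the Toeplitz system R phi = r
```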

11.
Time series with cyclical long memory are characterized by a spectral pole at some frequency ω between 0 and π such that the series has a persistent cycle of period 2π/ω, implying a quasi-periodic behaviour that slightly evolves with time. Accurate estimation of ω is needed for a precise determination of the characteristics of the series (e.g. for business cycle determination or signal estimation). We propose a simple iterative algorithm for estimating ω based on the maximizer of the periodogram evaluated on an increasingly finer grid of frequencies, and compare its performance with more usual estimation methods restricted to Fourier frequencies. We also apply this technique to the estimation of the frequency of the sunspot index and of the business cycle in the differenced unemployment level of the USA.
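A minimal version of such an iterative search: maximize the periodogram over the Fourier frequencies, then repeatedly re-maximize over a finer grid centred at the current maximizer. The refinement schedule below (a factor of 10 per iteration) is an illustrative assumption.

```python
import numpy as np

def periodogram_at(x, freqs):
    """Periodogram of a mean-corrected series at arbitrary frequencies."""
    n, t = len(x), np.arange(len(x))
    x = x - x.mean()
    return np.array([np.abs(np.sum(x * np.exp(-1j * w * t))) ** 2 for w in freqs]) / (2 * np.pi * n)

def refine_peak_frequency(x, n_iter=5, grid_factor=10):
    """Estimate the spectral-pole frequency by maximizing the periodogram on
    successively finer grids around the current maximizer."""
    n = len(x)
    freqs = 2 * np.pi * np.arange(1, n // 2 + 1) / n         # Fourier frequencies
    w_hat = freqs[np.argmax(periodogram_at(x, freqs))]
    width = 2 * np.pi / n                                    # current grid spacing
    for _ in range(n_iter):
        grid = np.linspace(max(w_hat - width, 1e-8), min(w_hat + width, np.pi),
                           2 * grid_factor + 1)
        w_hat = grid[np.argmax(periodogram_at(x, grid))]
        width /= grid_factor
    return w_hat
```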

12.
Time series arising in practice often have an inherently irregular sampling structure or missing values, which can arise, for example, from a faulty measuring device or from the complex time-dependent nature of the phenomenon being measured. Spectral decomposition of time series is a traditionally useful tool for data variability analysis. However, existing methods for spectral estimation often assume a regularly-sampled time series, or require modifications to cope with irregular or ‘gappy’ data. Additionally, many techniques also assume that the time series are stationary, which in the majority of cases is demonstrably not appropriate. This article addresses the topic of spectral estimation of a non-stationary time series sampled with missing data. The time series is modelled as a locally stationary wavelet process in the sense introduced by Nason et al. (J. R. Stat. Soc. B 62(2):271–292, 2000) and its realization is assumed to feature missing observations. Our work proposes an estimator (the periodogram) for the process wavelet spectrum, which copes with the missing data whilst relaxing the strong assumption of stationarity. At the centre of our construction are second generation wavelets built by means of the lifting scheme (Sweldens, Wavelet Applications in Signal and Image Processing III, Proc. SPIE, vol. 2569, pp. 68–79, 1995), designed to cope with irregular data. We investigate the theoretical properties of our proposed periodogram, and show that it can be smoothed to produce a bias-corrected spectral estimate by adopting a penalized least squares criterion. We demonstrate our method with real data and simulated examples.

13.
This paper tackles the issue of economic time-series modeling from a joint time- and frequency-domain standpoint, with the objective of estimating the latent trend-cycle component. Since time-series records are data strings over a finite time span, they read as samples of contiguous data drawn from realizations of stochastic processes aligned with the time arrow. This accounts for the interpretation of time series as time-limited signals. Economic time series (up to a disturbance term) result from latent components known as trend, cycle, and seasonality, whose generating stochastic processes are harmonizable on a finite-average-power argument. In addition, since the trend is associated with long-run regular movements and the cycle with medium-term economic fluctuations, both turn out to be band-limited components. Recognizing such a frequency-domain location permits a filter-based approach to component estimation. This is accomplished through a Toeplitz matrix operator with sinc functions as entries, mirroring the impulse response of the ideal low-pass filter. The notion of a virtual transfer function is developed and its closed-form expression derived in order to evaluate the filter's features. The paper is completed by applying this filter to quarterly data on Italian industrial production, thus shedding light on the performance of the estimation procedure.
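A bare-bones version of the filtering step: build a Toeplitz matrix from the ideal low-pass impulse response h[k] = sin(ω_c k)/(πk) and apply it to the series. The cutoff choice below (fluctuations slower than 32 quarters) is an illustrative assumption, and none of the paper's refinements (the virtual transfer function, leakage control) are reproduced.

```python
import numpy as np
from scipy.linalg import toeplitz

def sinc_lowpass_matrix(n, cutoff):
    """Toeplitz matrix whose entries are the ideal low-pass impulse response
    h[k] = sin(cutoff*k)/(pi*k), with h[0] = cutoff/pi, for a cutoff in (0, pi)."""
    k = np.arange(n)
    h = (cutoff / np.pi) * np.sinc(cutoff * k / np.pi)   # np.sinc(x) = sin(pi*x)/(pi*x)
    return toeplitz(h)

def lowpass_trend_cycle(y, period=32):
    """Extract the band of fluctuations slower than `period` observations."""
    cutoff = 2 * np.pi / period
    return sinc_lowpass_matrix(len(y), cutoff) @ (y - y.mean()) + y.mean()
```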

14.
This paper defines and studies a new class of non-stationary random processes constructed from discrete non-decimated wavelets which generalizes the Cramér (Fourier) representation of stationary time series. We define an evolutionary wavelet spectrum (EWS) which quantifies how process power varies locally over time and scale. We show how the EWS may be rigorously estimated by a smoothed wavelet periodogram and how both these quantities may be inverted to provide an estimable time-localized autocovariance. We illustrate our theory with a pedagogical example based on discrete non-decimated Haar wavelets and also a real medical time series example.

15.
Non-stationarity in bivariate time series of counts may be induced by a number of time-varying covariates affecting the bivariate responses, in which case the innovation terms of the individual series, as well as the bivariate dependence structure, become non-stationary. So far, in the existing models, the innovation terms of the individual INAR(1) series and the dependence structure are assumed to be constant even though the individual time series are non-stationary. Under this assumption, the reliability of the regression and correlation estimates is questionable. Besides, existing estimation methodologies such as conditional maximum likelihood estimation (CMLE) and composite likelihood estimation are computationally intensive. To address these issues, this paper proposes a BINAR(1) model in which the innovation series follow a bivariate Poisson distribution under some non-stationary distributional assumptions. The method of generalized quasi-likelihood (GQL) is used to estimate the regression effects, while the serial and bivariate correlations are estimated using a robust moment estimation technique. The model and estimation method are applied to simulated data. The GQL method is also compared with the CMLE, generalized method of moments (GMM) and generalized estimating equation (GEE) approaches; through simulation studies, it is shown that GQL yields more efficient estimates than GMM and equally or slightly more efficient estimates than CMLE and GEE.
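To fix ideas, here is a toy simulation of a standard BINAR(1) construction: binomial thinning of each count series plus bivariate Poisson innovations built from a common shock. The constant innovation means are a deliberate simplification; the paper's point is precisely that these means and the dependence may vary with covariates.

```python
import numpy as np

def simulate_binar1(n, alpha1, alpha2, lam1, lam2, lam0, seed=0):
    """Simulate a BINAR(1) pair X_{j,t} = alpha_j o X_{j,t-1} + R_{j,t},
    where 'o' is binomial thinning and (R_1, R_2) is bivariate Poisson via a
    common shock: R_j = Z_j + Z_0 with Z_j ~ Poisson(lam_j), Z_0 ~ Poisson(lam_0)."""
    rng = np.random.default_rng(seed)
    x = np.zeros((n, 2), dtype=int)
    for t in range(1, n):
        z0 = rng.poisson(lam0)
        r1, r2 = rng.poisson(lam1) + z0, rng.poisson(lam2) + z0
        x[t, 0] = rng.binomial(x[t - 1, 0], alpha1) + r1   # alpha o X: binomial thinning
        x[t, 1] = rng.binomial(x[t - 1, 1], alpha2) + r2
    return x
```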

16.
The modelling of discrete time series, such as binary time series, is, unlike that of continuous time series, not easy. This is because there is no unique way to model the correlation structure of repeated binary data. Some models may also yield a complicated correlation structure with narrow ranges for the correlations. In this paper, we consider a nonlinear dynamic binary time series model that provides a correlation structure which is easy to interpret, and the correlations under this model span the full −1 to 1 range. For the estimation of the parameters of this nonlinear model, we use a conditional generalized quasi-likelihood (CGQL) approach, which provides the same estimates as the well-known maximum likelihood approach. Furthermore, we consider a competing linear dynamic binary time series model and examine, through a simulation study, the performance of the CGQL approach in estimating the parameters of this linear model. The effects of model mis-specification on estimation as well as forecasting are also examined through simulations.

17.
Estimation of a nonparametric regression spectrum based on the periodogram is considered. Neither trend estimation nor smoothing of the periodogram is required. Alternatively, for cases where spectral estimation of phase shifts fails and the shift does not depend on frequency, a time domain estimator of the lag-shift is defined. Asymptotic properties of the frequency and time domain estimators are derived. Simulations and a data example illustrate the methods.
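When the shift does not depend on frequency, the time-domain alternative amounts to picking the lag that maximizes the sample cross-correlation. A minimal sketch, assuming two equal-length, jointly observed series; this generic maximizer is not necessarily the estimator whose asymptotics the paper derives.

```python
import numpy as np

def lag_shift_estimate(x, y, max_lag):
    """Time-domain lag-shift estimate for two equal-length series:
    the lag k in [-max_lag, max_lag] maximizing |sample cross-correlation of x_t and y_{t+k}|."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    lags = np.arange(-max_lag, max_lag + 1)
    cc = []
    for k in lags:
        if k >= 0:
            cc.append(np.mean(x[:len(x) - k] * y[k:]))
        else:
            cc.append(np.mean(x[-k:] * y[:len(y) + k]))
    return lags[int(np.argmax(np.abs(cc)))]
```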

18.
In this article, we assess Bayesian estimation and prediction using the integrated nested Laplace approximation (INLA) on a stochastic volatility (SV) model. This was performed through a Monte Carlo study with 1,000 simulated time series. To evaluate the estimation method, two criteria were considered: the bias and the square root of the mean square error (smse). The criteria used for prediction are the one-step-ahead forecast of volatility and the one-day Value at Risk (VaR). The main findings are that the INLA approximations are fairly accurate and relatively robust to the choice of prior distribution on the persistence parameter. Additionally, VaR estimates are computed and compared for three financial return index series.
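The prediction criterion is easy to make concrete: given a one-step-ahead volatility forecast from the SV model, the one-day VaR at level α under conditional normality is read off the forecast return distribution. A tiny sketch; the 1.2% daily volatility is a made-up input, and the INLA fitting step that would produce it is not shown.

```python
import numpy as np
from scipy.stats import norm

def one_day_var(sigma_forecast, alpha=0.05, mu=0.0):
    """One-day Value at Risk for a return assumed N(mu, sigma^2), given the
    one-step-ahead volatility forecast: the loss exceeded with probability alpha."""
    return -(mu + sigma_forecast * norm.ppf(alpha))

# e.g. a 5% one-day VaR with a forecast daily volatility of 1.2%
print(one_day_var(0.012, alpha=0.05))          # about 0.0197, i.e. 1.97% of the position
```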

19.
Cut-off sampling consists of deliberately excluding a set of units from possible selection in a sample, for example if the contribution of the excluded units to the total is small or if the inclusion of these units in the sample involves high costs. If the characteristics of interest of the excluded units differ from those of the rest of the population, the use of naïve estimators may result in highly biased estimates. In this paper, we discuss the use of auxiliary information to reduce the bias by means of calibration and balanced sampling. We show that the use of the available auxiliary information related to both the variable of interest and the probability of being excluded enables us to reduce the potential bias. A short numerical study supports our findings.
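A minimal sketch of the simplest calibration fix: scale the design-weighted total from the sampled (non-excluded) part so that the calibrated weights reproduce a known auxiliary total that also covers the excluded units. This single-auxiliary ratio form is an illustrative special case of the calibration and balanced-sampling tools discussed in the paper.

```python
import numpy as np

def ratio_calibrated_total(y_s, x_s, w_s, x_total):
    """Ratio-calibrated estimate of the population total of y.

    y_s, x_s: study and auxiliary values for the sampled units;
    w_s: design weights; x_total: known auxiliary total for the whole
    population, including the cut-off (excluded) units."""
    t_y = np.sum(w_s * y_s)                 # design-weighted total from the sampled part
    t_x = np.sum(w_s * x_s)
    return t_y * (x_total / t_x)            # calibration on the single auxiliary x
```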

20.
In Monte Carlo studies we investigate unit root tests in the tradition of Dickey and Fuller (1979). In the case of positively autocorrelated MA(1) residuals, their experimental power is extremely poor. Next, we compare different versions of periodogram regression suggested in the literature. Their experimental behaviour is investigated with fractionally integrated processes. It is demonstrated how unit root tests may be based on periodogram regression. There is simulation evidence that those tests may do better in terms of power than the autoregressive tests, especially when testing ARMA(1,1) series against a linear time trend.
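For reference, the autoregressive benchmark in such comparisons is the Dickey–Fuller regression of Δy_t on a constant, a linear trend, and y_{t−1}, with the t-statistic of the y_{t−1} coefficient compared against Dickey–Fuller (not normal) critical values. A minimal numpy sketch of that test statistic follows; the periodogram-regression competitors follow the log-periodogram idea sketched under item 5.

```python
import numpy as np

def dickey_fuller_tstat(y, trend=True):
    """Dickey-Fuller regression Delta y_t = a + b*t + rho*y_{t-1} + e_t;
    returns the t-statistic of rho (compare with DF critical values)."""
    dy = np.diff(y)
    ylag = y[:-1]
    n = len(dy)
    cols = [np.ones(n), ylag]
    if trend:
        cols.insert(1, np.arange(1, n + 1, dtype=float))
    X = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    sigma2 = resid @ resid / (n - X.shape[1])
    cov = sigma2 * np.linalg.inv(X.T @ X)
    return beta[-1] / np.sqrt(cov[-1, -1])     # y_{t-1} is the last column
```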
