Similar Articles
20 similar articles found
1.
Comparing two time series is an important problem in many applications. In this paper, a computational bootstrap procedure is proposed to test whether two dependent stationary time series have the same autocovariance structures. The blocks-of-blocks bootstrap on the bivariate time series is employed to estimate the covariance matrix needed to construct the proposed test statistic. Without much additional effort, the bootstrap critical values can also be computed as a byproduct of the same bootstrap procedure. The asymptotic distribution of the test statistic under the null hypothesis is obtained. A simulation study is conducted to examine the finite-sample performance of the test. The simulation results show that the proposed procedure with bootstrap critical values performs well empirically and is especially useful when the time series are short and non-normal. The proposed test is applied to a real data set to understand the relationship between the input and output signals of a chemical process.
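As a rough illustration of the idea only (a plain moving-block bootstrap rather than the paper's blocks-of-blocks procedure), the sketch below compares the sample autocovariances of two dependent series and obtains a bootstrap p-value by resampling blocks of the bivariate series jointly; the function names, block length, and the centring of the bootstrap statistic are illustrative choices.

```python
import numpy as np

def sample_acov(x, max_lag):
    """Sample autocovariances gamma_hat(0..max_lag) of a demeaned series."""
    x = x - x.mean()
    n = len(x)
    return np.array([np.sum(x[:n - k] * x[k:]) / n for k in range(max_lag + 1)])

def block_bootstrap_acov_test(x, y, max_lag=5, block_len=10, n_boot=500, seed=0):
    """Bootstrap p-value for H0: the two series share the same autocovariances.

    Resamples blocks of the *bivariate* series (x_t, y_t) jointly so that the
    dependence between the two series is preserved in each bootstrap replicate.
    """
    rng = np.random.default_rng(seed)
    n = len(x)
    diff_obs = sample_acov(x, max_lag) - sample_acov(y, max_lag)
    stat_obs = np.sum(diff_obs ** 2)

    n_blocks = int(np.ceil(n / block_len))
    stats = np.empty(n_boot)
    for b in range(n_boot):
        starts = rng.integers(0, n - block_len + 1, size=n_blocks)
        idx = np.concatenate([np.arange(s, s + block_len) for s in starts])[:n]
        xb, yb = x[idx], y[idx]
        diff_b = sample_acov(xb, max_lag) - sample_acov(yb, max_lag)
        # centre at the observed difference to mimic the null distribution
        stats[b] = np.sum((diff_b - diff_obs) ** 2)
    return stat_obs, np.mean(stats >= stat_obs)

# toy usage: two AR(1) series with the same autocovariance structure
rng = np.random.default_rng(1)
e1, e2 = rng.standard_normal(200), rng.standard_normal(200)
x = np.zeros(200); y = np.zeros(200)
for t in range(1, 200):
    x[t] = 0.5 * x[t - 1] + e1[t]
    y[t] = 0.5 * y[t - 1] + e2[t]
stat, pval = block_bootstrap_acov_test(x, y)
```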

2.
In astronomy, multiple images are frequently obtained at the same sky position for follow-up co-addition, as this helps one go deeper and look for fainter objects. With large-scale panchromatic synoptic surveys becoming more common, image co-addition has become even more necessary, as new observations are compared with a co-added fiducial sky in real time. Standard co-addition techniques include straight averages, variance-weighted averages, medians, etc. A more sophisticated nonlinear-response chi-square method is also used when the data are known to be background-noise limited and the point spread function is homogenized across all channels. A more robust object detection technique is described, capable of detecting faint sources, even those not seen at all epochs, which would normally be smoothed out by traditional methods. The analysis at each pixel is based on a formula similar to the Mahalanobis distance.
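A minimal sketch of a Mahalanobis-style per-pixel statistic over an image stack, assuming independent Gaussian background noise with known per-epoch standard deviations (a diagonal-covariance simplification of the general Mahalanobis distance); the function name and toy data are illustrative only, not the technique described in the paper.

```python
import numpy as np

def pixel_detection_map(images, sigmas):
    """Per-pixel chi-square-like statistic over a stack of epochs.

    images : array of shape (n_epochs, ny, nx), background-subtracted frames
    sigmas : per-epoch noise standard deviations, shape (n_epochs,)
    Returns a map in which large values flag pixels inconsistent with pure noise,
    even if the source is only present at a few epochs.
    """
    z = images / sigmas[:, None, None]          # standardize each epoch
    return np.sum(z ** 2, axis=0)               # sum of squared deviations per pixel

# toy usage: a faint source present in only 2 of 10 epochs
rng = np.random.default_rng(0)
stack = rng.normal(0.0, 1.0, size=(10, 64, 64))
stack[3, 30, 30] += 6.0
stack[7, 30, 30] += 6.0
det = pixel_detection_map(stack, np.ones(10))
```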

3.
Time series data observed at unequal time intervals (irregular data) occur quite often, and this usually poses problems for their analysis. A recursive form of the exponentially smoothed estimate is proposed here for a nonlinear model with irregularly observed data, and its asymptotic properties are discussed. An alternative smoother to that of Wright (1985) is also derived. A numerical comparison is made between the resulting estimates and other smoothed estimates.
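The sketch below shows one simple way to exponentially smooth irregularly observed data, using a discount factor that depends on the elapsed time between observations (in the spirit of Wright's smoother). It is not the recursive estimator for the nonlinear model proposed in the paper, and the parameter names are illustrative.

```python
import numpy as np

def irregular_exp_smooth(times, values, alpha=0.3):
    """Exponentially weighted level estimate for irregularly spaced observations.

    The discount applied between observations depends on the elapsed time:
    beta_i = (1 - alpha) ** (t_i - t_{i-1}), so long gaps forget more.
    """
    level = values[0]
    out = [level]
    for i in range(1, len(values)):
        gap = times[i] - times[i - 1]
        beta = (1.0 - alpha) ** gap           # time-dependent discount factor
        level = beta * level + (1.0 - beta) * values[i]
        out.append(level)
    return np.array(out)

# usage on an irregular time grid
t = np.array([0.0, 0.4, 1.5, 1.7, 3.2, 5.0])
y = np.array([1.0, 1.2, 0.8, 0.9, 1.4, 1.1])
smoothed = irregular_exp_smooth(t, y, alpha=0.3)
```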

4.
In estimating the population median, it is common to encounter estimators which are linear combinations of a small number of central observations. Sample medians, sample quasi medians, trimmed means, jackknifed (and delete‐d jackknifed) medians and jackknifed quasi medians are all familiar examples. The objective of this paper is to show that within this class the quasi medians turn out to have the best asymptotic mean squared error.
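A small sketch of quasi-medians under one common convention: the average of the two central order statistics k positions away from the sample median. Indexing conventions differ across authors, so this is an illustration rather than the paper's exact definition.

```python
import numpy as np

def quasi_median(x, k=1):
    """Quasi-median: average of the two order statistics k positions on either
    side of the sample median.

    k = 0 reproduces the ordinary sample median; small k > 0 averages a few
    central observations, which is the class of estimators discussed above.
    """
    x = np.sort(np.asarray(x))
    n = len(x)
    m = n // 2
    if n % 2 == 1:
        return 0.5 * (x[m - k] + x[m + k])
    return 0.5 * (x[m - 1 - k] + x[m + k])

sample = [3.1, 2.7, 5.0, 4.2, 3.9, 100.0, 3.3]
print(quasi_median(sample, k=0), quasi_median(sample, k=1))
```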

5.
This article examines structural change tests based on generalized empirical likelihood (GEL) methods in the time series context, allowing for dependent data. Standard structural change tests for the generalized method of moments (GMM) are adapted to the GEL context. We show that when the moment conditions are properly smoothed, these test statistics converge to the same asymptotic distribution as in the GMM case, with both known and unknown breakpoints. New test statistics specific to GEL methods, which are robust to weak identification, are also introduced. A simulation study examines the small-sample properties of the tests and reveals that the GEL-based robust tests perform well, both in detecting the presence and location of a structural change and with respect to the nature of identification.

6.
Time series arising in practice often have an inherently irregular sampling structure or missing values, which can arise, for example, from a faulty measuring device or from the complex time-dependent nature of the phenomenon. Spectral decomposition of time series is a traditionally useful tool for analysing data variability. However, existing methods for spectral estimation often assume a regularly sampled time series, or require modifications to cope with irregular or 'gappy' data. Additionally, many techniques assume that the time series is stationary, which in the majority of cases is demonstrably not appropriate. This article addresses the topic of spectral estimation for a non-stationary time series sampled with missing data. The time series is modelled as a locally stationary wavelet process in the sense introduced by Nason et al. (J. R. Stat. Soc. B 62(2):271–292, 2000) and its realization is assumed to feature missing observations. Our work proposes an estimator (the periodogram) for the process wavelet spectrum which copes with the missing data whilst relaxing the strong assumption of stationarity. At the centre of our construction are second-generation wavelets built by means of the lifting scheme (Sweldens, Wavelet Applications in Signal and Image Processing III, Proc. SPIE, vol. 2569, pp. 68–79, 1995), designed to cope with irregular data. We investigate the theoretical properties of our proposed periodogram and show that it can be smoothed to produce a bias-corrected spectral estimate by adopting a penalized least squares criterion. We demonstrate our method on real data and simulated examples.

7.
This paper defines and studies a new class of non-stationary random processes constructed from discrete non-decimated wavelets which generalizes the Cramér (Fourier) representation of stationary time series. We define an evolutionary wavelet spectrum (EWS) which quantifies how process power varies locally over time and scale. We show how the EWS may be rigorously estimated by a smoothed wavelet periodogram and how both these quantities may be inverted to provide an estimable time-localized autocovariance. We illustrate our theory with a pedagogical example based on discrete non-decimated Haar wavelets and also a real medical time series example.

8.
This paper presents a modified Whittaker–Henderson (WH) Method of Graduation. After giving a closed-form solution, we show that it is of practical use because it provides not only a smoothed series identical to that of the WH graduation, but also an extrapolation beyond the sample limit of current data. In addition, we introduce two other penalized least squares problems and show that they provide the same results as those of the modified WH graduation.
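The classical WH closed form is easy to illustrate: the graduated series is f = (I + λD'D)^(-1) y, where D is a difference matrix of the chosen order. The sketch below implements this classical smoother only (the paper's modification with extrapolation beyond the sample is not reproduced); λ and the difference order are illustrative parameters.

```python
import numpy as np

def wh_graduation(y, lam=100.0, order=2):
    """Whittaker-Henderson graduation via its closed-form solution.

    Minimizes sum (y - f)^2 + lam * sum (Delta^order f)^2, whose minimizer is
    f = (I + lam * D'D)^{-1} y, with D the order-th difference matrix.
    """
    n = len(y)
    D = np.diff(np.eye(n), n=order, axis=0)      # difference operator of given order
    A = np.eye(n) + lam * D.T @ D
    return np.linalg.solve(A, y)

# usage: graduate a noisy series
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 80)
y = np.sin(2 * np.pi * t) + rng.normal(scale=0.3, size=80)
f = wh_graduation(y, lam=50.0)
```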

9.
We discuss moving window techniques for fast extraction of a signal composed of monotonic trends and abrupt shifts from a noisy time series with irrelevant spikes. Running medians remove spikes and preserve shifts, but they deteriorate in trend periods. Modified trimmed mean filters use a robust scale estimate such as the median absolute deviation about the median (MAD) to select an adaptive amount of trimming. Application of robust regression, particularly of the repeated median, has been suggested for improving upon the median in trend periods. We combine these ideas and construct modified filters based on the repeated median offering better shift preservation. All these filters are compared w.r.t. fundamental analytical properties and in basic data situations. An algorithm for the update of the MAD running in time O(log n) for window width n is presented as well.
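As a sketch of the repeated-median idea used above (without the modified trimming or the O(log n) MAD update), the following moving-window filter estimates a robust slope and level in each window; the window width is an illustrative parameter.

```python
import numpy as np

def repeated_median_filter(x, width=11):
    """Moving-window repeated median: a robust level estimate per window.

    Within each window the repeated-median slope is
    beta = med_i med_{j != i} (x_i - x_j) / (i - j),
    and the level at the window centre is med_i (x_i - beta * (i - centre)).
    Spikes are ignored while linear trends are tracked, unlike a running median.
    """
    n = len(x)
    h = width // 2
    out = np.full(n, np.nan)
    for c in range(h, n - h):
        idx = np.arange(c - h, c + h + 1)
        w = x[idx]
        slopes = []
        for i in range(width):
            sij = [(w[i] - w[j]) / (idx[i] - idx[j]) for j in range(width) if j != i]
            slopes.append(np.median(sij))
        beta = np.median(slopes)
        out[c] = np.median(w - beta * (idx - c))
    return out

# usage: linear trend plus spikes
t = np.arange(200)
noisy = 0.05 * t + np.random.default_rng(0).normal(0, 0.5, 200)
noisy[[40, 41, 120]] += 8.0
filtered = repeated_median_filter(noisy, width=15)
```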

10.
The usual covariance estimates for data X_0, …, X_{n-1} from a stationary zero-mean stochastic process {X_t} are the sample covariances c(k) = n^{-1} Σ_t X_t X_{t+k}. Both direct and resampling approaches are used to estimate the variance of the sample covariances. This paper compares the performance of these variance estimates. Using a direct approach, we show that a consistent windowed periodogram estimate of the spectrum is more effective than the periodogram itself. A frequency domain bootstrap for time series is proposed and analyzed, and we introduce a frequency domain version of the jackknife that is shown to be asymptotically unbiased and consistent for Gaussian processes. Monte Carlo techniques show that the time domain jackknife and subseries method cannot be recommended. For a Gaussian underlying series a direct approach using a smoothed periodogram is best; for a non-Gaussian series the frequency domain bootstrap appears preferable. For small samples, the bootstraps are dangerous: both the direct approach and frequency domain jackknife are better.
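A brief sketch of the two ingredients named here, assuming a zero-mean series: the sample autocovariances and a periodogram smoothed over neighbouring frequencies as a consistent spectrum estimate. The smoothing kernel and bandwidth are illustrative choices, not the paper's.

```python
import numpy as np

def sample_autocov(x, max_lag):
    """Sample autocovariances c(k) = (1/n) sum_t x_t x_{t+k} for a zero-mean series."""
    n = len(x)
    return np.array([np.dot(x[:n - k], x[k:]) / n for k in range(max_lag + 1)])

def smoothed_periodogram(x, bandwidth=5):
    """Periodogram smoothed by a simple moving average over frequencies.

    A consistent spectrum estimate of this kind is the ingredient used by the
    'direct approach' to the variance of the sample covariances.
    """
    n = len(x)
    per = np.abs(np.fft.rfft(x)) ** 2 / n          # raw periodogram
    kernel = np.ones(2 * bandwidth + 1) / (2 * bandwidth + 1)
    return np.convolve(per, kernel, mode="same")

rng = np.random.default_rng(0)
x = rng.standard_normal(512)
c = sample_autocov(x, max_lag=10)
f_hat = smoothed_periodogram(x, bandwidth=8)
```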

11.
Some nonparametric methods have been proposed to compare survival medians. Most of them rely on the asymptotic null distribution to compute the p-value. However, for small to moderate sample sizes, those tests may have an inflated Type I error rate, which limits their applicability. In this article, we propose a new nonparametric test that uses the bootstrap to estimate the mean and variance of the sample median. Through comprehensive simulation, we show that the proposed approach controls the Type I error rate well. A real data application illustrates the use of the new test.
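A minimal sketch of the bootstrap ingredient only, ignoring censoring (so this is not the survival-data test itself): the bootstrap mean and variance of a sample median, combined into a z-type comparison of two groups. The function names and the normal-style combination are assumptions for illustration.

```python
import numpy as np

def bootstrap_median_summary(x, n_boot=2000, seed=0):
    """Bootstrap mean and variance of the sample median of one group.

    Resampling with replacement avoids relying on the asymptotic null
    distribution, which is what helps small-sample behaviour.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x)
    meds = np.array([np.median(rng.choice(x, size=len(x), replace=True))
                     for _ in range(n_boot)])
    return meds.mean(), meds.var(ddof=1)

# a two-group z-type comparison built from the bootstrap quantities
g1 = np.random.default_rng(1).exponential(2.0, size=30)
g2 = np.random.default_rng(2).exponential(2.5, size=25)
m1, v1 = bootstrap_median_summary(g1)
m2, v2 = bootstrap_median_summary(g2)
z = (m1 - m2) / np.sqrt(v1 + v2)
```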

12.
This paper suggests a simple nonmetric method for smoothing time series data. The smoothed series is the closest polytone curve to the presmoothed series in terms of least sum of absolute deviations. The method is exemplified on several seasonally adjusted series in order to estimate their trend component.

13.
Permutation tests based on medians are examined for pairwise comparison of scale. Tests that have been found in the literature to be effective for comparing scale for two groups are extended to the case of all pairwise comparisons, using the Tukey-type adjustment of Richter and McCann [Multiple comparison of medians using permutation tests. J Mod Appl Stat Methods. 2007;6(2):399–412] to guarantee strong Type I error rate control. Power and Type I error rate estimates are computed using simulated data. A method based on the ratio of deviances performed best and appears to be the best overall test.
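A two-group sketch of a ratio-of-deviances permutation test for scale, using mean absolute deviations from each group's median as the "deviances"; the pairwise extension with the Tukey-type adjustment is not shown, and the exact form of the statistic is an illustrative assumption rather than the paper's definition.

```python
import numpy as np

def perm_test_scale(x, y, n_perm=5000, seed=0):
    """Two-sample permutation test of scale using a ratio-of-deviances statistic.

    The statistic is the ratio of mean absolute deviations from each group's
    median; permuting the group labels gives its null distribution.
    """
    rng = np.random.default_rng(seed)

    def ratio(a, b):
        da = np.mean(np.abs(a - np.median(a)))
        db = np.mean(np.abs(b - np.median(b)))
        return max(da, db) / min(da, db)          # >= 1, large under unequal scale

    obs = ratio(x, y)
    pooled = np.concatenate([x, y])
    n_x = len(x)
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)
        if ratio(perm[:n_x], perm[n_x:]) >= obs:
            count += 1
    return obs, count / n_perm

rng = np.random.default_rng(3)
a = rng.normal(0, 1, 40)
b = rng.normal(0, 2, 40)
stat, pval = perm_test_scale(a, b)
```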

14.
We first compare two common data-generating systems used in unit root testing, and then use Monte Carlo experiments to study the specification of the test equation for time series unit root tests. The study finds that when the DF and DF-GLS tests are applied, misspecification of the test equation directly affects the test results. In particular, when deciding whether a time series is trend-stationary, a random walk with a linear time trend, or a random walk with a second-order polynomial time trend, a misspecified test equation can easily lead a trend-stationary process to be misclassified as non-stationary.
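The specification issue can be illustrated with the augmented Dickey-Fuller test in statsmodels, whose regression argument selects the deterministic terms in the test equation ('c', 'ct', 'ctt'). The simulated trend-stationary series and parameter values below are illustrative, and the DF-GLS test is not shown.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

# A trend-stationary series: deterministic linear trend plus stationary AR(1) noise.
rng = np.random.default_rng(0)
n = 300
e = rng.standard_normal(n)
u = np.zeros(n)
for t in range(1, n):
    u[t] = 0.5 * u[t - 1] + e[t]
y = 0.05 * np.arange(n) + u

# Test-equation specification: 'c' (constant only), 'ct' (constant + trend),
# 'ctt' (constant + linear + quadratic trend).
for spec in ("c", "ct", "ctt"):
    stat, pvalue, *_ = adfuller(y, regression=spec)
    print(f"regression='{spec}': ADF stat = {stat:.2f}, p-value = {pvalue:.3f}")
# Omitting the trend term ('c') will typically fail to reject the unit root for
# this trend-stationary series, illustrating the misspecification effect above.
```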

15.
The problem of testing the equality of the medians of several populations is considered. Standard distribution-free procedures for this problem require that the populations have the same shape in order to maintain their nominal significance level, even asymptotically, under the null hypothesis of equal medians. A modification of the Kruskal-Wallis test statistic is proposed which is exactly distribution-free under the usual nonparametric assumption that the continuous populations are identical, with any shape. It is asymptotically distribution-free when the continuous populations are assumed to be symmetric with equal medians.

16.
We derive a statistical theory that provides useful asymptotic approximations to the distributions of the single inferences of filtered and smoothed probabilities, derived from time series characterized by Markov-switching dynamics. We show that the uncertainty in these probabilities diminishes when the states are separated, the variance of the shocks is low, and the time series or the regimes are persistent. As empirical illustrations of our approach, we analyze the U.S. GDP growth rates and the U.S. real interest rates. For both models, we illustrate the usefulness of the confidence intervals when identifying the business cycle phases and the interest rate regimes.

17.
In this article, we propose a new technique for constructing confidence intervals for the mean of a noisy sequence with multiple change-points. We use the weighted bootstrap to generalize the bootstrap aggregating or bagging estimator. A standard deviation formula for the bagging estimator is introduced, based on which smoothed confidence intervals are constructed. To further improve the performance of the smoothed interval for weak signals, we suggest a strategy of adaptively choosing between the percentile intervals and the smoothed intervals. A new intensity plot is proposed to visualize the pattern of the change-points. We also propose a new change-point estimator based on the intensity plot, which has superior performance in comparison with the state-of-the-art segmentation methods. The finite sample performance of the confidence intervals and the change-point estimator is evaluated through Monte Carlo studies and illustrated with a real data example.

18.
Time series smoothers estimate the level of a time series at time t as its conditional expectation given present, past and future observations, with the smoothed value depending on the estimated time series model. Alternatively, local polynomial regressions on time can be used to estimate the level, with the implied smoothed value depending on the weight function and the bandwidth in the local linear least squares fit. In this article we compare the two smoothing approaches and describe their similarities. Through simulations, we assess the increase in the mean square error that results when approximating the estimated optimal time series smoother with the local regression estimate of the level.
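A minimal sketch of the local-polynomial side of this comparison: a local linear least squares fit with a Gaussian kernel in time, whose intercept at each point is the smoothed level. The kernel and bandwidth are illustrative choices, not those used in the article.

```python
import numpy as np

def local_linear_smooth(t, y, bandwidth=5.0):
    """Local linear least squares estimate of the level at each time point.

    At each t0, observations are weighted by a Gaussian kernel in time and a
    straight line is fitted by weighted least squares; the fitted value at t0
    (the intercept) is the smoothed level.
    """
    out = np.empty_like(y, dtype=float)
    for i, t0 in enumerate(t):
        w = np.exp(-0.5 * ((t - t0) / bandwidth) ** 2)
        X = np.column_stack([np.ones_like(t, dtype=float), t - t0])
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
        out[i] = beta[0]                         # intercept = fitted value at t0
    return out

rng = np.random.default_rng(0)
t = np.arange(150, dtype=float)
y = np.sin(t / 20.0) + rng.normal(0, 0.3, size=150)
level = local_linear_smooth(t, y, bandwidth=6.0)
```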

19.
The authors examine the equivalence between penalized least squares and state space smoothing using random vectors with infinite variance. They show that despite infinite variance, many time series techniques for estimation, significance testing, and diagnostics can be used. The Kalman filter can be used to fit penalized least squares models, computing the smoothed quantities and related values. Infinite variance is equivalent to differencing to stationarity, and to adding explanatory variables. The authors examine constructs called “smoothations” which they show to be fundamental in smoothing. Applications illustrate concepts and methods.

20.
In this article, variance stabilizing filters are discussed. A new filter with nice properties is proposed which makes use of moving averages and moving standard deviations, the latter smoothed with the Hodrick-Prescott filter. This filter is compared to a GARCH-type filter. An ARIMA model is estimated for the filtered GDP series, and the parameter estimates are used in forecasting the unfiltered series. These forecasts compare well with those of ARIMA, ARFIMA, and GARCH models based on the unfiltered data. The filter does not color white noise.
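As a rough sketch of a filter in this spirit (not the article's exact construction), the series is centred by a moving average and divided by a moving standard deviation whose Hodrick-Prescott trend is used as the smoothed scale; the window length and HP smoothing parameter below are illustrative assumptions.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.filters.hp_filter import hpfilter

def variance_stabilize(x, window=12, lamb=1600):
    """Scale a series by a smoothed moving standard deviation.

    The moving standard deviation is smoothed with the Hodrick-Prescott filter
    (its trend component is kept) before dividing, so that the rescaling itself
    is not noisy.
    """
    s = pd.Series(x)
    mov_avg = s.rolling(window, center=True).mean()
    mov_std = s.rolling(window, center=True).std()
    valid = mov_std.dropna()
    _, smooth_std = hpfilter(valid, lamb=lamb)        # HP trend of the moving std
    return (s - mov_avg).loc[smooth_std.index] / smooth_std

# usage: a series whose volatility doubles halfway through
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 150), rng.normal(0, 3, 150)])
filtered = variance_stabilize(x, window=12)
```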
