Similar Documents
20 similar documents found (search time: 31 ms)
1.
In this article we consider the problem of detecting changes in level and trend in a time series model in which the number of change-points is unknown. A Bayesian stochastic search model selection approach is introduced to detect the configuration of changes in a time series. The number and positions of change-points are determined by a sequence of change-dependent parameters, which is estimated from its posterior distribution via maximum a posteriori (MAP) estimation. Markov chain Monte Carlo (MCMC) methods are used to estimate the posterior distributions of the parameters. Several real data examples, including a time series of traffic accidents and two hydrological time series, are analyzed.

2.
The most common measure of dependence between two time series is the cross-correlation function. This measure gives a complete characterization of dependence for two linear and jointly Gaussian time series, but it often fails for nonlinear and non-Gaussian time series models, such as the ARCH-type models used in finance. The cross-correlation function is a global measure of dependence. In this article, we apply to bivariate time series the nonlinear local measure of dependence called local Gaussian correlation. It generally works well for nonlinear models as well, and it can distinguish between positive and negative local dependence. We construct confidence intervals for the local Gaussian correlation and develop a test based on this measure of dependence. Asymptotic properties are derived for the parameter estimates, for the test functional, and for a block bootstrap procedure. For both simulated and financial index data, we construct confidence intervals and compare the proposed test with tests based on the ordinary correlation and on the Brownian distance correlation. Financial indexes are examined over a long time period and their local joint behavior, including tail behavior, is analyzed prior to, during, and after the financial crisis. Supplementary material for this article is available online.
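
As a point of reference for one of the comparison measures named in this abstract, here is a minimal sketch of the (Brownian) distance correlation between two samples, implemented from scratch with NumPy via double-centred distance matrices. The function name `distance_correlation` and the simulated heteroscedastic data are illustrative assumptions, not the authors' code or data.

```python
import numpy as np

def distance_correlation(x, y):
    """Sample (Brownian) distance correlation of two 1-d samples."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    a = np.abs(x[:, None] - x[None, :])                   # pairwise distances in x
    b = np.abs(y[:, None] - y[None, :])                   # pairwise distances in y
    A = a - a.mean(0) - a.mean(1)[:, None] + a.mean()     # double centring
    B = b - b.mean(0) - b.mean(1)[:, None] + b.mean()
    dcov2 = (A * B).mean()
    dvar_x, dvar_y = (A * A).mean(), (B * B).mean()
    return np.sqrt(dcov2 / np.sqrt(dvar_x * dvar_y))

# Toy example: two series that are uncorrelated but clearly dependent.
rng = np.random.default_rng(0)
x = rng.standard_normal(500)
y = x**2 + 0.5 * rng.standard_normal(500)
print("ordinary correlation:", np.corrcoef(x, y)[0, 1])
print("distance correlation:", distance_correlation(x, y))
```

Unlike the ordinary correlation, the distance correlation is (essentially) zero only under independence, which is why it serves as a natural benchmark for the local Gaussian correlation test described above.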

3.
The correlation curve is a measure of local correlation between two random variables X and Y at a point X = x in the support of X. This article studies this local measure for univariate and bivariate stationary stochastic processes using time series theory. We propose local polynomial estimators for time series and examine their consistency both theoretically and through simulations, using different series lengths, bandwidths, and kernels, as well as different lags and model configurations. Applications are also presented using the daily returns of two financial series.
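
For intuition, a minimal sketch of a correlation curve in the Bjerve–Doksum sense, ρ(x) = σ_X β(x) / sqrt(σ_X² β(x)² + σ²(x)), where β(x) is the local regression slope and σ²(x) the local residual variance, both estimated here by a simple local-linear weighted least squares fit. The Gaussian kernel, bandwidth, and simulated "return" series below are illustrative assumptions, not the authors' exact estimator or data.

```python
import numpy as np

def local_slope_and_var(x, y, x0, h):
    """Local-linear slope beta(x0) and residual variance sigma^2(x0),
    Gaussian kernel weights with bandwidth h."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    X = np.column_stack([np.ones_like(x), x - x0])
    W = np.diag(w)
    coef = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)      # [intercept, slope]
    resid = y - X @ coef
    sigma2 = np.sum(w * resid**2) / np.sum(w)
    return coef[1], sigma2

def correlation_curve(x, y, grid, h):
    sx = np.std(x)
    rho = []
    for x0 in grid:
        beta, sigma2 = local_slope_and_var(x, y, x0, h)
        rho.append(sx * beta / np.sqrt(sx**2 * beta**2 + sigma2))
    return np.array(rho)

# Illustrative use on two simulated series with a heteroscedastic link.
rng = np.random.default_rng(1)
x = rng.standard_normal(1000)
y = 0.5 * x + 0.3 * np.abs(x) * rng.standard_normal(1000)
grid = np.linspace(-2, 2, 9)
print(correlation_curve(x, y, grid, h=0.4).round(3))
```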

4.
Given a multiple time series sharing common autoregressive patterns, we estimate an additive model. The autoregressive component and the individual random effects are estimated by combining maximum likelihood estimation and best linear unbiased prediction in a backfitting algorithm. A simulation study illustrates that the estimation procedure provides an alternative to the Arellano–Bond generalized method of moments (GMM) estimator of the panel model when T > N, where the Arellano–Bond estimator generally diverges. The estimator has high predictive ability. In cases where T ≤ N, the backfitting estimator is at least comparable to the Arellano–Bond estimator.

5.
Dimension reduction in regression is an efficient method of overcoming the curse of dimensionality in non-parametric regression. Motivated by recent developments in dimension reduction for time series, this paper performs an empirical extension of the central mean subspace for time series to a single-input transfer function model. Here, we use the central mean subspace as a tool for dimension reduction in bivariate time series when the dimension and lag are known, and estimate the central mean subspace through the Nadaraya–Watson kernel smoother. Furthermore, we develop a data-dependent approach based on a modified Schwarz Bayesian criterion to estimate the unknown dimension and lag. Finally, we show that the approach works well for bivariate time series using an expository demonstration, two simulations, and a real data analysis of the El Niño and fish population series.
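
A small illustration, under stated assumptions, of the two ingredients the abstract names: a Nadaraya–Watson smoother applied to a one-dimensional index b'z of lagged inputs, and a crude grid search over candidate directions b. The particular lags, grid, bandwidth, and leave-one-out loss below are hypothetical choices for the sketch, not the paper's estimation procedure.

```python
import numpy as np

def nw_smoother(u, y, u0, h):
    """Nadaraya-Watson estimate of E[y | u = u0] with a Gaussian kernel."""
    w = np.exp(-0.5 * ((u - u0) / h) ** 2)
    return np.sum(w * y) / np.sum(w)

# Simulated transfer-function-style data: y_t depends on a single linear
# combination of (x_t, x_{t-1}), i.e., a one-dimensional central mean subspace.
rng = np.random.default_rng(2)
x = rng.standard_normal(600)
z = np.column_stack([x[1:], x[:-1]])               # (x_t, x_{t-1})
y = np.sin(1.2 * z[:, 0] - 0.8 * z[:, 1]) + 0.2 * rng.standard_normal(len(z))

# Grid search over direction angles; score = leave-one-out squared error.
best = None
for theta in np.linspace(0, np.pi, 60, endpoint=False):
    b = np.array([np.cos(theta), np.sin(theta)])
    u = z @ b
    fitted = np.array([nw_smoother(np.delete(u, i), np.delete(y, i), u[i], 0.3)
                       for i in range(len(u))])
    score = np.mean((y - fitted) ** 2)
    if best is None or score < best[0]:
        best = (score, b)
print("estimated direction (up to sign and scale):", best[1].round(3))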

6.
This paper suggests a simple nonmetric method for smoothing time series data. The smoothed series is the closest polytone curve to the presmoothed series in terms of least sum of absolute deviations. The method is exemplified on several seasonally adjusted series in order to estimate their trend component.

7.
We consider the specific transformation of a Wiener process {X(t), t ≥ 0} in the presence of an absorbing barrier a that results when this process is “time-locked” with respect to its first passage time T_a through a criterion level a, and the evolution of X(t) is considered backwards (retrospectively) from T_a. Formally, we study the random variables defined by Y(t) ≡ X(T_a − t) and derive explicit results for their density and mean, and also for their asymptotic forms. We discuss how our results can aid interpretations of time series “response-locked” to their times of crossing a criterion level.
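
A minimal Monte Carlo sketch of the construction Y(t) = X(T_a − t): simulate discrete Wiener paths, record the first passage time over the level a, and average the path read backwards from that crossing. The step size, level, horizon, and the decision to discard paths that cross too early or not at all are arbitrary illustrative choices, not the authors' analytical results.

```python
import numpy as np

rng = np.random.default_rng(3)
dt, a, n_steps, n_paths, back = 0.01, 1.0, 5000, 2000, 100
lags = np.arange(1, back + 1)                      # backward lags, in steps
samples = []                                       # Y(t) = X(T_a - t), one row per path

for _ in range(n_paths):
    x = np.concatenate([[0.0],
                        np.cumsum(np.sqrt(dt) * rng.standard_normal(n_steps))])
    hit = np.argmax(x >= a)                        # first index where X >= a
    if x[hit] < a or hit < back:                   # never crossed, or crossed too early
        continue
    samples.append(x[hit - lags])                  # path read backwards from T_a

samples = np.array(samples)
mean_back = samples.mean(axis=0)                   # Monte Carlo estimate of E[Y(t)]
print("E[Y(t)] at backward times", (lags[:5] * dt).round(2), "->", mean_back[:5].round(3))
```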

8.
Two structural time series models for annual observations are constructed in terms of trend, cycle, and irregular components. The models are then estimated via the Kalman filter using data on five U.S. macroeconomic time series. The results provide some interesting insights into the dynamic structure of the series, particularly with respect to cyclical behavior. At the same time, they illustrate the development of a model selection strategy for structural time series models.

9.
Many time series encountered in practice are nonstationary and are often generated by a process with a unit root. Because of the way data are collected or the practice of researchers, time series used in analysis and modeling are frequently obtained through temporal aggregation. As a result, the series used in testing for a unit root are often time series aggregates. In this paper, we study the effects of the use of aggregate time series on the Dickey–Fuller test for a unit root. We start by deriving a proper model for the aggregate series. Based on this model, we find the limiting distributions of the test statistics and illustrate how the tests are affected by the use of aggregate time series. The results show that those distributions shift to the right and that this effect increases with the order of aggregation, causing a strong impact both on the empirical significance level and on the power of the test. To correct this problem, we present tables of critical points appropriate for the tests based on aggregate time series and demonstrate their adequacy. Examples illustrate the conclusions of our analysis.
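
A small simulation sketch of the phenomenon described above: random walks are aggregated into non-overlapping sums of order m, and an off-the-shelf Dickey–Fuller test (statsmodels' `adfuller` with zero lags) is applied with its standard critical values. The sample sizes, aggregation order m = 3, and use of `adfuller` are illustrative assumptions; the paper's corrected critical values are not reproduced here.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(4)
n_rep, n_basic, m, level = 500, 600, 3, 0.05       # m = order of aggregation

def rejection_rate(aggregate):
    rejections = 0
    for _ in range(n_rep):
        x = np.cumsum(rng.standard_normal(n_basic))         # unit-root series
        series = x.reshape(-1, m).sum(axis=1) if aggregate else x
        pval = adfuller(series, maxlag=0, regression="c")[1]
        rejections += pval < level
    return rejections / n_rep

print("empirical rejection rate, basic series:     ", rejection_rate(False))
print("empirical rejection rate, aggregated (m = 3):", rejection_rate(True))
```

Comparing the two empirical rejection rates under the unit-root null illustrates the distortion of the significance level that the abstract attributes to temporal aggregation.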

10.
In this paper we present an indirect estimation procedure for fractional (ARFIMA) time series models. The estimation method is based on an ‘incorrect’ criterion which does not directly provide a consistent estimator of the parameters of interest, but leads to correct inference by using simulations.

The main steps are the following. First, we consider an auxiliary model which can be easily estimated; specifically, we choose the finite-lag autoregressive model. Then, this model is estimated both on the observations and on simulated values drawn from the ARFIMA model associated with a given value of the parameters of interest. Finally, the latter value is calibrated so that the two estimates of the auxiliary parameters are close to each other.

In this article, we describe the estimation procedure and, by means of a Monte Carlo study, compare the performance of the indirect estimator with some alternative estimators based on the likelihood function.
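
A compact sketch of the indirect-inference idea under explicit assumptions: an ARFIMA(0, d, 0) target model, an AR(3) auxiliary model fitted by OLS, and a least-squares distance between the auxiliary estimates on the data and on a long simulated path. All tuning choices (p = 3, simulation length, optimizer, fixed simulation seed for a smooth objective) are illustrative, not the article's exact setup.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def arfima0d0(n, d, rng):
    """Simulate ARFIMA(0, d, 0) by applying the truncated MA(inf)
    binomial weights of (1 - B)^(-d) to Gaussian white noise."""
    eps = rng.standard_normal(n)
    psi = np.ones(n)
    for k in range(1, n):
        psi[k] = psi[k - 1] * (d + k - 1) / k
    return np.convolve(eps, psi)[:n]

def fit_ar(x, p):
    """OLS fit of the AR(p) auxiliary model; returns the p coefficients."""
    X = np.column_stack([x[p - j - 1:len(x) - j - 1] for j in range(p)])
    return np.linalg.lstsq(X, x[p:], rcond=None)[0]

rng = np.random.default_rng(5)
data = arfima0d0(1000, d=0.3, rng=rng)            # stand-in for observed data
beta_hat = fit_ar(data, p=3)                      # auxiliary estimate on the data

def distance(d):
    sim = arfima0d0(10000, d, np.random.default_rng(42))   # common random numbers
    return np.sum((fit_ar(sim, 3) - beta_hat) ** 2)

result = minimize_scalar(distance, bounds=(0.01, 0.49), method="bounded")
print("indirect estimate of d:", round(result.x, 3))
```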

11.
Time series regression models have been widely studied in the literature by several authors. However, statistical analysis of replicated time series regression models has received little attention. In this paper, we study the application of the quasi-least squares method to estimate the parameters in a replicated time series model with errors that follow an autoregressive process of order p. We also discuss two other established methods for estimating the parameters: maximum likelihood assuming normality and the Yule-Walker method. When the number of repeated measurements is bounded and the number of replications n goes to infinity, the regression and the autocorrelation parameters are consistent and asymptotically normal for all three methods of estimation. Essentially, the three methods estimate the regression parameter efficiently and differ in how they estimate the autocorrelation. When p = 2, we use simulations with normal data to show that the quasi-least squares estimate of the autocorrelation is clearly better than the Yule-Walker estimate, and that the former is as good as the maximum likelihood estimate over almost the entire parameter space.
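
For intuition only, a sketch of one of the benchmark estimators mentioned above: a Yule–Walker estimate of AR(2) autocorrelation parameters pooled across replicates by averaging the sample autocovariances over the n replicated series. The simulated design (T = 12, n = 300) and pooling-by-averaging step are assumptions for illustration, not the paper's quasi-least squares estimator.

```python
import numpy as np

def sample_autocov(x, max_lag):
    x = x - x.mean()
    n = len(x)
    return np.array([np.dot(x[:n - k], x[k:]) / n for k in range(max_lag + 1)])

def pooled_yule_walker(replicates, p=2):
    """Average autocovariances over replicates, then solve the Yule-Walker equations."""
    gamma = np.mean([sample_autocov(r, p) for r in replicates], axis=0)
    R = np.array([[gamma[abs(i - j)] for j in range(p)] for i in range(p)])
    return np.linalg.solve(R, gamma[1:p + 1])      # AR coefficients phi_1..phi_p

rng = np.random.default_rng(6)
phi_true = np.array([0.5, -0.3])

def simulate_ar2(T):
    x = np.zeros(T + 50)
    for t in range(2, len(x)):
        x[t] = phi_true @ x[t - 1:t - 3:-1] + rng.standard_normal()
    return x[50:]                                  # drop burn-in

replicates = [simulate_ar2(12) for _ in range(300)]   # bounded T, large n
print("pooled Yule-Walker estimate of (phi_1, phi_2):",
      pooled_yule_walker(replicates).round(3))
```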

12.
Computational methods for local regression
Local regression is a nonparametric method in which the regression surface is estimated by fitting parametric functions locally in the space of the predictors using weighted least squares in a moving fashion, similar to the way that a time series is smoothed by moving averages. Three computational methods for local regression are presented. First, fast surface fitting and evaluation is achieved by building a k-d tree in the space of the predictors, evaluating the surface at the corners of the tree, and then interpolating elsewhere by blending functions. Second, surfaces are made conditionally parametric in any proper subset of the predictors by a simple alteration of the weighting scheme. Third, degree-of-freedom quantities that would be extremely expensive to compute exactly are approximated, not by numerical methods, but through a statistical model that predicts the quantities from the trace of the hat matrix, which can be computed easily.
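
A bare-bones sketch of the core fitting step the abstract describes: a locally linear weighted least squares fit with tricube weights over a nearest-neighbour span. It omits the k-d tree, blending-function interpolation, and conditionally parametric terms, and the names `loess_point` and `span` are illustrative, not the paper's implementation.

```python
import numpy as np

def loess_point(x, y, x0, span=0.3):
    """Locally linear fit at x0 using tricube weights on the nearest span-fraction of points."""
    d = np.abs(x - x0)
    k = max(2, int(np.ceil(span * len(x))))        # number of neighbours in the window
    h = np.sort(d)[k - 1]                          # local bandwidth = k-th nearest distance
    w = np.clip(1 - (d / h) ** 3, 0, None) ** 3    # tricube weights
    X = np.column_stack([np.ones_like(x), x - x0])
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X + 1e-10 * np.eye(2), X.T @ W @ y)
    return beta[0]                                 # fitted value at x0

rng = np.random.default_rng(7)
x = np.sort(rng.uniform(0, 10, 200))
y = np.sin(x) + 0.3 * rng.standard_normal(200)
grid = np.linspace(0.5, 9.5, 10)
print(np.round([loess_point(x, y, g) for g in grid], 2))
```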

13.
We develop and exemplify application of new classes of dynamic models for time series of nonnegative counts. Our novel univariate models combine dynamic generalized linear models for binary and conditionally Poisson time series, with dynamic random effects for over-dispersion. These models estimate dynamic regression coefficients in both binary and nonzero count components. Sequential Bayesian analysis allows fast, parallel analysis of sets of decoupled time series. New multivariate models then enable information sharing in contexts where data at a more highly aggregated level provide more incisive inferences on shared patterns such as trends and seasonality. A novel multiscale approach—one new example of the concept of decouple/recouple in time series—enables information sharing across series. This incorporates cross-series linkages while insulating parallel estimation of univariate models, and hence enables scalability in the number of series. The major motivating context is supermarket sales forecasting. Detailed examples drawn from a case study in multistep forecasting of sales of a number of related items showcase forecasting of multiple series, with discussion of forecast accuracy metrics, comparisons with existing methods, and broader questions of probabilistic forecast assessment.

14.
In this article, a semiparametric time‐varying nonlinear vector autoregressive (NVAR) model is proposed to model nonlinear vector time series data. We consider a combination of parametric and nonparametric estimation approaches to estimate the NVAR function for both independent and dependent errors. We use the multivariate Taylor series expansion of the link function up to the second order which has a parametric framework as a representation of the nonlinear vector regression function. After the unknown parameters are estimated by the maximum likelihood estimation procedure, the obtained NVAR function is adjusted by a nonparametric diagonal matrix, where the proposed adjusted matrix is estimated by the nonparametric kernel estimator. The asymptotic consistency properties of the proposed estimators are established. Simulation studies are conducted to evaluate the performance of the proposed semiparametric method. A real data example on short‐run interest rates and long‐run interest rates of United States Treasury securities is analyzed to demonstrate the application of the proposed approach. The Canadian Journal of Statistics 47: 668–687; 2019 © 2019 Statistical Society of Canada

15.
This paper defines and studies a new class of non-stationary random processes constructed from discrete non-decimated wavelets which generalizes the Cramér (Fourier) representation of stationary time series. We define an evolutionary wavelet spectrum (EWS) which quantifies how process power varies locally over time and scale. We show how the EWS may be rigorously estimated by a smoothed wavelet periodogram and how both these quantities may be inverted to provide an estimable time-localized autocovariance. We illustrate our theory with a pedagogical example based on discrete non-decimated Haar wavelets and also a real medical time series example.
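
A toy sketch of two ingredients named in the abstract: non-decimated Haar detail coefficients at a few scales, and the raw (unsmoothed) wavelet periodogram, i.e., the squared coefficients, evaluated on a series whose variance changes half-way through. The normalisation convention and the simulated series are assumptions, and the EWS estimator proper also requires smoothing and a bias correction that this sketch omits.

```python
import numpy as np

def haar_ndwt_details(x, levels=4):
    """Non-decimated Haar detail coefficients for scales j = 1..levels."""
    details = []
    for j in range(1, levels + 1):
        half = 2 ** (j - 1)
        filt = np.concatenate([np.ones(half), -np.ones(half)]) / np.sqrt(2 ** j)
        d = np.convolve(x, filt, mode="same")      # one coefficient per time point
        details.append(d)
    return np.array(details)                       # shape (levels, len(x))

# Series with a time-varying second-order structure (variance jump).
rng = np.random.default_rng(8)
x = np.concatenate([rng.standard_normal(512), 3 * rng.standard_normal(512)])
I = haar_ndwt_details(x) ** 2                      # raw wavelet periodogram
print("mean raw periodogram by scale, first half :", I[:, :512].mean(axis=1).round(2))
print("mean raw periodogram by scale, second half:", I[:, 512:].mean(axis=1).round(2))
```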

16.
In this paper, we consider the auto-odds ratio function (AORF) as a measure of serial association between two different time points for a stationary time series process of categorical data. Numerical measures such as the autocorrelation function (ACF) have no meaningful interpretation unless the time series data are numerical. Instead, we use the AORF as a measure of association to study the serial dependency of the categorical time series for both ordinal and nominal categories. Biswas and Song [Discrete-valued ARMA processes. Stat Probab Lett. 2009;79(17):1884–1889] provided some results on this measure for Pegram's operator-based AR(1) process with binary responses. Here, we extend this measure to more general set-ups, i.e., for AR(p) and MA(q) processes and for a general number of categories. We discuss how this method can effectively be used in parameter estimation and model selection. Following Weiß [Empirical measures of signed serial dependence in categorical time series. J Stat Comput Simul. 2011;81(4):411–429], we derive the large sample distribution of the estimator of the AORF under an independent and identically distributed (iid) set-up. Some simulation results and two categorical data examples (one ordinal and one nominal) are presented to illustrate the proposed method.
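
To make the measure concrete, a sketch of a sample auto-odds ratio at lag k for a binary series, computed from the 2×2 contingency table of pairs (X_t, X_{t+k}) with a 0.5 continuity correction. The simulated binary Markov chain and the continuity correction are illustrative choices, not the article's estimator or data.

```python
import numpy as np

def auto_odds_ratio(x, k):
    """Sample odds ratio of the 2x2 table of pairs (x_t, x_{t+k}) for a 0/1 series."""
    a, b = x[:-k], x[k:]
    n11 = np.sum((a == 1) & (b == 1)) + 0.5        # continuity correction
    n10 = np.sum((a == 1) & (b == 0)) + 0.5
    n01 = np.sum((a == 0) & (b == 1)) + 0.5
    n00 = np.sum((a == 0) & (b == 0)) + 0.5
    return (n11 * n00) / (n10 * n01)

# Binary Markov chain with positive serial dependence.
rng = np.random.default_rng(9)
x = np.zeros(2000, dtype=int)
for t in range(1, len(x)):
    p = 0.8 if x[t - 1] == 1 else 0.3              # P(X_t = 1 | X_{t-1})
    x[t] = rng.random() < p
print([round(auto_odds_ratio(x, k), 2) for k in (1, 2, 3, 5)])
```

Values above 1 indicate positive serial association at that lag, values below 1 negative association, and 1 corresponds to independence, which is what makes the AORF interpretable for categorical data where the ACF is not.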

17.
In the first part of this article, we briefly review the history of seasonal adjustment and statistical time series analysis in order to understand why seasonal adjustment methods have evolved into their present form. This review provides insight into some of the problems that must be addressed by seasonal adjustment procedures and points out that advances in modern time series analysis raise the question of whether seasonal adjustment should be performed at all. This in turn leads to a discussion in the second part of issues involved in seasonal adjustment. We state our opinions about the issues raised and review some of the work of other authors. First, we comment on reasons that have been given for doing seasonal adjustment and suggest a new possible justification. We then emphasize the need to define precisely the seasonal and nonseasonal components and offer our definitions. Finally, we discuss our criteria for evaluating seasonal adjustments. We contend that proposed criteria based on empirical comparisons of estimated components are of little value and suggest that seasonal adjustment methods should be evaluated based on whether they are consistent with the information in the observed data. This idea is illustrated with an example.

18.
In the first part of this article, we briefly review the history of seasonal adjustment and statistical time series analysis in order to understand why seasonal adjustment methods have evolved into their present form. This review provides insight into some of the problems that must be addressed by seasonal adjustment procedures and points out that advances in modern time series analysis raise the question of whether seasonal adjustment should be performed at all. This in turn leads to a discussion in the second part of issues involved in seasonal adjustment. We state our opinions about the issues raised and review some of the work of other authors. First, we comment on reasons that have been given for doing seasonal adjustment and suggest a new possible justification. We then emphasize the need to define precisely the seasonal and nonseasonal components and offer our definitions. Finally, we discuss criteria for evaluating seasonal adjustments. We contend that proposed criteria based on empirical comparisons of estimated components are of little value and suggest that seasonal adjustment methods should be evaluated based on whether they are consistent with the information in the observed data. This idea is illustrated with an example.

19.
The presented method, called Significant Non-stationarities, represents an exploratory tool for identifying significant changes in the mean, the variance, and the first-lag autocorrelation coefficient of a time series. The changes are detected on different time scales. The statistical inference for each scale is based on an accurate approximation of the probability distribution, using test statistics that are ratios of quadratic forms. No assumptions concerning the autocovariance function of the time series are made, as the dependence structure is estimated non-parametrically. The results of the analyses are summarized in significance maps showing at which time points and on which time scales significant changes in the parameters occur. The performance of the method is thoroughly studied by simulations in terms of observed significance level and power. Several examples, including a real temperature data set, are studied. The examples illustrate that it is important to carry out the analysis on several time horizons.

20.
Time series are often affected by interventions such as strikes, earthquakes, or policy changes. In the current paper, we build a practical nonparametric intervention model using the central mean subspace in time series. We estimate the central mean subspace for time series taking into account known interventions by using the Nadaraya–Watson kernel estimator. We use the modified Bayesian information criterion to estimate the unknown lag and dimension. Finally, we demonstrate that this nonparametric approach for intervened time series performs well in simulations and in a real data analysis of monthly average oxidant data.
