Similar Documents
20 similar documents found (search time: 15 ms)
1.
Time series within fields such as finance and economics are often modelled using long memory processes. Alternative studies on the same data can suggest that a series may actually contain a 'changepoint' (a point within the time series where the data-generating process has changed). The two model classes have been shown to share features, such as in their spectra, so without prior knowledge it is difficult to assess which model is more appropriate. We demonstrate that considering this problem in a time-varying environment, using the time-varying spectrum, removes this ambiguity. Using the wavelet spectrum, we then apply a classification approach to determine the more appropriate model (long memory or changepoint). Simulation results are presented across a number of models, followed by applications to stock cross-correlations and US inflation. The results indicate that the proposed classification outperforms an existing hypothesis-testing approach on a number of models and performs comparably on the others.
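For intuition, the raw wavelet periodogram underlying such a classification can be sketched with plain Haar filters; the scales, normalization and toy comparison below are illustrative assumptions, not the authors' exact construction:

```python
import numpy as np

def haar_wavelet_periodogram(x, scales=(1, 2, 4, 8)):
    """Raw wavelet periodogram: squared non-decimated Haar wavelet
    coefficients at a few dyadic scales (illustrative construction)."""
    spec = {}
    for s in scales:
        # Haar filter of support 2s, normalized to unit energy
        filt = np.concatenate([np.ones(s), -np.ones(s)]) / np.sqrt(2 * s)
        spec[s] = np.convolve(x, filt, mode="valid") ** 2
    return spec

# Toy comparison: a mean-shift series concentrates coarse-scale power
# around the break, while long-memory power is spread over time.
rng = np.random.default_rng(0)
shift_series = np.concatenate([rng.normal(0, 1, 256), rng.normal(1.5, 1, 256)])
spec = haar_wavelet_periodogram(shift_series)
```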

2.
We consider Bayesian analysis of a class of multiple changepoint models. While there are a variety of efficient ways to analyse these models when the parameters associated with each segment are independent, there are few general approaches for models where the parameters are dependent. Under the assumption that the dependence is Markov, we propose an efficient online algorithm for sampling from an approximation to the posterior distribution of the number and positions of the changepoints. In a simulation study, we show that the error introduced by the approximation is negligible. We illustrate the power of our approach by fitting piecewise polynomial models to data, under a model which allows for either continuity or discontinuity of the underlying curve at each changepoint. This method is competitive with, or outperforms, other methods for inferring curves from noisy data, and it is unique in allowing inference of the locations of discontinuities in the underlying curve.

3.
In this paper, we consider the problem of estimating a single changepoint in a parameter-driven model. The model, an extension of the Poisson regression model, accounts for serial correlation through a latent process incorporated in its mean function. Emphasis is placed on characterizing the changepoint through changes in the parameters of the model. The model is implemented fully within the Bayesian framework. We develop an RJMCMC algorithm for parameter estimation and model determination. The algorithm embeds well-devised Metropolis–Hastings procedures for estimating the changepoint and, via data augmentation, the missing values of the latent process. The methodology is illustrated using data on monthly counts of claimants collecting wage loss benefit for injuries in the workplace and an analysis of presidential uses of force in the USA.

4.
Likelihood ratio tests for a change in mean in a sequence of independent, normal random variables are based on the maximum two-sample t-statistic, where the maximum is taken over all possible changepoints. The maximum t-statistic has the undesirable characteristic that Type I errors are not uniformly distributed across possible changepoints: false positives occur more frequently near the ends of the sequence and less frequently near the middle. In this paper we describe an alternative statistic based upon a minimum p-value, where the minimum is taken over all possible changepoints. The p-value at any particular changepoint is based upon both the two-sample t-statistic at that changepoint and the probability that the maximum two-sample t-statistic is achieved there. The new statistic has a more uniform distribution of Type I errors across potential changepoints, and it compares favorably with respect to statistical power, false discovery rates, and the mean square error of changepoint estimates.
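The maximum two-sample t-statistic that these likelihood ratio tests reduce to is straightforward to compute; a minimal numpy sketch (the min_seg trimming is an assumption to keep segment variances well defined, and the paper's min-p refinement is not shown):

```python
import numpy as np

def max_two_sample_t(x, min_seg=3):
    """Maximum absolute pooled two-sample t-statistic over all changepoints."""
    n = len(x)
    best_t, best_k = -np.inf, None
    for k in range(min_seg, n - min_seg + 1):
        a, b = x[:k], x[k:]
        # pooled variance of the two segments
        sp2 = ((k - 1) * a.var(ddof=1) + (n - k - 1) * b.var(ddof=1)) / (n - 2)
        t = abs(a.mean() - b.mean()) / np.sqrt(sp2 * (1 / k + 1 / (n - k)))
        if t > best_t:
            best_t, best_k = t, k
    return best_t, best_k
```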

5.
In modelling financial return time series and time-varying volatility, the Gaussian and Student-t distributions are widely used in stochastic volatility (SV) models, though other distributions such as the Laplace distribution and the generalized error distribution (GED) are also common. This paper therefore proposes the generalized t (GT) distribution, whose special cases include the Gaussian, Student-t and Laplace distributions and the GED. Since the GT distribution is a member of the scale mixture of uniform (SMU) family of distributions, we handle it via its SMU representation. We show that this SMU form can substantially simplify the Gibbs sampler for Bayesian simulation-based computation and can provide a means of identifying outliers. In an empirical study, we adopt a GT-SV model to fit the daily returns of the exchange rates of the Australian dollar against three other currencies, using the exchange rate against the US dollar as a covariate. Model implementation relies on Bayesian Markov chain Monte Carlo algorithms using the WinBUGS package.

6.
This work presents advanced computational aspects of a new method for changepoint detection on spatio-temporal point process data. We summarize the methodology, based on building a Bayesian hierarchical model for the data and declaring prior conjectures on the number and positions of the changepoints, and show how to take decisions regarding the acceptance of potential changepoints. The focus of this work is on choosing an approach that detects the correct changepoint and delivers smooth, reliable estimates in feasible computational time; we propose Bayesian P-splines as a suitable tool for managing spatial variation, from both a computational and a model-fitting perspective. The main computational challenges are outlined, and a solution involving parallel computing in R is proposed and tested in a simulation study. An application is also presented on a data set of seismic events in Italy over the last 20 years.

7.
We consider the problem of testing the hypothesis that the correlation coefficient is stable in a sequence of n observations of independent, bivariate normal random variables against the alternative that the correlation coefficient changes after an unknown point t (t < n). We propose an estimate of the changepoint t and report power comparisons between the commonly used test for this problem and our proposed test. Some applications to finance are discussed.
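The abstract does not give the exact form of the proposed test, but a common way to scan for a correlation change is to compare Fisher-transformed correlations on either side of each candidate split; a hedged sketch of that generic approach, not the paper's statistic:

```python
import numpy as np

def corr_change_scan(x, y, min_seg=10):
    """Largest |z| difference in Fisher-transformed correlations over all
    candidate changepoints (illustrative scan, not the paper's test)."""
    n = len(x)
    best, best_t = 0.0, None
    for t in range(min_seg, n - min_seg + 1):
        r1 = np.corrcoef(x[:t], y[:t])[0, 1]
        r2 = np.corrcoef(x[t:], y[t:])[0, 1]
        # two-sample comparison of Fisher z values
        z = (np.arctanh(r1) - np.arctanh(r2)) / np.sqrt(
            1 / (t - 3) + 1 / (n - t - 3))
        if abs(z) > best:
            best, best_t = abs(z), t
    return best, best_t
```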

8.
In clinical practice, the profile of each subject's CD4 response from a longitudinal study may follow a 'broken stick' trajectory, indicating multiple phases of increase and/or decline in response. Such phases (changepoints) may be important indicators that help quantify treatment effects and improve the management of patient care. Although it is common practice to analyze complex AIDS longitudinal data using nonlinear mixed-effects (NLME) or nonparametric mixed-effects (NPME) models, estimating a changepoint within these models is challenging because of their complicated structure. In this paper, we propose a changepoint mixed-effects model with random subject-specific parameters, including the changepoint, for the analysis of longitudinal CD4 cell counts for HIV-infected subjects following highly active antiretroviral treatment. The longitudinal CD4 data in this study may exhibit departures from symmetry, may have missing observations for various reasons that are likely to be non-ignorable (missingness may be related to the missing values), and may be censored at the time a subject goes off study-treatment, a potentially informative dropout mechanism. Inferential procedures become dramatically more complicated when longitudinal CD4 data with asymmetry (skewness), incompleteness and informative dropout are observed in conjunction with an unknown changepoint. Our objective is to address the simultaneous impact of skewness, missingness and informative censoring by jointly modeling the CD4 response and dropout time processes under a Bayesian framework. The method is illustrated using a real AIDS data set to compare potential models under various scenarios, and some results of interest are presented.

9.
In this article, we assess Bayesian estimation and prediction using the integrated nested Laplace approximation (INLA) on a stochastic volatility (SV) model. This was performed through a Monte Carlo study with 1,000 simulated time series. To evaluate the estimation method, two criteria were considered: the bias and the square root of the mean square error (smse). The criteria used for prediction are the one-step-ahead forecast of volatility and the one-day Value at Risk (VaR). The main findings are that the INLA approximations are fairly accurate and relatively robust to the choice of prior distribution on the persistence parameter. Additionally, VaR estimates are computed and compared for the returns of three financial time series indexes.
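The one-day VaR criterion follows directly from a volatility forecast once a conditional return distribution is assumed; a minimal sketch under an assumed zero-mean Gaussian return (the SV model's actual predictive distribution would replace the normal quantile):

```python
from scipy.stats import norm

def one_day_var(sigma_forecast, alpha=0.05, position=1.0):
    """One-day Value at Risk for a unit position, assuming a conditionally
    Gaussian return with zero mean (illustrative assumption)."""
    return -position * norm.ppf(alpha) * sigma_forecast

# e.g. a 1.2% volatility forecast gives a 95% VaR of about 1.97% of position
var_95 = one_day_var(sigma_forecast=0.012, alpha=0.05)
```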

10.
We investigate simulation methodology for Bayesian inference in Lévy-driven stochastic volatility (SV) models. Typically, Bayesian inference for such models is performed using Markov chain Monte Carlo (MCMC); this is often a challenging task. Sequential Monte Carlo (SMC) samplers can improve over MCMC, but involve many user-set parameters. We develop a fully automated SMC algorithm, which substantially improves over the standard MCMC methods in the literature. To illustrate our methodology, we look at a model comprising a Heston model with an independent, additive variance gamma process in the returns equation. The driving gamma process can capture the stylized behaviour of many financial time series, and a discretized version, fit in a Bayesian manner, has been found to be very useful for modelling equity data. We demonstrate that it is possible to draw exact inference, in the sense of no time-discretization error, from the Bayesian SV model.
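The variance gamma process in the returns equation can be simulated exactly by subordinating Brownian motion to a gamma clock; a short sketch with placeholder parameter values:

```python
import numpy as np

def simulate_vg(n_steps, dt=1/252, theta=-0.1, sigma=0.2, nu=0.2, seed=0):
    """Exact simulation of a variance gamma path by gamma subordination:
    X_t = theta * G_t + sigma * W(G_t), with G a gamma process of unit
    mean rate and variance rate nu (parameters are placeholders)."""
    rng = np.random.default_rng(seed)
    # gamma time increments with mean dt and variance nu * dt
    g = rng.gamma(shape=dt / nu, scale=nu, size=n_steps)
    dx = theta * g + sigma * np.sqrt(g) * rng.standard_normal(n_steps)
    return np.cumsum(dx)

path = simulate_vg(252)  # one year of daily increments
```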

11.
In this study, a changepoint model that can detect either a mean shift or a trend change, while accounting for autocorrelation in short time series, was investigated with simulations, and a new method is proposed. The changepoint hypotheses were tested using a likelihood ratio test. The test statistic does not follow a known distribution and depends on the length of the time series and the autocorrelation. The results imply that it is not possible to detect the autocorrelation reliably and that the estimate of the autocorrelation parameter is biased. It is therefore recommended to use critical values from the empirical distribution for a fixed autocorrelation.
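The recommendation to take critical values from the empirical distribution for a fixed autocorrelation can be implemented by simulating the AR(1) null; the sketch below uses a simple max mean-shift statistic as a stand-in for the paper's likelihood ratio statistic:

```python
import numpy as np

def max_shift_stat(x, min_seg=5):
    """Maximum mean-shift statistic over changepoints, standardized by the
    overall sample standard deviation (a crude scale choice for illustration)."""
    n = len(x)
    stats = []
    for k in range(min_seg, n - min_seg + 1):
        d = x[:k].mean() - x[k:].mean()
        s = x.std(ddof=1) * np.sqrt(1 / k + 1 / (n - k))
        stats.append(abs(d) / s)
    return max(stats)

def empirical_critical_value(n, phi, level=0.95, n_sim=2000, seed=0):
    """Empirical null critical value under AR(1) noise with fixed phi."""
    rng = np.random.default_rng(seed)
    sims = []
    for _ in range(n_sim):
        e = rng.standard_normal(n)
        x = np.empty(n)
        x[0] = e[0] / np.sqrt(1 - phi ** 2)   # stationary initial value
        for t in range(1, n):
            x[t] = phi * x[t - 1] + e[t]
        sims.append(max_shift_stat(x))
    return np.quantile(sims, level)

crit = empirical_critical_value(n=50, phi=0.3)
```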

12.
13.
A general model for changepoint problems is discussed from a nonparametric viewpoint. The test statistics introduced are based on Cramér-von Mises functionals of certain processes and are shown to converge in distribution to corresponding Gaussian functionals under the null hypothesis H0 of no change in distribution. We also demonstrate how the distribution of the limiting Gaussian functionals may be tabulated. Finally, properties of the tests under the alternative hypothesis of exactly one changepoint are studied, and some examples are given.
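A Cramér-von Mises-type changepoint statistic of this general kind compares empirical distribution functions before and after each candidate split; a minimal sketch (the k(n-k)/n^2 weighting is one common choice, not necessarily the authors'):

```python
import numpy as np

def cvm_changepoint_stat(x):
    """Max over k of a weighted Cramér-von Mises distance between the
    empirical CDFs of x[:k] and x[k:], evaluated on the pooled sample."""
    n = len(x)
    grid = np.sort(x)
    best, best_k = 0.0, None
    for k in range(2, n - 1):
        # empirical CDFs of the two segments on the pooled grid
        F1 = np.searchsorted(np.sort(x[:k]), grid, side="right") / k
        F2 = np.searchsorted(np.sort(x[k:]), grid, side="right") / (n - k)
        stat = (k * (n - k) / n ** 2) * np.mean((F1 - F2) ** 2)
        if stat > best:
            best, best_k = stat, k
    return best, best_k
```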

14.
For a segmented regression system with an unknown changepoint over two domains of a predictor, a new empirical likelihood ratio statistic is proposed to test the null hypothesis of no change. Under the null hypothesis, the proposed test statistic is shown empirically to follow a Gumbel distribution, with location and scale estimators that are robust across various parameter settings and error distributions. A power analysis is conducted to illustrate the performance of the test. Under the alternative hypothesis of a changepoint, the test statistic is used to estimate the changepoint between the two domains. A comparison of the frequency distributions of the proposed estimator and two parametric methods indicates that the proposed method is effective in capturing the true changepoint.
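For intuition, the changepoint in a two-segment regression can be profiled by ordinary least squares over a grid of candidate locations; the sketch below is a plain RSS profile, not the paper's empirical likelihood statistic:

```python
import numpy as np

def profile_changepoint(x, y, trim=3):
    """Estimate a segmented-regression changepoint by minimizing the total
    residual sum of squares of two separate straight-line fits."""
    grid = np.sort(x)[trim:-trim]      # keep a few points in each segment
    best_rss, best_c = np.inf, None
    for c in grid:
        rss = 0.0
        for m in (x <= c, x > c):
            X = np.column_stack([np.ones(m.sum()), x[m]])
            beta, *_ = np.linalg.lstsq(X, y[m], rcond=None)
            rss += np.sum((y[m] - X @ beta) ** 2)
        if rss < best_rss:
            best_rss, best_c = rss, c
    return best_c
```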

15.
A new core methodology for creating nonparametric L-quantile estimators is introduced, and three new quantile L-estimators (SV1_p, SV2_p and SV3_p) are constructed using it. Monte Carlo simulation was used to investigate the performance of the new estimators for small and large samples under the normal distribution and a variety of light- and heavy-tailed symmetric and asymmetric distributions. In most of the cases studied, the new estimators outperform the Harrell–Davis quantile estimator and the weighted average at X_([np]) quantile estimator.
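The Harrell–Davis benchmark against which the new estimators are compared is itself a simple L-estimator with Beta-distributed weights; a sketch using scipy:

```python
import numpy as np
from scipy.stats import beta

def harrell_davis(x, p):
    """Harrell–Davis estimator of the p-th quantile: a weighted sum of order
    statistics with Beta((n+1)p, (n+1)(1-p)) weights."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    a, b = (n + 1) * p, (n + 1) * (1 - p)
    edges = beta.cdf(np.arange(n + 1) / n, a, b)
    w = np.diff(edges)        # w_i = I_{i/n}(a, b) - I_{(i-1)/n}(a, b)
    return np.sum(w * x)
```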

16.
A general framework is presented for Bayesian inference of multivariate time series exhibiting long-range dependence. The series are modelled using a vector autoregressive fractionally integrated moving-average (VARFIMA) process, which can capture both short-term correlation structure and long-range dependence characteristics of the individual series, as well as interdependence and feedback relationships between the series. To facilitate a sampling-based Bayesian approach, the exact joint posterior density is derived for the parameters, in a form that is computationally simpler than direct evaluation of the likelihood, and a modified Gibbs sampling algorithm is used to generate samples from the complete conditional distribution associated with each parameter. The paper also shows how an approximate form of the joint posterior density may be used for long time series. The procedure is illustrated using sea surface temperatures measured at three locations along the central California coast. These series are believed to be interdependent due to similarities in local atmospheric conditions at the different locations, and previous studies have found that they exhibit ‘long memory’ when studied individually. The approach adopted here permits investigation of the effects on model estimation of the interdependence and feedback relationships between the series.
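The long-range dependence in each VARFIMA component enters through the fractional differencing operator (1 - B)^d, whose weights follow a simple recursion; a sketch of the filter (truncation at the sample length is an assumption):

```python
import numpy as np

def frac_diff_weights(d, n_weights):
    """Coefficients of (1 - B)^d: pi_0 = 1, pi_j = pi_{j-1} * (j - 1 - d) / j."""
    pi = np.empty(n_weights)
    pi[0] = 1.0
    for j in range(1, n_weights):
        pi[j] = pi[j - 1] * (j - 1 - d) / j
    return pi

def frac_diff(x, d):
    """Apply (1 - B)^d to a series, truncating the filter at the sample length."""
    pi = frac_diff_weights(d, len(x))
    return np.array([np.dot(pi[:t + 1], x[t::-1]) for t in range(len(x))])
```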

17.
The choice of the model framework in a regression setting depends on the nature of the data. The focus of this study is on changepoint data exhibiting three phases: incoming and outgoing phases, both linear, joined by a curved transition. Bent-cable regression is an appealing statistical tool for characterizing such trajectories, quantifying the nature of the transition between the two linear phases by modeling it as a quadratic phase of unknown width. We demonstrate that a quadratic function may not adequately describe many changepoint data sets. We then propose a generalization of the bent-cable model by relaxing the assumption of a quadratic bend. The properties of the generalized model are discussed, and a Bayesian approach for inference is proposed. The generalized model is demonstrated with applications to three data sets taken from environmental science and economics. We also compare the quadratic bent-cable, generalized bent-cable and piecewise linear models in terms of goodness of fit in analyzing both real-world and simulated data. This study suggests that the proposed generalization of the bent-cable model can be valuable in adequately describing changepoint data that exhibit either an abrupt or a gradual transition over time.
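The quadratic bend being generalized has a closed form; a sketch of the standard bent-cable mean function with changepoint tau and bend half-width gamma (the paper replaces the quadratic piece with a more flexible transition):

```python
import numpy as np

def bent_cable(t, b0, b1, b2, tau, gamma):
    """Standard bent-cable mean: linear incoming phase, quadratic bend of
    half-width gamma centred at tau, then a linear outgoing phase. The
    pieces join continuously with continuous first derivative."""
    t = np.asarray(t, dtype=float)
    q = np.where(
        t <= tau - gamma, 0.0,
        np.where(t >= tau + gamma, t - tau,
                 (t - tau + gamma) ** 2 / (4 * gamma)))
    return b0 + b1 * t + b2 * q
```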

18.
The extremogram is a useful tool for measuring extremal dependence and checking model adequacy in a time series. We define the extremogram in the spatial domain, where the data are observed on a lattice or at locations distributed as a Poisson point process in d-dimensional space. We establish a central limit theorem for the empirical spatial extremogram and show that the required conditions hold for max-moving average processes and Brown–Resnick processes, illustrating the empirical extremogram's performance via simulation. We also demonstrate its practical use with a data set on rainfall in a region of Florida and ground-level ozone in the eastern United States.
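In one dimension the empirical extremogram takes a very simple form; a sketch for a time series at a sample-quantile threshold (the spatial version in the paper replaces lags with spatial distances):

```python
import numpy as np

def extremogram(x, max_lag=20, q=0.95):
    """Empirical extremogram: estimates P(X_{t+h} > u | X_t > u) for lags
    h = 1..max_lag, with u the sample q-quantile (time-series analogue)."""
    u = np.quantile(x, q)
    exceed = x > u
    rho = np.empty(max_lag)
    for h in range(1, max_lag + 1):
        joint = np.sum(exceed[:-h] & exceed[h:])   # both t and t+h exceed u
        rho[h - 1] = joint / max(np.sum(exceed[:-h]), 1)
    return rho
```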

19.
A cohort of 300 women with breast cancer who underwent surgery is analysed using a non-homogeneous Markov process. Three states are considered: no relapse, relapse and death. As relapse times change over time, we extend previous time-homogeneous approaches to a non-homogeneous multistate process. The hazard rate functions of transitions between states increase and then decrease, suggesting that a changepoint can be considered. Piecewise Weibull distributions are introduced as transition intensity functions, and covariates corresponding to treatments are incorporated in the model multiplicatively via these functions. The likelihood function is built for a general model with k changepoints and applied to the data set; the parameters are estimated, and life-table and transition probabilities for treatments in different periods of time are given. The survival probability functions for different treatments are plotted and compared with the corresponding functions for the homogeneous model. The survival functions for the various treatment cohorts are fitted to the empirical survival functions.
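A piecewise Weibull transition intensity with a single changepoint can be written down directly; a sketch with placeholder parameters (the paper's exact parameterization, covariate effects and k-changepoint extension are not reproduced):

```python
import numpy as np

def piecewise_weibull_hazard(t, tau, a1, b1, a2, b2):
    """Weibull hazard (a/b) * (t/b)^(a-1), with shape/scale parameters
    switching at the changepoint tau (illustrative parameterization;
    evaluate at t > 0 to avoid a singularity when a < 1)."""
    t = np.asarray(t, dtype=float)
    h1 = (a1 / b1) * (t / b1) ** (a1 - 1)
    h2 = (a2 / b2) * (t / b2) ** (a2 - 1)
    return np.where(t <= tau, h1, h2)
```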

20.
A likelihood approach is considered for the problems of estimating the changepoint and other parameters in a multivariable two-phase regression. Methods for finding the maximum likelihood estimates are given for the cases where the covariance matrix is known and unknown. The distribution of the usual likelihood ratio test statistic is investigated using simulations, and a Monte Carlo approach is suggested for testing for the existence of a changepoint. Numerical illustrations are provided.
