Similar Documents
 20 similar documents found (search time: 15 ms)
1.
General autoregressive moving average (ARMA) models extend the traditional ARMA models by removing the assumptions of causality and invertibility. In contrast to the Gaussian setting, these assumptions are not required for identifiability of the model parameters under a non-Gaussian setting. We study M-estimation for general ARMA processes with infinite variance, where the distribution of the innovations is in the domain of attraction of a non-Gaussian stable law. Following the approach taken by Davis et al. (1992) and Davis (1996), we derive a functional limit theorem for random processes based on the objective function, and establish asymptotic properties of the M-estimator. We also consider bootstrapping the M-estimator and extend the results of Davis & Wu (1997) to the present setting so that statistical inferences are readily implemented. Simulation studies are conducted to evaluate the finite-sample performance of the M-estimation and bootstrap procedures. An empirical example of financial time series is also provided.
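As a rough, self-contained illustration of this setting (not the estimator studied in the paper), the Python sketch below simulates a causal, invertible ARMA(1,1) driven by heavy-tailed stable innovations and fits it by minimizing a least-absolute-deviations objective, one common choice of M-estimation loss; the model orders, parameter values and loss function are illustrative assumptions only.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import levy_stable

rng = np.random.default_rng(0)

def simulate_arma11(phi, theta, eps):
    """Causal, invertible ARMA(1,1) driven by the supplied innovations."""
    x = np.zeros_like(eps)
    for t in range(1, len(eps)):
        x[t] = phi * x[t - 1] + eps[t] + theta * eps[t - 1]
    return x

def lad_objective(params, x):
    """Least-absolute-deviations objective: one possible M-estimation loss."""
    phi, theta = params
    e = np.zeros_like(x)
    for t in range(1, len(x)):
        e[t] = x[t] - phi * x[t - 1] - theta * e[t - 1]
    return np.abs(e).sum()

# Innovations in the domain of attraction of a non-Gaussian stable law (alpha < 2)
eps = levy_stable.rvs(alpha=1.5, beta=0.0, size=500, random_state=rng)
x = simulate_arma11(0.6, 0.3, eps)
fit = minimize(lad_objective, x0=[0.0, 0.0], args=(x,), method="Nelder-Mead")
print(fit.x)  # point estimates of (phi, theta)
```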

2.
We propose a new summary statistic for inhomogeneous intensity-reweighted moment stationary spatio-temporal point processes. The statistic is defined in terms of the n-point correlation functions of the point process, and it generalizes the J-function when stationarity is assumed. We show that our statistic can be represented in terms of the generating functional and that it is related to the spatio-temporal K-function. We further discuss its explicit form under some specific model assumptions and derive ratio-unbiased estimators. We finally illustrate the use of our statistic in practice. © 2014 Board of the Foundation of the Scandinavian Journal of Statistics

3.
The most common forecasting methods in business are based on exponential smoothing, and the most common time series in business are inherently non-negative. Therefore it is of interest to consider the properties of the potential stochastic models underlying exponential smoothing when applied to non-negative data. We explore exponential smoothing state space models for non-negative data under various assumptions about the innovations, or error, process. We first demonstrate that prediction distributions from some commonly used state space models may have an infinite variance beyond a certain forecasting horizon. For multiplicative error models that do not have this flaw, we show that sample paths will converge almost surely to zero even when the error distribution is non-Gaussian. We propose a new model with similar properties to exponential smoothing, but which does not have these problems, and we develop some distributional properties for our new model. We then explore the implications of our results for inference, and compare the short-term forecasting performance of the various models using data on the weekly sales of over 300 items of costume jewelry. The main findings of the research are that the Gaussian approximation is adequate for estimation and one-step-ahead forecasting. However, as the forecasting horizon increases, the approximate prediction intervals become increasingly problematic. When the model is to be used for simulation purposes, a suitably specified scheme must be employed.
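The flavour of the models discussed can be seen in a short simulation of a local-level exponential smoothing model with multiplicative errors; the sketch below is a hypothetical illustration (parameter values assumed) of how long-horizon sample paths of such a model tend to drift toward zero.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_ets_mnn(level0, alpha, sigma, n_steps, rng):
    """Local-level model with multiplicative errors:
    y_t = l_{t-1} * (1 + e_t),  l_t = l_{t-1} * (1 + alpha * e_t)."""
    level = level0
    y = np.empty(n_steps)
    for t in range(n_steps):
        e = max(rng.normal(0.0, sigma), -0.999)  # crude truncation keeps the series positive
        y[t] = level * (1.0 + e)
        level *= 1.0 + alpha * e
    return y

paths = np.array([simulate_ets_mnn(100.0, 0.3, 0.2, 500, rng) for _ in range(20)])
print(paths[:, -1])  # long-horizon values: many sample paths have collapsed toward zero
```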

4.
In geophysical and environmental problems, it is common to have multiple variables of interest measured at the same location and time. These multiple variables typically exhibit dependence over space (and/or time). As a consequence, there is growing interest in developing models for multivariate spatial processes, in particular cross-covariance models. At the same time, many data sets, such as satellite data, now cover a large portion of the Earth and therefore require covariance models that are valid on a globe. We present a class of parametric covariance models for multivariate processes on a globe. The covariance models are flexible in capturing non-stationarity in the data, yet computationally feasible, and require a moderate number of parameters. We apply our covariance model to surface temperature and precipitation data from an NCAR climate model output. We compare our model to the multivariate version of the Matérn cross-covariance function and to models based on coregionalization, and demonstrate the superior performance of our model in terms of AIC (and/or maximum log-likelihood values) and predictive skill. We also present some challenges in modelling the cross-covariance structure of the temperature and precipitation data. Based on the fitted results using the full data, we give the estimated cross-correlation structure between the two variables.
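For context, the sketch below evaluates the comparison model mentioned above, a "parsimonious" bivariate Matérn cross-covariance, at a pair of locations on the globe using chordal distance (a common device for keeping covariance models valid on the sphere); all parameter values are made up for illustration, and this is not the covariance class proposed in the paper.

```python
import numpy as np
from scipy.special import kv, gamma

def matern(h, nu, a):
    """Matérn correlation with smoothness nu and scale a (h >= 0)."""
    h = np.atleast_1d(np.asarray(h, dtype=float))
    c = np.ones_like(h)
    pos = h > 0
    u = h[pos] / a
    c[pos] = (2 ** (1 - nu) / gamma(nu)) * (u ** nu) * kv(nu, u)
    return c

def chordal_distance(lat1, lon1, lat2, lon2, radius=6371.0):
    """3-D chordal distance between two points on the sphere (in km)."""
    to_xyz = lambda la, lo: radius * np.array([np.cos(la) * np.cos(lo),
                                               np.cos(la) * np.sin(lo),
                                               np.sin(la)])
    p1 = to_xyz(np.radians(lat1), np.radians(lon1))
    p2 = to_xyz(np.radians(lat2), np.radians(lon2))
    return np.linalg.norm(p1 - p2)

# Hypothetical parameters for a bivariate (e.g. temperature/precipitation) field
nu1, nu2, a, rho, s1, s2 = 1.0, 0.5, 800.0, 0.4, 2.0, 3.0
h = chordal_distance(40.0, -105.0, 35.0, -100.0)
c11 = s1**2 * matern(h, nu1, a)                      # marginal covariance, variable 1
c22 = s2**2 * matern(h, nu2, a)                      # marginal covariance, variable 2
c12 = rho * s1 * s2 * matern(h, 0.5 * (nu1 + nu2), a)  # parsimonious cross-covariance
print(c11, c22, c12)
```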

5.
The authors consider a class of models for spatio-temporal processes based on convolving independent processes with a discrete kernel that is represented by a lower triangular matrix. They study two families of models. In the first, spatial Gaussian processes with isotropic correlations are convolved with a kernel that provides temporal dependence. In the second family, AR(p) processes are convolved with a kernel providing spatial interactions. The covariance structures associated with these two families are quite rich: they include covariance functions that are stationary and separable in space and time, as well as time-dependent, nonseparable and nonisotropic ones.
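A minimal sketch of the first family's construction idea, under assumed parameter values: independent spatial Gaussian fields are combined through a lower triangular discrete kernel (here an AR(1)-type geometric decay, an illustrative choice) to induce temporal dependence.

```python
import numpy as np

rng = np.random.default_rng(2)

def isotropic_gaussian_field(coords, range_param, rng):
    """One draw of a zero-mean Gaussian field with exponential isotropic correlation."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    return rng.multivariate_normal(np.zeros(len(coords)), np.exp(-d / range_param))

n_sites, n_times = 30, 10
coords = rng.uniform(0.0, 1.0, size=(n_sites, 2))
X = np.array([isotropic_gaussian_field(coords, 0.3, rng) for _ in range(n_times)])  # (T, S)

# Lower triangular discrete kernel with geometric (AR(1)-type) weights
phi = 0.7
L = np.tril(phi ** np.subtract.outer(np.arange(n_times), np.arange(n_times)).astype(float))
Z = L @ X  # Z[t] = sum_{j <= t} phi**(t - j) * X_j: a spatio-temporal field
print(Z.shape)
```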

6.
For first-time-in-human studies with small molecules, alternating cross-over designs are often employed and are analyzed at study end using linear models. We discuss the impact of including a period effect in the model on the precision with which dose level contrasts can be estimated, and quantify the bias of least squares estimators when a period effect inherent in the data is not accounted for in the model. We also propose two alternative designs that allow a more precise estimation of dose level contrasts compared with the standard design when period effects are included in the model. Copyright © 2010 John Wiley & Sons, Ltd.

7.
Motivated by applications of Poisson processes for modelling periodic time-varying phenomena, we study a semi-parametric estimator of the period of the cyclic intensity function of a non-homogeneous Poisson process. There are no parametric assumptions on the intensity function, which is treated as an infinite-dimensional nuisance parameter. We propose a new family of estimators for the period of the intensity function, address the identifiability and consistency issues, and present simulations which demonstrate good performance of the proposed estimation procedure in practice. We compare our method to competing methods on synthetic data and apply it to a real data set from a call center.
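As a toy illustration of the problem (not the estimator family proposed in the paper), the sketch below simulates a periodic non-homogeneous Poisson process by thinning and recovers the period by maximizing a Bartlett-type periodogram of the event times; the intensity shape and all parameter values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_nhpp(lmax, intensity, horizon, rng):
    """Lewis-Shedler thinning: keep each candidate event with prob intensity(t)/lmax."""
    n = rng.poisson(lmax * horizon)
    t = np.sort(rng.uniform(0.0, horizon, n))
    keep = rng.uniform(0.0, lmax, n) < intensity(t)
    return t[keep]

true_period = 2.5
intensity = lambda t: 5.0 * (1.0 + np.sin(2.0 * np.pi * t / true_period)) + 1.0
events = simulate_nhpp(12.0, intensity, horizon=500.0, rng=rng)

# Score candidate periods with a Bartlett-type periodogram of the event times
periods = np.linspace(1.5, 4.0, 2000)
scores = [np.abs(np.exp(2j * np.pi * events / p).sum()) ** 2 for p in periods]
print(periods[int(np.argmax(scores))])  # should land close to the true period 2.5
```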

8.
The authors propose a new type of scan statistic to test for the presence of space-time clusters in point process data, when the goal is to identify and evaluate the statistical significance of localized clusters. Their method is based only on the point patterns for cases; it does not require any specific knowledge of the underlying population. The authors propose to scan the three-dimensional space with a score test statistic under the null hypothesis that the underlying point process is an inhomogeneous Poisson point process with a space-time separable intensity. The alternative is that there are one or more localized space-time clusters. Their method has been implemented in a computationally efficient way so that it can be applied routinely. They illustrate their method with space-time crime data from Belo Horizonte, a Brazilian city, in addition to presenting a Monte Carlo study to analyze the power of their new test.

9.
It is important to study historical temperature time series prior to the industrial revolution so that the current global warming trend can be viewed from a long-term historical perspective. Because there are no instrumental records of such historical temperature data, climatologists have been interested in reconstructing historical temperatures using various proxy time series. In this paper, the authors examine a state-space model approach for historical temperature reconstruction which makes use not only of the proxy data but also of information on external forcings. A challenge in the implementation of this approach is the estimation of the parameters in the state-space model. The authors develop two maximum likelihood methods for parameter estimation and study the efficiency and asymptotic properties of the associated estimators through a combination of theoretical and numerical investigations. The Canadian Journal of Statistics 38: 488–505; 2010 © 2010 Crown in the right of Canada
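To convey the computational core of such an approach, here is a minimal sketch of maximum likelihood estimation in a toy local-level state-space model via the Kalman filter; it is a generic illustration with made-up data, not the reconstruction model or the two estimation methods developed in the paper.

```python
import numpy as np
from scipy.optimize import minimize

def kalman_negloglik(params, y):
    """Negative Gaussian log-likelihood of a local-level model:
    y_t = x_t + v_t (var r),  x_t = x_{t-1} + w_t (var q)."""
    q, r = np.exp(params)                 # log-parameterized variances
    x, p, ll = y[0], 1.0, 0.0
    for t in range(1, len(y)):
        x_pred, p_pred = x, p + q         # prediction step
        s = p_pred + r                    # innovation variance
        innov = y[t] - x_pred
        ll += -0.5 * (np.log(2 * np.pi * s) + innov ** 2 / s)
        k = p_pred / s                    # Kalman gain and update step
        x = x_pred + k * innov
        p = (1 - k) * p_pred
    return -ll

rng = np.random.default_rng(4)
state = np.cumsum(rng.normal(0.0, 0.1, 300))       # latent "signal"
y = state + rng.normal(0.0, 0.5, 300)              # noisy proxy-like observations
fit = minimize(kalman_negloglik, x0=np.log([1.0, 1.0]), args=(y,), method="Nelder-Mead")
print(np.exp(fit.x))  # estimated (state variance q, observation variance r)
```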

10.
Researchers familiar with spatial models are aware of the challenge of choosing the level of spatial aggregation. Few studies, however, have investigated temporal aggregation and its impact on inferences about disease outcomes in space-time analyses. We perform a case study modelling individual disease outcomes using several Bayesian hierarchical spatio-temporal models, while taking into account the possible impact of spatial and temporal aggregation. Using longitudinal breast cancer data from South East Queensland, Australia, we consider both parametric and non-parametric formulations for temporal effects at various levels of aggregation. Two temporal smoothness priors are considered separately; each is modelled with fixed effects for the covariates and an intrinsic conditional autoregressive prior for the spatial random effects. Our case study reveals that different model formulations produce considerably different model performances. For this particular dataset, a classical parametric formulation that assumes a linear time trend produces the best fit among the five models considered. Different aggregation levels of the temporal random effects were found to have little impact on model goodness-of-fit and on the estimation of fixed effects.

11.
This paper presents a non-parametric method for estimating the conditional density associated with the jump rate of a piecewise-deterministic Markov process. In our framework, the estimation requires only a single observation of the process over a long time interval. Our method relies on a generalization of Aalen's multiplicative intensity model. We prove the uniform consistency of our estimator under some reasonable assumptions related to the primitive characteristics of the process. A simulation study illustrates the behaviour of our estimator.
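As a simple related illustration (not the paper's estimator for piecewise-deterministic Markov processes), the sketch below computes a Ramlau-Hansen-type kernel-smoothed Nelson-Aalen estimate of a jump (hazard) rate from uncensored sojourn times; the kernel, bandwidth and toy data are assumptions.

```python
import numpy as np

def kernel_hazard(sojourn_times, grid, bandwidth):
    """Smooth the Nelson-Aalen increments 1/Y(T_(i)) with an Epanechnikov kernel
    to estimate the hazard (jump) rate from i.i.d. uncensored sojourn times."""
    t = np.sort(np.asarray(sojourn_times, dtype=float))
    n = len(t)
    at_risk = n - np.arange(n)                       # Y(T_(i)) just before each jump
    increments = 1.0 / at_risk                       # Nelson-Aalen increments
    k = lambda u: 0.75 * np.clip(1.0 - u ** 2, 0.0, None)
    u = (grid[:, None] - t[None, :]) / bandwidth
    return (k(u) * increments).sum(axis=1) / bandwidth

rng = np.random.default_rng(5)
samples = rng.weibull(1.5, size=2000)                # toy inter-jump durations
grid = np.linspace(0.05, 2.0, 50)
print(kernel_hazard(samples, grid, bandwidth=0.2))   # increasing hazard for shape > 1
```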

12.
We investigate resampling methodologies for testing the null hypothesis that two samples of labelled landmark data in three dimensions come from populations with a common mean reflection shape or mean reflection size-and-shape. The investigation includes comparisons between (i) two different test statistics that are functions of the projection of the data onto tangent space, namely the James statistic and an empirical likelihood statistic; (ii) bootstrap and permutation procedures; and (iii) three methods for resampling under the null hypothesis, namely translating in tangent space, resampling using weights determined by empirical likelihood, and using a novel method to transform the original sample entirely within reflection shape space. We present results of extensive numerical simulations, on the basis of which we recommend a bootstrap test procedure that we expect will work well in practice. We demonstrate the procedure using a data set of human faces, to test whether humans in different age groups have a common mean face shape.

13.
The stratified Cox model is commonly used for stratified clinical trials with time-to-event endpoints. The estimated log hazard ratio is approximately a weighted average of the corresponding stratum-specific Cox model estimates using inverse-variance weights; the latter are optimal only under the (often implausible) assumption of a constant hazard ratio across strata. Focusing on trials with limited sample sizes (50-200 subjects per treatment), we propose an alternative approach in which stratum-specific estimates are obtained using a refined generalized logrank (RGLR) approach and then combined using either sample size or minimum risk weights for overall inference. Our proposal extends the work of Mehrotra et al. to incorporate the RGLR statistic, which outperforms the Cox model in the setting of proportional hazards and small samples. This work also entails the development of a remarkably accurate plug-in formula for the variance of RGLR-based estimated log hazard ratios. We demonstrate using simulations that our proposed two-step RGLR analysis delivers notably better results, with smaller estimation bias and mean squared error and larger power than the stratified Cox model analysis when there is a treatment-by-stratum interaction, and similar performance when there is no interaction. Additionally, our method controls the type I error rate while the stratified Cox model does not in small samples. We illustrate our method using data from a clinical trial comparing two treatments for colon cancer.
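The second step of such a two-step analysis, combining stratum-specific log hazard ratio estimates with either sample-size or inverse-variance weights, can be sketched as follows; the numbers are hypothetical and the first step (obtaining RGLR-based stratum estimates) is not implemented here.

```python
import numpy as np

def combine_strata(log_hr, se, n_per_stratum, weighting="sample_size"):
    """Weighted combination of stratum-specific log hazard ratio estimates."""
    log_hr, se, n = (np.asarray(a, dtype=float) for a in (log_hr, se, n_per_stratum))
    w = 1.0 / se ** 2 if weighting == "inverse_variance" else n
    w = w / w.sum()
    est = np.sum(w * log_hr)
    var = np.sum(w ** 2 * se ** 2)   # treats stratum estimates as independent
    return est, np.sqrt(var)

# Hypothetical stratum-specific estimates (log HR, standard error, stratum size)
print(combine_strata([-0.4, -0.1, -0.7], [0.25, 0.30, 0.28], [120, 90, 100]))
print(combine_strata([-0.4, -0.1, -0.7], [0.25, 0.30, 0.28], [120, 90, 100],
                     weighting="inverse_variance"))
```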

14.
In this article, we propose a new parametric family of models for real-valued spatio-temporal stochastic processes S(x, t) and show how low-rank approximations can be used to overcome the computational problems that arise in fitting the proposed class of models to large datasets. Separable covariance models, in which the spatio-temporal covariance function of S(x, t) factorizes into a product of purely spatial and purely temporal functions, are often used as a convenient working assumption but are too inflexible to cover the range of covariance structures encountered in applications. We define positive and negative non-separability and show that in our proposed family we can capture positive, zero and negative non-separability by varying the value of a single parameter.
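To make the separability notion concrete, the sketch below contrasts a separable exponential space-time covariance with a classical nonseparable Gneiting-type alternative whose interaction parameter recovers the separable case at zero; this is a generic illustration with assumed parameter values, not the parametric family proposed in the article.

```python
import numpy as np

def separable_cov(h, u, sigma2=1.0, phi_s=1.0, phi_t=1.0):
    """Separable model: the covariance factorizes into spatial times temporal terms."""
    return sigma2 * np.exp(-np.abs(h) / phi_s) * np.exp(-np.abs(u) / phi_t)

def gneiting_cov(h, u, sigma2=1.0, a=1.0, c=1.0, beta=0.5, tau=1.0):
    """A classical nonseparable Gneiting-type covariance; beta in [0, 1] controls
    the space-time interaction, and beta = 0 gives back a separable model."""
    psi = a * np.abs(u) + 1.0
    return sigma2 / psi ** tau * np.exp(-c * np.abs(h) / psi ** (beta / 2.0))

h, u = 0.8, 1.5
print(separable_cov(h, u))
print(gneiting_cov(h, u, beta=0.0))   # separable special case
print(gneiting_cov(h, u, beta=1.0))   # strongest space-time interaction
```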

15.
This paper is about vector autoregressive-moving average models with time-dependent coefficients used to represent non-stationary time series. In contrast to other papers in the univariate case, the coefficients depend on time but not on the series length n. Under appropriate assumptions, it is shown that a Gaussian quasi-maximum likelihood estimator is almost surely consistent and asymptotically normal. The theoretical results are illustrated by means of two examples of bivariate processes, for which it is shown that the assumptions underlying the theoretical results apply. In the second example, the innovations are marginally heteroscedastic with a correlation ranging from −0.8 to 0.8. In both examples, the asymptotic information matrix is obtained in the Gaussian case. Finally, the finite-sample behaviour is checked via a Monte Carlo simulation study for n from 25 to 400. The results confirm the validity of the asymptotic properties even for short series, as well as of the asymptotic information matrix deduced from the theory.
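A minimal sketch of what a bivariate autoregressive process with time-dependent (but not n-dependent) coefficients can look like; the coefficient path and noise are invented for illustration and do not reproduce the paper's two examples.

```python
import numpy as np

rng = np.random.default_rng(9)

def simulate_td_var1(n, rng):
    """Bivariate VAR(1) whose coefficient matrix varies with time t, not with n."""
    x = np.zeros((n, 2))
    for t in range(1, n):
        a_t = 0.5 + 0.3 * np.cos(2.0 * np.pi * t / 100.0)  # smoothly time-varying coefficient
        A = np.array([[a_t, 0.1],
                      [0.2, 0.4]])
        x[t] = A @ x[t - 1] + rng.multivariate_normal([0.0, 0.0], np.eye(2))
    return x

x = simulate_td_var1(400, rng)
print(x[:3])
```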

16.
The semi-Markov process often provides a better framework than the classical Markov process for the analysis of events with multiple states. The purpose of this paper is twofold. First, we show that in the presence of right censoring, when the right end-point of the support of the censoring time is strictly less than the right end-point of the support of the semi-Markov kernel, the transition probability of the semi-Markov process is nonidentifiable, and the estimators proposed in the literature are inconsistent in general. We derive the set of all attainable values for the transition probability based on the censored data, and we propose a nonparametric inference procedure for the transition probability using this set. Second, the conventional approach to constructing confidence bands is not applicable for the semi-Markov kernel and the sojourn time distribution. We propose new perturbation resampling methods to construct these confidence bands. Different weights and transformations are explored in the construction. We use simulation to examine our proposals and illustrate them with hospitalization data from a recent cancer survivor study. The Canadian Journal of Statistics 41: 237–256; 2013 © 2013 Statistical Society of Canada

17.
In this paper, we consider the problem of adaptive density or survival function estimation in an additive model defined by Z = X + Y with X independent of Y, when both random variables are non-negative. This model is relevant, for instance, in reliability, where we are interested in the failure time of a certain material that cannot be isolated from the system it belongs to. Our goal is to recover the distribution of X (density or survival function) from n observations of Z, assuming that the distribution of Y is known. This issue can be seen as the classical statistical problem of deconvolution, which has been tackled in many cases using Fourier-type approaches. In the present case, however, the random variables have the particularity of being supported on the non-negative half-line. Exploiting this, we propose a new angle of attack by building a projection estimator on an appropriate Laguerre basis. We present upper bounds on the mean squared integrated risk of our density and survival function estimators. We then describe a non-parametric data-driven strategy for selecting a relevant projection space. The procedures are illustrated with simulated data and compared with the performance of a more classical deconvolution setting using a Fourier approach. Our procedure achieves faster convergence rates than Fourier methods for estimating these functions.
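The building block of such an estimator, projection on the orthonormal Laguerre basis of the non-negative half-line, can be sketched as follows; for simplicity this estimates the density of Z directly and omits the deconvolution correction that uses the known distribution of Y, and the toy data and basis dimension are assumptions.

```python
import numpy as np
from scipy.special import eval_laguerre

def laguerre_basis(x, K):
    """Orthonormal Laguerre functions on [0, inf): phi_k(x) = sqrt(2) L_k(2x) exp(-x)."""
    x = np.asarray(x, dtype=float)
    return np.array([np.sqrt(2.0) * eval_laguerre(k, 2.0 * x) * np.exp(-x) for k in range(K)])

rng = np.random.default_rng(6)
z = rng.gamma(2.0, 1.0, size=5000)            # toy non-negative observations of Z
K = 8
coef = laguerre_basis(z, K).mean(axis=1)      # empirical projection coefficients E[phi_k(Z)]
grid = np.linspace(0.0, 8.0, 200)
dens = coef @ laguerre_basis(grid, K)         # projection density estimate of Z (not yet deconvolved)
print(dens[:5])
```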

18.
The authors propose a two-state continuous-time semi-Markov model for an unobservable alternating binary process. Another process is observed at discrete time points and may misclassify the true state of the process of interest. To estimate the model's parameters, the authors propose a minimum Pearson chi-square type estimating approach based on approximated joint probabilities when the true process is in equilibrium. Three consecutive observations are required to have sufficient degrees of freedom to perform the estimation. The methodology is demonstrated on parasitic infection data with exponential and gamma sojourn time distributions.

19.
We consider blinded sample size re-estimation based on the simple one-sample variance estimator at an interim analysis. We characterize the exact distribution of the standard two-sample t-test statistic at the final analysis and describe a simulation algorithm for evaluating the probability of rejecting the null hypothesis at a given treatment effect. We compare the blinded sample size re-estimation method with two unblinded methods with respect to the empirical type I error, the empirical power, and the empirical distribution of the standard deviation estimator and final sample size. We characterize the type I error inflation across the range of standardized non-inferiority margins for non-inferiority trials, and derive the adjusted significance level that ensures type I error control for a given sample size of the internal pilot study. We show that the adjusted significance level increases as the sample size of the internal pilot study increases. Copyright © 2016 John Wiley & Sons, Ltd.
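The basic blinded re-estimation step, recomputing the per-arm sample size from the pooled one-sample variance at the interim analysis, might look like the following; the design values and the standard normal-approximation sample-size formula are illustrative assumptions, and the type I error adjustments discussed above are not implemented.

```python
import numpy as np
from scipy.stats import norm

def blinded_ss_reestimation(pooled_interim_data, delta, alpha=0.025, power=0.9):
    """Per-arm sample size from the blinded (one-sample) variance of the pooled
    interim data; the treatment split is deliberately ignored (blinded)."""
    s2 = np.var(pooled_interim_data, ddof=1)
    z = norm.ppf(1.0 - alpha) + norm.ppf(power)
    return int(np.ceil(2.0 * s2 * z ** 2 / delta ** 2))

rng = np.random.default_rng(7)
interim = rng.normal(0.25, 1.1, size=60)      # pooled interim data from both arms
print(blinded_ss_reestimation(interim, delta=0.5))
```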

20.
For many environmental processes, recent studies have shown that dependence strength decreases as quantile levels increase. This implies that the popular max-stable models are inadequate to capture the rate of joint tail decay and to estimate joint extremal probabilities beyond observed levels. Here we develop a more flexible modelling framework based on the class of max-infinitely divisible processes, which extend max-stable processes while retaining dependence properties that are natural for maxima. We propose two parametric constructions for max-infinitely divisible models, which relax the max-stability property but remain close to some popular max-stable models obtained as special cases. The first model considers maxima over a finite, random number of independent observations, while the second model generalizes the spectral representation of max-stable processes. Inference is performed using a pairwise likelihood. We illustrate the benefits of our new modelling framework on Dutch wind gust maxima calculated over different time units. Results strongly suggest that our proposed models outperform other natural models, such as the Student-t copula process and its max-stable limit, even for large block sizes.
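The spirit of the first construction, taking maxima over a random number of independent replicates of a latent process, can be illustrated with the following toy sketch on a one-dimensional grid; the latent Gaussian field and the Poisson number of copies are assumptions for illustration and do not match the paper's parametric models.

```python
import numpy as np

rng = np.random.default_rng(8)

def gauss_field(coords, range_param, rng):
    """One draw of a zero-mean Gaussian process with exponential correlation on a 1-D grid."""
    d = np.abs(coords[:, None] - coords[None, :])
    return rng.multivariate_normal(np.zeros(len(coords)), np.exp(-d / range_param))

def max_id_sample(coords, mean_n, range_param, rng):
    """Componentwise maximum over a random (Poisson) number of independent latent fields."""
    n = max(rng.poisson(mean_n), 1)
    fields = np.array([gauss_field(coords, range_param, rng) for _ in range(n)])
    return fields.max(axis=0)

coords = np.linspace(0.0, 10.0, 50)
sample = max_id_sample(coords, mean_n=25, range_param=2.0, rng=rng)
print(sample.round(2))
```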
