Similar documents
Found 20 similar documents; search time 46 ms.
1.
Multiplier bootstrap methods for conditional distributions
The multiplier bootstrap is a fast and easy-to-implement alternative to the standard bootstrap; it has been used successfully in many statistical contexts. In this paper, resampling methods based on multipliers are proposed in a general framework where one investigates the stochastic behavior of a random vector \(\mathbf {Y}\in \mathbb {R}^d\) conditional on a covariate \(X \in \mathbb {R}\). Specifically, two versions of the multiplier bootstrap adapted to empirical conditional distributions are introduced as alternatives to the conditional bootstrap, and their asymptotic validity is formally established. As the method walks hand-in-hand with the functional delta method, theory around the estimation of statistical functionals is developed accordingly; this includes the interval estimation of the conditional mean and variance, the conditional correlation coefficient, Kendall's dependence measure and the copula. Composite inference about univariate and joint conditional distributions is also considered. The sample behavior of the new bootstrap schemes and related estimation methods is investigated via simulations, and an illustration on real data is provided.
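For intuition, the multiplier idea can be sketched on the simplest possible functional, an unconditional sample mean: i.i.d. multipliers with mean zero and unit variance perturb the centred observations, giving cheap draws from the estimator's sampling distribution without resampling the data. This is a generic illustration under our own toy setup, not the paper's conditional schemes; all names are ours.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.0, size=500)   # toy sample

def multiplier_bootstrap_mean(x, n_boot=2000, rng=None):
    """Multiplier bootstrap for the sample mean: each replicate perturbs
    the centred data by i.i.d. standard-normal multipliers xi_i, so the
    observations themselves are never resampled."""
    rng = rng or np.random.default_rng()
    n = len(x)
    xc = x - x.mean()
    xi = rng.standard_normal((n_boot, n))      # multipliers with E=0, Var=1
    return x.mean() + (xi @ xc) / n            # bootstrap draws of the mean

draws = multiplier_bootstrap_mean(x, rng=rng)
lo, hi = np.quantile(draws, [0.025, 0.975])    # 95% interval for the mean
```

Replacing the standard-normal multipliers by any other mean-zero, unit-variance variables (e.g. Rademacher) gives the familiar variants of the scheme.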

2.

We consider a sieve bootstrap procedure to quantify the estimation uncertainty of long-memory parameters in stationary functional time series. We use a semiparametric local Whittle estimator to estimate the long-memory parameter. In the local Whittle estimator, the discrete Fourier transform and periodogram are constructed from the first set of principal component scores obtained via functional principal component analysis. The sieve bootstrap procedure uses a general vector autoregressive representation of the estimated principal component scores and generates bootstrap replicates that adequately mimic the dependence structure of the underlying stationary process. We first compute the estimated first set of principal component scores for each bootstrap replicate and then apply the semiparametric local Whittle estimator to estimate the memory parameter. By taking quantiles of the estimated memory parameters from these bootstrap replicates, we can nonparametrically construct confidence intervals for the long-memory parameter. As measured by the differences between empirical and nominal coverage probabilities at three levels of significance, we demonstrate the advantage of the sieve bootstrap over asymptotic confidence intervals based on normality.
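The sieve step can be illustrated on a single score series: fit an AR(p) by least squares, resample centred residuals with replacement, and rebuild bootstrap series from the fitted recursion. This is only a sketch (the paper fits a vector autoregression to several score series jointly); the data and function name here are ours.

```python
import numpy as np

rng = np.random.default_rng(1)
# toy "score" series: a stationary AR(1) with coefficient 0.6
y = np.empty(300)
y[0] = 0.0
for t in range(1, 300):
    y[t] = 0.6 * y[t - 1] + rng.standard_normal()

def ar_sieve_bootstrap(y, p=2, n_boot=100, rng=None):
    """AR(p) sieve bootstrap sketch: least-squares fit, residual
    resampling, then regeneration of full-length bootstrap series."""
    rng = rng or np.random.default_rng()
    n = len(y)
    # rows: t = p..n-1; columns: y[t-1], ..., y[t-p]
    X = np.column_stack([y[p - 1 - j:n - 1 - j] for j in range(p)])
    coef, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
    resid = y[p:] - X @ coef
    resid = resid - resid.mean()               # centre residuals
    reps = np.empty((n_boot, n))
    for b in range(n_boot):
        e = rng.choice(resid, size=n, replace=True)
        yb = np.empty(n)
        yb[:p] = y[:p]                         # initialise with observed values
        for t in range(p, n):
            yb[t] = coef @ yb[t - p:t][::-1] + e[t]
        reps[b] = yb
    return reps

reps = ar_sieve_bootstrap(y, p=2, rng=rng)
```

Applying the estimator of interest to each row of `reps` and taking quantiles yields the bootstrap confidence interval, as in the abstract.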


3.
This paper focuses on the analysis of spatially correlated functional data. We propose a parametric model for spatial correlation, in which the between-curve correlation is modeled by correlating functional principal component scores of the functional data. Additionally, in the sparse observation framework, we propose a novel approach of spatial principal analysis by conditional expectation to explicitly estimate spatial correlations and reconstruct individual curves. Assuming spatial stationarity, empirical spatial correlations are calculated as the ratio of eigenvalues of the smoothed covariance surface Cov\((X_i(s),X_i(t))\) and cross-covariance surface Cov\((X_i(s), X_j(t))\) at locations indexed by i and j. An anisotropic Matérn spatial correlation model is then fitted to the empirical correlations. Finally, principal component scores are estimated to reconstruct the sparsely observed curves. This framework can naturally accommodate arbitrary covariance structures, but there is an enormous reduction in computation if one can assume the separability of temporal and spatial components. We demonstrate the consistency of our estimates and propose hypothesis tests to examine the separability as well as the isotropy effect of spatial correlation. Using simulation studies, we show that these methods have some clear advantages over existing methods of curve reconstruction and estimation of model parameters.

4.
This empirical paper presents a number of functional modelling and forecasting methods for predicting very short-term (such as minute-by-minute) electricity demand. The proposed functional methods slice a seasonal univariate time series (TS) into a TS of curves and reduce the dimensionality of the curves by applying functional principal component analysis before using a univariate TS forecasting method and regression techniques. As data points in the daily electricity demand are sequentially observed, a forecast updating method can greatly improve the accuracy of point forecasts. Moreover, we present a non-parametric bootstrap approach to construct and update prediction intervals, and compare the point and interval forecast accuracy with some naive benchmark methods. The proposed methods are illustrated by the half-hourly electricity demand from Monday to Sunday in South Australia.
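The slice-then-reduce step can be sketched on synthetic half-hourly "days". The random-walk (last-value) score forecast below stands in for the univariate TS forecasting and regression techniques of the paper, and all data are made up.

```python
import numpy as np

rng = np.random.default_rng(2)
# toy demand: 28 "days" of 48 half-hourly points, one daily shape plus noise
grid = np.linspace(0, 2 * np.pi, 48)
days = np.array([10 + 3 * np.sin(grid) + 0.2 * rng.standard_normal(48)
                 for _ in range(28)])

# FPCA on the sliced curves via SVD of the centred data matrix
mu = days.mean(axis=0)
U, s, Vt = np.linalg.svd(days - mu, full_matrices=False)
K = 2                         # retained components
scores = U[:, :K] * s[:K]     # one score series per component
basis = Vt[:K]                # discretised eigenfunctions

# naive score forecast (carry the last scores forward) and reconstruction
next_curve = mu + scores[-1] @ basis
```

Forecasting each column of `scores` with a proper univariate model, then mapping back through `basis`, gives the curve forecasts described in the abstract.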

5.
Let \(\mathbf {X} = (X_1,\ldots ,X_p)\) be a stochastic vector having joint density function \(f_{\mathbf {X}}(\mathbf {x})\) with partitions \(\mathbf {X}_1 = (X_1,\ldots ,X_k)\) and \(\mathbf {X}_2 = (X_{k+1},\ldots ,X_p)\). A new method for estimating the conditional density function of \(\mathbf {X}_1\) given \(\mathbf {X}_2\) is presented. It is based on locally Gaussian approximations, but simplified in order to tackle the curse of dimensionality in multivariate applications, where both response and explanatory variables can be vectors. We compare our method to some available competitors; the error of approximation is shown to be small in a series of examples using real and simulated data, and the estimator is shown to be particularly robust against noise caused by independent variables. We also present examples of practical applications of our conditional density estimator in the analysis of time series. Typical values for k in our examples are 1 and 2, and we include simulation experiments with values of p up to 6. Large sample theory is established under a strong mixing condition.

6.
The usual covariance estimates for data \(X_1,\ldots ,X_n\) from a stationary zero-mean stochastic process \(\{X_t\}\) are the sample covariances \(\hat{\gamma }(k) = n^{-1}\sum _{t=1}^{n-k} X_t X_{t+k}\). Both direct and resampling approaches are used to estimate the variance of the sample covariances. This paper compares the performance of these variance estimates. Using a direct approach, we show that a consistent windowed periodogram estimate for the spectrum is more effective than using the periodogram itself. A frequency domain bootstrap for time series is proposed and analyzed, and we introduce a frequency domain version of the jackknife that is shown to be asymptotically unbiased and consistent for Gaussian processes. Monte Carlo techniques show that the time domain jackknife and subseries method cannot be recommended. For a Gaussian underlying series a direct approach using a smoothed periodogram is best; for a non-Gaussian series the frequency domain bootstrap appears preferable. For small samples, the bootstraps are dangerous: both the direct approach and the frequency domain jackknife are better.
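The estimates in question are straightforward to compute; a minimal sketch, assuming a mean-zero series as in the abstract (the toy input is ours):

```python
import numpy as np

def sample_autocov(x, max_lag):
    """Sample autocovariances gamma_hat(k) = n^{-1} * sum_{t=1}^{n-k} x_t x_{t+k}
    for a mean-zero series -- the estimates whose variance is studied above."""
    n = len(x)
    return np.array([x[:n - k] @ x[k:] / n for k in range(max_lag + 1)])

g = sample_autocov(np.array([1.0, -1.0, 1.0, -1.0]), max_lag=2)  # alternating toy series
```

For this alternating series the lag-0, lag-1 and lag-2 estimates are 1, -0.75 and 0.5, illustrating the usual `1/n` (rather than `1/(n-k)`) normalisation.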

7.
We consider the problem of estimating the parameters of the covariance function of a stationary spatial random process. In spatial statistics, there are widely used parametric forms for the covariance functions, and various methods for estimating the parameters have been proposed in the literature. We develop a method for estimating the parameters of the covariance function that is based on a regression approach. Our method utilizes pairs of observations whose distances are closest to a value h>0, which is chosen in such a way that the estimated correlation at distance h is a predetermined value. We demonstrate the effectiveness of our procedure by simulation studies and an application to a water pH data set. Simulation studies show that our method outperforms all well-known least squares-based approaches to variogram estimation and is comparable to maximum likelihood estimation of the parameters of the covariance function. We also show that under a mixing condition on the random field, the proposed estimator is consistent for standard one-parameter models for stationary correlation functions.
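The raw ingredient of this approach, an empirical correlation computed from the location pairs whose separation is closest to a chosen h, might look like the sketch below. The field values here are pure noise, the helper name is ours, and this is not the authors' full regression estimator.

```python
import numpy as np

rng = np.random.default_rng(3)
pts = rng.uniform(0, 10, size=(200, 2))   # toy 2-D observation locations
z = rng.standard_normal(200)              # placeholder for detrended observations

def corr_at_distance(pts, z, h, n_pairs=300):
    """Empirical correlation from the n_pairs pairs of locations whose
    separation is closest to h (illustrative helper, not the paper's
    estimator)."""
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    iu = np.triu_indices(len(pts), k=1)       # each pair counted once
    order = np.argsort(np.abs(d[iu] - h))[:n_pairs]
    i, j = iu[0][order], iu[1][order]
    return np.corrcoef(z[i], z[j])[0, 1]

r = corr_at_distance(pts, z, h=1.0)
```

Repeating this for a grid of h values and regressing the results on the parametric correlation model gives the flavour of the procedure described above.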

8.
This paper promotes information theoretic inference in the context of minimum distance estimation. Various score test statistics differ only through the embedded estimator of the variance of the estimating functions. We resort to implied probabilities provided by the constrained maximization of generalized entropy to get a more accurate variance estimator under the null. We document, both by theoretical higher-order expansions and by Monte Carlo evidence, that our improved score tests have better finite-sample size properties. The competitiveness of our non-simulation-based method with respect to the bootstrap is confirmed in the example of inference on covariance structures previously studied by Horowitz [1998. Bootstrap methods for covariance structures. The Journal of Human Resources 33, 39–61].

9.
We propose forecasting functional time series using weighted functional principal component regression and weighted functional partial least squares regression. These approaches allow for smooth functions, assign higher weights to more recent data, and provide a modeling scheme that is easily adapted to allow for constraints and other information. We illustrate our approaches using age-specific French female mortality rates from 1816 to 2006 and age-specific Australian fertility rates from 1921 to 2006, and show that these weighted methods improve forecast accuracy in comparison to their unweighted counterparts. We also propose two new bootstrap methods to construct prediction intervals, and evaluate and compare their empirical coverage probabilities.

10.
Fitting stochastic kinetic models represented by Markov jump processes within the Bayesian paradigm is complicated by the intractability of the observed-data likelihood. There has therefore been considerable attention given to the design of pseudo-marginal Markov chain Monte Carlo algorithms for such models. However, these methods are typically computationally intensive, often require careful tuning and must be restarted from scratch upon receipt of new observations. Sequential Monte Carlo (SMC) methods on the other hand aim to efficiently reuse posterior samples at each time point. Despite their appeal, applying SMC schemes in scenarios with both dynamic states and static parameters is made difficult by the problem of particle degeneracy. A principled approach for overcoming this problem is to move each parameter particle through a Metropolis-Hastings kernel that leaves the target invariant. This rejuvenation step is key to a recently proposed \(\hbox {SMC}^2\) algorithm, which can be seen as the pseudo-marginal analogue of an idealised scheme known as iterated batch importance sampling. Computing the parameter weights in \(\hbox {SMC}^2\) requires running a particle filter over dynamic states to unbiasedly estimate the intractable observed-data likelihood up to the current time point. In this paper, we propose to use an auxiliary particle filter inside the \(\hbox {SMC}^2\) scheme. Our method uses two recently proposed constructs for sampling conditioned jump processes, and we find that the resulting inference schemes typically require fewer state particles than when using a simple bootstrap filter. Using two applications, we compare the performance of the proposed approach with various competing methods, including two global MCMC schemes.
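The particle filter's role, producing an unbiased estimate of the intractable likelihood, can be illustrated with a bootstrap filter on a toy linear-Gaussian state-space model; the abstract's point is precisely that an auxiliary filter can improve on this simple scheme. Model, parameter values and names below are ours.

```python
import numpy as np

def bootstrap_pf_loglik(y, phi=0.8, sigma=1.0, tau=0.5, n_part=500, rng=None):
    """Bootstrap particle filter for the toy model
    x_t = phi*x_{t-1} + sigma*eps_t,  y_t = x_t + tau*eta_t.
    Returns the log of the (unbiased) particle estimate of the likelihood."""
    rng = rng or np.random.default_rng(4)
    x = rng.normal(0.0, sigma / np.sqrt(1 - phi ** 2), n_part)  # stationary init
    ll = 0.0
    for yt in y:
        x = phi * x + sigma * rng.standard_normal(n_part)       # propagate
        logw = -0.5 * ((yt - x) / tau) ** 2 - np.log(tau * np.sqrt(2 * np.pi))
        m = logw.max()
        w = np.exp(logw - m)                                    # stabilised weights
        ll += m + np.log(w.mean())                              # likelihood increment
        x = rng.choice(x, size=n_part, p=w / w.sum())           # multinomial resampling
    return ll

rng = np.random.default_rng(40)
xs, ys = 0.0, []
for _ in range(50):                                             # simulate toy data
    xs = 0.8 * xs + rng.standard_normal()
    ys.append(xs + 0.5 * rng.standard_normal())
ll = bootstrap_pf_loglik(np.array(ys))
```

In \(\hbox {SMC}^2\), one such filter runs per parameter particle, and the resulting log-likelihood estimates feed the parameter weights.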

11.
This work presents a framework of dynamic structural models with covariates for short-term forecasting of time series with complex seasonal patterns. The framework is based on the multiple-sources-of-randomness formulation. A noise model is formulated to allow the incorporation of randomness into the seasonal component and to propagate this same randomness in the coefficients of the variant trigonometric terms over time. A unique, recursive and systematic computational procedure based on maximum likelihood estimation under the hypothesis of Gaussian errors is introduced. The procedure combines the Kalman filter with recursive adjustment of the covariance matrices and a selection method for the number of harmonics in the trigonometric terms. A key feature of this method is that it allows estimating not only the states of the system but also the standard errors of the estimated parameters and the prediction intervals. In addition, this work presents a non-parametric bootstrap approach to improve the forecasting method based on Kalman filter recursions. The proposed framework is empirically explored with two real time series.
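The Kalman recursion at the heart of such procedures can be shown for the simplest local-level model rather than the paper's richer seasonal state vector; this is an illustrative sketch with made-up data.

```python
import numpy as np

def kalman_loglik(y, q=0.1, r=1.0, a0=0.0, p0=10.0):
    """Kalman filter for a local-level model: state a_t = a_{t-1} + noise
    (variance q), observation y_t = a_t + noise (variance r).  Returns the
    filtered states and the Gaussian log-likelihood that maximum
    likelihood estimation would maximise."""
    a, p, ll, states = a0, p0, 0.0, []
    for yt in y:
        p = p + q                          # predict
        f = p + r                          # innovation variance
        v = yt - a                         # innovation
        ll += -0.5 * (np.log(2 * np.pi * f) + v * v / f)
        k = p / f                          # Kalman gain
        a, p = a + k * v, (1 - k) * p      # update
        states.append(a)
    return np.array(states), ll

rng = np.random.default_rng(5)
level = np.cumsum(0.3 * rng.standard_normal(100))   # slowly drifting level
y = level + rng.standard_normal(100)                # noisy observations
states, ll = kalman_loglik(y, q=0.09, r=1.0)
```

The seasonal framework above enlarges the state vector with trigonometric terms, but the predict/update cycle and the log-likelihood accumulation are exactly this recursion.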

12.
This article deals with the estimation of the parametric component, which is of primary interest, in heteroscedastic semi-varying coefficient models. Based on the bootstrap technique, we present a procedure for estimating the parameters which provides a reliable approximation to the asymptotic distribution of the profile least-squares (PLS) estimator. Furthermore, a bootstrap-type estimator of the covariance matrix is developed and proved to be consistent. Simulation experiments are conducted to evaluate the finite-sample performance of the proposed methodology. Finally, the Australian CPI dataset is analyzed to demonstrate the application of the methods.

13.
Univariate time series often take the form of a collection of curves observed sequentially over time; hourly ground-level ozone concentration curves are an example. These curves can be viewed as a time series of functions observed at equally spaced intervals over a dense grid. Since functional time series may contain various types of outliers, we introduce a robust functional time series forecasting method to down-weight the influence of outliers in forecasting. Through a robust principal component analysis based on projection pursuit, a time series of functions can be decomposed into a set of robust dynamic functional principal components and their associated scores. Conditioning on the estimated functional principal components, the crux of the curve-forecasting problem lies in modelling and forecasting the principal component scores, here through a robust vector autoregressive forecasting method. Via a simulation study and an empirical study on forecasting ground-level ozone concentration, the robust method demonstrates the superior forecast accuracy of dynamic functional principal component regression. The robust method also shows superior estimation accuracy for the parameters of the vector autoregressive models used to model and forecast the principal component scores, and thus improves curve forecast accuracy.

14.
Singular spectrum analysis (SSA) is a non-parametric time series modelling technique where an observed time series is unfolded into the column vectors of a Hankel structured matrix, known as a trajectory matrix. For noise-free signals the column vectors of the trajectory matrix lie on a single R-flat. Singular value decomposition (SVD) can be used to find the orthonormal base vectors of the linear subspace parallel to this R-flat. SSA can essentially handle functions that are governed by a linear recurrent formula (LRF) and includes the broad class of functions proposed by Buchstaber [1994. Time series analysis and Grassmannians. Amer. Math. Soc. Transl. 162 (2), 1–17]. SSA is useful for modelling time series with complex cyclical patterns that increase over time. Various methods have been studied to extend SSA to several time series; see Golyandina et al. [2003. Variants of the Caterpillar SSA-method for analysis of multidimensional time series (in Russian), http://www.gistatgroup.com/cat/]. Earlier, Von Storch and Zwiers (1999) and Allen and Robertson (1996) (see Ghil et al. [2002. Advanced spectral methods for climatic time series. Rev. Geophys. 40 (1), 3.1–3.41]) used multi-channel SSA (M-SSA) to apply SSA to "grand" block matrices. Our approach differs from all of these by using the common principal components approach introduced by Flury [1988. Common Principal Components and Related Multivariate Models. Wiley, New York]. In this paper SSA is extended to several time series which are similar in some respects, e.g. cointegrated, i.e. sharing a common R-flat. Using Flury's common principal component (CPC) approach, SSA is extended to common singular spectrum analysis (CSSA), where common features of several time series can be studied.
CSSA decomposes the different original time series into the sum of a small number of common components related to common trend and oscillatory components, plus noise. The determination of the most likely dimension of the supporting linear subspace is studied using a heuristic approach and a hierarchical selection procedure.
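Ordinary single-series SSA, the starting point that CSSA generalises, can be sketched in a few lines: embed the series in a Hankel trajectory matrix, truncate its SVD, and average anti-diagonals to return to a series. Window length and rank below are chosen purely for illustration.

```python
import numpy as np

def ssa_reconstruct(x, window, rank):
    """Basic SSA sketch: Hankel embedding, truncated SVD, and
    Hankelisation (anti-diagonal averaging) back to a series.  CSSA
    replaces the per-series SVD with common principal components
    across several series."""
    n = len(x)
    k = n - window + 1
    traj = np.column_stack([x[i:i + window] for i in range(k)])   # trajectory matrix
    U, s, Vt = np.linalg.svd(traj, full_matrices=False)
    low = (U[:, :rank] * s[:rank]) @ Vt[:rank]                    # rank-truncated matrix
    rec = np.zeros(n)
    cnt = np.zeros(n)
    for j in range(k):                                            # anti-diagonal averaging
        rec[j:j + window] += low[:, j]
        cnt[j:j + window] += 1
    return rec / cnt

t = np.arange(200)
signal = np.sin(2 * np.pi * t / 20)          # a sinusoid is rank 2 in trajectory space
rng = np.random.default_rng(6)
noisy = signal + 0.3 * rng.standard_normal(200)
smooth = ssa_reconstruct(noisy, window=40, rank=2)
```

Because a pure sinusoid obeys a two-term linear recurrent formula, retaining rank 2 recovers it from the noise, which is the LRF property mentioned above.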

15.
For testing the problem of regions in the space of distribution functions, this paper considers approaches to modifying the bootstrap probability into a second-order accurate p-value based on the familiar bias correction and acceleration method. It is shown that Shimodaira's [2004a. Approximately unbiased tests of regions using multistep-multiscale bootstrap resampling. Ann. Statist. 32, 2616–2641] twostep-multiscale bootstrap method works even in the problem of regions in functional space. In this paper the bias correction quantity is estimated by his onestep-multiscale bootstrap method. Instead of using the twostep-multiscale bootstrap method, the acceleration constant is estimated by a newly proposed jackknife method which requires first-level bootstrap resamplings only. Some numerical examples are illustrated, including an application to testing significance in model selection.

16.
We discuss higher-order adjustments for a quasi-profile likelihood for a scalar parameter of interest, in order to alleviate some of the problems inherent to the presence of nuisance parameters, such as bias and inconsistency. Indeed, quasi-profile score functions for the parameter of interest have bias of order O(1), and such bias can lead to poor inference on the parameter of interest. The higher-order adjustments are obtained so that the adjusted quasi-profile score estimating function is unbiased and its variance is the negative expected derivative matrix of the adjusted profile estimating equation. The modified quasi-profile likelihood is then obtained as the integral of the adjusted profile estimating function. We discuss two methods for the computation of the modified quasi-profile likelihoods: a bootstrap simulation method and a first-order asymptotic expression, which can be simplified under an orthogonality assumption. Examples in the context of generalized linear models and of robust inference are provided, showing that the use of a modified quasi-profile likelihood ratio statistic may lead to coverage probabilities more accurate than those of first-order Wald-type confidence intervals.

17.
In this paper we consider the inferential aspect of the nonparametric estimation of a conditional function \(g(\mathbf {x}) = E[\psi (X_t) \mid \mathbf {X}_{t,m} = \mathbf {x}]\), where \(\mathbf {X}_{t,m}\) represents the vector containing the m conditioning lagged values of the series and \(\psi \) is an arbitrary measurable function. The local polynomial estimator of order p is used for the estimation of the function g and of its partial derivatives up to a total order p. We consider α-mixing processes, and we propose the use of a particular resampling method, the local polynomial bootstrap, for the approximation of the sampling distribution of the estimator. After analyzing the consistency of the proposed method, we present a simulation study which gives evidence of its finite-sample behaviour.
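A local polynomial fit of order p = 1 at a single point is the building block of such estimators. The sketch below uses a Gaussian kernel and a scalar covariate for simplicity (in the paper the covariate is the vector of lagged values); data and bandwidth are ours.

```python
import numpy as np

def local_linear(x, y, x0, h):
    """Local polynomial estimator of order p=1 at x0: weighted least
    squares of y on (x - x0) with Gaussian kernel weights.  The fitted
    intercept estimates g(x0); the slope estimates g'(x0)."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)          # kernel weights
    X = np.column_stack([np.ones_like(x), x - x0])  # local design matrix
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta[0]

rng = np.random.default_rng(7)
x = rng.uniform(-2, 2, 400)
y = np.sin(x) + 0.1 * rng.standard_normal(400)      # g(x) = sin(x) plus noise
est = local_linear(x, y, x0=1.0, h=0.3)
```

The local polynomial bootstrap described above resamples from such local fits to approximate the estimator's sampling distribution.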

18.
19.
Dynamic principal component analysis (DPCA), also known as frequency domain principal component analysis, was developed by Brillinger [Time Series: Data Analysis and Theory, Vol. 36, SIAM, 1981] to decompose multivariate time-series data into a few principal component series. A primary advantage of DPCA is its capability of extracting essential components from the data by reflecting their serial dependence. It is also used to estimate the common component in a dynamic factor model, which is frequently used in econometrics. However, this beneficial property cannot be exploited when missing values are present, and these should not simply be ignored when estimating the spectral density matrix in the DPCA procedure. Based on a novel combination of conventional DPCA and the self-consistency concept, we propose a DPCA method for data with missing values. We demonstrate the advantage of the proposed method over some existing imputation methods through Monte Carlo experiments and real data analysis.

20.
We study an autoregressive time series model with a possible change in the regression parameters. Approximations to the critical values for change-point tests are obtained through various bootstrapping methods. Theoretical results show that the bootstrapping procedures have the same limiting behavior as their asymptotic counterparts discussed in Hušková et al. [2007. On the detection of changes in autoregressive time series, I. Asymptotics. J. Statist. Plann. Inference 137, 1243–1259]. In fact, a small simulation study illustrates that the bootstrap tests behave better than the original asymptotic tests when performance is measured by the α- and β-errors, respectively.
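A much simpler cousin of these tests, a max-CUSUM statistic for a mean shift with a permutation approximation to its null critical value, conveys the mechanics of resampled change-point testing. This is entirely illustrative: the paper's tests concern autoregressive parameters and use genuine bootstrap schemes, and all names and data here are ours.

```python
import numpy as np

def cusum_stat(x):
    """Standardised max-CUSUM statistic for a change in mean."""
    n = len(x)
    s = np.cumsum(x - x.mean())
    return np.max(np.abs(s)) / (x.std(ddof=1) * np.sqrt(n))

rng = np.random.default_rng(8)
x0 = rng.standard_normal(200)                        # no change
x1 = np.concatenate([x0[:100], x0[100:] + 1.5])      # mean shift at t = 100

# permutation approximation of the no-change null distribution
null = np.array([cusum_stat(rng.permutation(x1)) for _ in range(200)])
crit = np.quantile(null, 0.95)                       # approximate critical value
```

Comparing `cusum_stat(x1)` with `crit` rejects the no-change hypothesis here, while the resampled critical value adapts to the data in the same spirit as the bootstrap critical values in the abstract.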


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号