Similar articles
20 similar articles found (search time: 202 ms)
1.
Classical nondecimated wavelet transforms are attractive for many applications. When the data come from complex or irregular designs, second generation wavelets have proved superior to classical wavelets in nonparametric regression. However, the construction of a nondecimated second generation wavelet transform is not obvious. In this paper we propose a new ‘nondecimated’ lifting transform, based on the lifting algorithm which removes one coefficient at a time, and explore its behavior. Our approach also allows adaptivity to be embedded in the transform: wavelet functions can be constructed so that their smoothness adjusts to the local properties of the signal. We address the problem of nonparametric regression and propose an (averaged) estimator obtained by teaming our nondecimated lifting technique with empirical Bayes shrinkage. Simulations show that the proposed method outperforms competing techniques able to work on irregular data. Our construction also opens avenues for generating a ‘best’ representation, which we shall explore.

2.
Statistical inference in the wavelet domain remains a vibrant area of contemporary statistical research because of desirable properties of wavelet representations and the need of the scientific community to process, explore, and summarize massive data sets. Prime examples are biomedical, geophysical, and internet-related data. We propose two new approaches to wavelet shrinkage/thresholding.

In the spirit of Efron and Tibshirani's recent work on the local false discovery rate, we propose the Bayesian Local False Discovery Rate (BLFDR), where the underlying model on wavelet coefficients does not assume known variances. This approach to wavelet shrinkage is shown to be connected with shrinkage based on Bayes factors. The second proposal, the Bayesian False Discovery Rate (BaFDR), is based on ordering the posterior probabilities that the true wavelet coefficients are null, in Bayesian testing of multiple hypotheses.

We demonstrate that both approaches result in competitive shrinkage methods by contrasting them with some popular shrinkage techniques.
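The BaFDR ordering step can be sketched in a few lines: sort the posterior null probabilities and keep the largest set whose average posterior null probability (the Bayesian FDR of the selection) stays below a control level. This is a minimal numpy illustration assuming the posterior null probabilities `p_null` have already been computed from some Bayesian wavelet model; the function name and the level `q` are ours, not the paper's.

```python
import numpy as np

def bayesian_fdr_select(p_null, q=0.05):
    """Select wavelet coefficients to keep by Bayesian FDR.

    p_null[i] is the posterior probability that coefficient i is null.
    Keep the largest set whose average posterior null probability
    stays at or below q.
    """
    p_null = np.asarray(p_null, dtype=float)
    order = np.argsort(p_null)
    # cumulative mean of the sorted null probabilities is non-decreasing
    cum_mean = np.cumsum(p_null[order]) / np.arange(1, len(p_null) + 1)
    k = int(np.sum(cum_mean <= q))          # size of the selected set
    keep = np.zeros(len(p_null), dtype=bool)
    keep[order[:k]] = True
    return keep
```

The kept coefficients are retained (or shrunk mildly) while the rest are set to zero before the inverse transform.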

3.
Summary.  Wavelet shrinkage is an effective nonparametric regression technique, especially when the underlying curve has irregular features such as spikes or discontinuities. The basic idea is simple: take the discrete wavelet transform of data consisting of a signal corrupted by noise; shrink or remove the wavelet coefficients to remove the noise; then invert the discrete wavelet transform to form an estimate of the true underlying curve. Various researchers have proposed increasingly sophisticated methods of doing this by using real-valued wavelets. Complex-valued wavelets exist but are rarely used. We propose two new complex-valued wavelet shrinkage techniques: one based on multiwavelet style shrinkage and the other using Bayesian methods. Extensive simulations show that our methods almost always give significantly more accurate estimates than methods based on real-valued wavelets. Further, our multiwavelet style shrinkage method is both simpler and dramatically faster than its competitors. To understand the excellent performance of this method we present a new risk bound on its hard thresholded coefficients.
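The three-step recipe (transform, shrink, invert) can be illustrated with a plain-numpy Haar transform and hard thresholding. This is a generic sketch of the basic real-valued pipeline the paper builds on, not the complex-valued method itself; the function names are ours.

```python
import numpy as np

def haar_dwt(x):
    """Full Haar DWT of a length-2^J signal: approx coefficient + details."""
    details = []
    a = np.asarray(x, dtype=float)
    while len(a) > 1:
        s = (a[0::2] + a[1::2]) / np.sqrt(2)   # smooth (approximation)
        d = (a[0::2] - a[1::2]) / np.sqrt(2)   # detail
        details.append(d)
        a = s
    return a, details

def haar_idwt(a, details):
    """Invert haar_dwt."""
    for d in reversed(details):
        out = np.empty(2 * len(a))
        out[0::2] = (a + d) / np.sqrt(2)
        out[1::2] = (a - d) / np.sqrt(2)
        a = out
    return a

def hard_threshold_denoise(y, lam):
    """Transform, kill detail coefficients below lam in magnitude, invert."""
    a, details = haar_dwt(y)
    details = [np.where(np.abs(d) > lam, d, 0.0) for d in details]
    return haar_idwt(a, details)
```

With `lam = 0` the pipeline reconstructs the input exactly; a common default in practice is the universal threshold `sigma * sqrt(2 * log(n))`.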

4.
Wavelets are a commonly used tool in science and technology. Often, their use involves applying a wavelet transform to the data, thresholding the coefficients and applying the inverse transform to obtain an estimate of the desired quantities. In this paper, we argue that it is often possible to gain more insight into the data by producing not just one, but many wavelet reconstructions using a range of threshold values and analysing the resulting object, which we term the Time–Threshold Map (TTM) of the input data. We discuss elementary properties of the TTM, in its “basic” and “derivative” versions, using both Haar and Unbalanced Haar wavelet families. We then show how the TTM can help in solving two statistical problems in the signal + noise model: breakpoint detection, and estimating the longest interval of approximate stationarity. We illustrate both applications with examples involving volatility of financial returns. We also briefly discuss other possible uses of the TTM.

5.
This paper defines and studies a new class of non-stationary random processes constructed from discrete non-decimated wavelets which generalizes the Cramér (Fourier) representation of stationary time series. We define an evolutionary wavelet spectrum (EWS) which quantifies how process power varies locally over time and scale. We show how the EWS may be rigorously estimated by a smoothed wavelet periodogram and how both these quantities may be inverted to provide an estimable time-localized autocovariance. We illustrate our theory with a pedagogical example based on discrete non-decimated Haar wavelets and also a real medical time series example.
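A raw wavelet periodogram from discrete non-decimated Haar wavelets, the starting point for EWS estimation (before the smoothing and bias-correction steps the paper describes), can be sketched as follows. At scale j the Haar filter has 2^(j-1) positive entries followed by 2^(j-1) negative entries, normalised to unit energy; with no decimation, the squared coefficient is recorded at every time point. This is our simplified illustration, not the paper's full estimator.

```python
import numpy as np

def haar_nondecimated_periodogram(x, num_scales=3):
    """Raw wavelet periodogram using discrete non-decimated Haar wavelets.

    Returns an array of shape (num_scales, len(x)): squared Haar wavelet
    coefficients at every time point and scale (no decimation).
    """
    x = np.asarray(x, dtype=float)
    rows = []
    for j in range(1, num_scales + 1):
        half = 2 ** (j - 1)
        filt = np.concatenate([np.full(half, 1.0),
                               np.full(half, -1.0)]) / np.sqrt(2 ** j)
        coeffs = np.convolve(x, filt, mode="same")  # shift-invariant transform
        rows.append(coeffs ** 2)
    return np.vstack(rows)
```

For a constant signal the interior coefficients vanish at every scale, since each Haar filter sums to zero; power appears only where the local level or variance changes.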

6.
Multiple outcomes are increasingly used to assess chronic disease progression. We discuss and show how desirability functions can be used to assess a patient's overall response to a treatment from multiple outcome measures, each of which may contribute unequally to the final assessment. Because judgments on disease progression and the relative contribution of each outcome can be subjective, we propose a data-driven approach that minimizes these biases by using desirability functions whose shapes and weights are estimated from a given gold standard. Our method provides each patient with a meaningful overall progression score that facilitates comparison and clinical interpretation. We also extend the methodology in a novel way to monitor patients' disease progression over multiple time points, and illustrate our method using a longitudinal data set from a randomized two-arm clinical trial for scleroderma patients.
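The basic machinery can be sketched with a Derringer–Suich style larger-is-better desirability and a weighted geometric mean for the overall score. The shape and weight estimation from a gold standard, which is the paper's actual contribution, is not shown; the function names and parameters are ours.

```python
import numpy as np

def desirability_larger_better(y, low, high, shape=1.0):
    """Map an outcome to [0, 1]: 0 at or below `low`, 1 at or above `high`,
    with `shape` controlling the curvature in between."""
    return np.clip((y - low) / (high - low), 0.0, 1.0) ** shape

def overall_desirability(d, weights):
    """Weighted geometric mean of per-outcome desirabilities: the overall
    score is 0 whenever any single outcome is fully undesirable."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return float(np.prod(np.asarray(d, dtype=float) ** w))
```

The geometric mean is the usual choice here because it penalizes any outcome near zero, matching the clinical intuition that one failed endpoint should dominate the composite.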

7.
A novel approach to solving the independent component analysis (ICA) model in the presence of noise is proposed. We use wavelets as natural denoising tools to solve the noisy ICA model, applying a multivariate wavelet denoising algorithm that allows for spatial and temporal dependency. We also propose a statistical approach, named nested design of experiments, to select parameters such as the wavelet family and the thresholding type; this technique helps us select a more suitable combination of parameters, and could be extended to many other problems in which one must choose among many parameter settings. The performance of the proposed method is illustrated on simulated data, with promising results. The suggested method is also applied to latent variable regression in the presence of noise on real data. The good results confirm the ability of multivariate wavelet denoising to solve the noisy ICA model.

8.
In this note we propose a newly formulated skew exponential power distribution that behaves substantially better than previously defined versions. The new model performs very well in terms of the large-sample behavior of maximum likelihood estimation when compared to the classical four-parameter model defined by Azzalini. More recent approaches to defining a skew exponential power distribution have used five or more parameters. Our approach improves upon previous attempts to extend the symmetric exponential power family to include skew alternatives by maintaining a minimal set of four parameters corresponding directly to location, scale, skewness and kurtosis. We illustrate the utility of the proposed model using translational and clinical data sets.
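A generic four-parameter skew exponential power density can be written down via Azzalini's skewing lemma: twice a symmetric exponential power (generalized normal) pdf times its own cdf evaluated at a skewed argument, which integrates to one for any skewness. This is a hedged sketch of that standard construction using `scipy.stats.gennorm`, not necessarily the exact parameterization the note proposes.

```python
import numpy as np
from scipy.stats import gennorm

def skew_ep_pdf(x, mu, sigma, lam, alpha):
    """Four-parameter skew exponential power density via Azzalini's lemma:
    (2/sigma) * f(z) * F(lam*z), with z = (x - mu)/sigma, where f and F
    are the symmetric exponential power pdf/cdf with shape alpha.
    mu = location, sigma = scale, lam = skewness, alpha = tail shape;
    lam = 0 recovers the symmetric exponential power density.
    """
    z = (np.asarray(x, dtype=float) - mu) / sigma
    return 2.0 / sigma * gennorm.pdf(z, alpha) * gennorm.cdf(lam * z, alpha)
```

By Azzalini's lemma the skewing factor does not disturb normalization, so the density integrates to one for every value of `lam`.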

9.
Density estimation for pre-binned data is challenging due to the loss of exact position information of the original observations. Traditional kernel density estimation methods cannot be applied when data are pre-binned in unequally spaced bins or when one or more bins are semi-infinite intervals. We propose a novel density estimation approach using the generalized lambda distribution (GLD) for data that have been pre-binned over a sequence of consecutive bins. This method enjoys the high power of the parametric model and the great shape flexibility of the GLD. The performance of the proposed estimators is benchmarked via simulation studies. Both simulation results and a real data application show that the proposed density estimators work well for data of moderate or large sizes.
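Since the GLD is defined through its quantile function, the bin probabilities needed for a binned-data likelihood are obtained by inverting that quantile function numerically at the bin edges. A minimal sketch using the Ramberg–Schmeiser parameterization (the paper's fitting procedure and exact parameterization may differ; the function names are ours):

```python
import numpy as np

def gld_quantile(u, lam1, lam2, lam3, lam4):
    """Ramberg-Schmeiser generalized lambda quantile function:
    Q(u) = lam1 + (u**lam3 - (1 - u)**lam4) / lam2."""
    u = np.asarray(u, dtype=float)
    return lam1 + (u ** lam3 - (1.0 - u) ** lam4) / lam2

def gld_cdf(x, params, tol=1e-10):
    """CDF value F(x) by bisection on the monotone quantile function."""
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if gld_quantile(mid, *params) < x:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def gld_bin_probs(edges, params):
    """Probability mass the GLD assigns to each consecutive bin."""
    F = np.array([gld_cdf(e, params) for e in edges])
    return np.diff(F)
```

These bin probabilities can then be plugged into a multinomial likelihood over the observed bin counts and maximized over the four lambda parameters.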

10.
Summary. We use cumulants to derive Bayesian credible intervals for wavelet regression estimates. The first four cumulants of the posterior distribution of the estimates are expressed in terms of the observed data and integer powers of the mother wavelet functions. These powers are closely approximated by linear combinations of wavelet scaling functions at an appropriate finer scale. Hence, a suitable modification of the discrete wavelet transform allows the posterior cumulants to be found efficiently for any given data set. Johnson transformations then yield the credible intervals themselves. Simulations show that these intervals have good coverage rates, even when the underlying function is inhomogeneous, where standard methods fail. In the case where the curve is smooth, the performance of our intervals remains competitive with established nonparametric regression methods.

11.
We introduce a new goodness-of-fit test which can be applied to hypothesis testing about the marginal distribution of dependent data. We derive a test for the equivalent hypothesis in the space of wavelet coefficients. Properties of the wavelet transform such as orthogonality, localisation and sparsity make hypothesis testing easier in the wavelet domain than in the domain of distribution functions. We propose testing the null hypothesis separately at each wavelet decomposition level, both to overcome the bi-dimensionality of wavelet indices and, when the null hypothesis is rejected, to identify the frequency at which the empirical distribution function differs from the null. We suggest a test statistic and state its asymptotic distribution under the null and under some of the alternative hypotheses.

12.
Summary.  The purpose of the paper is to propose a frequency domain approach for irregularly spaced data on R^d. We extend the original definition of the periodogram for time series to irregularly spaced data and define non-parametric and parametric spectral density estimators in a way that parallels the classical approach. Introducing mixed asymptotics, one of the asymptotic frameworks for irregularly spaced data, makes it possible to provide asymptotic theory for the spectral estimators. The asymptotic result for the parametric estimator can be regarded as a natural extension of the classical result for regularly spaced data. Empirical studies are included to illustrate the frequency domain approach in comparison with existing spatial and frequency domain approaches.

13.
Locally stationary wavelet (LSW) processes, built on non-decimated wavelets, can be used to analyse and forecast non-stationary time series, and they have proved useful in the analysis of financial data. In this paper, we first carry out a sensitivity analysis and then propose practical guidelines for choosing the wavelet bases for these processes. The existing forecasting algorithm is found to be vulnerable to outliers, and a new algorithm is proposed to overcome this weakness. The new algorithm is shown to be stable and to outperform the existing algorithm when applied to real financial data. The volatility forecasting ability of LSW modelling based on our new algorithm is then discussed and shown to be competitive with traditional GARCH models.

14.
This paper investigates modelling and forecasting methods for non-stationary time series. Using wavelets, the authors propose a modelling procedure that decomposes the series into the sum of three separate components, namely trend, harmonic and irregular components. The estimators suggested in this paper are all consistent. The method has been used to model US dollar against DM exchange rate data, and ten-step-ahead (2-week) forecasts are compared with those of several other methods. Under the Average Percentage of forecasting Error (APE) criterion, the wavelet approach performs best. The results suggest that forecasting based on wavelets is a viable alternative to existing methods.

15.
We propose a method to maximize the accuracy of estimating piecewise constant and piecewise smooth variance functions in a nonparametric heteroscedastic fixed design regression model. Difference-based initial estimates are obtained from the given observations. An estimator is then constructed using an iterative regularization method with the analysis-prior undecimated three-level Haar transform as the regularization term. This method gives better results in the mean-square sense than an existing adaptive estimation procedure on all the standard test functions considered, in addition to the functions that we target. Simulations and comparisons with other methods are conducted to assess the performance of the proposed method.
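The difference-based initial step is standard and easy to sketch: if the mean function is smooth, adjacent observations share roughly the same mean, so squared first differences divided by two are (noisy) local variance estimates. The iterative Haar-regularized stage that refines them is not shown here.

```python
import numpy as np

def difference_based_variance(y):
    """Initial local variance estimates from first differences.

    Under a smooth mean function m, y[i+1] - y[i] ~ e[i+1] - e[i],
    so E[(y[i+1] - y[i])**2] is approximately 2 * sigma^2 at x_i;
    halving the squared differences gives the initial estimates.
    """
    d = np.diff(np.asarray(y, dtype=float))
    return d ** 2 / 2.0
```

These raw estimates are very noisy (each is a chi-squared variable with one degree of freedom), which is exactly why a regularization or smoothing stage is applied on top of them.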

16.
In this paper a new multivariate regression estimate is introduced. It is based on ideas derived in the context of wavelet estimates and is constructed by hard thresholding of estimates of coefficients of a series expansion of the regression function. Multivariate functions constructed analogously to the classical Haar wavelets are used for the series expansion. These functions are orthogonal in L2(μn), where μn denotes the empirical design measure. The construction can be considered as designing adapted Haar wavelets.
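One way to see the construction: a Haar-like step function can be made orthonormal with respect to the empirical design measure by choosing its two constant values from the observation counts on each side of a split point. A one-dimensional sketch (the paper's construction is multivariate; the function name and split rule are ours):

```python
import numpy as np

def empirical_haar(x_design, split):
    """Step function psi with empirical mean 0 and empirical norm 1.

    psi = a on {x <= split} and -b on {x > split}, with a = sqrt(n2/n1),
    b = sqrt(n1/n2) chosen so that (1/n) * sum(psi) = 0 and
    (1/n) * sum(psi**2) = 1 under the empirical measure mu_n.
    """
    x = np.asarray(x_design, dtype=float)
    n1 = int(np.sum(x <= split))
    n2 = len(x) - n1
    a = np.sqrt(n2 / n1)
    b = np.sqrt(n1 / n2)
    return np.where(x <= split, a, -b)
```

Because orthonormality is defined relative to the observed design points rather than Lebesgue measure, the basis automatically adapts to irregular or clustered designs.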

17.
The multiscale local polynomial transform, developed in this paper, combines the benefits of local polynomial smoothing with sparse multiscale decompositions. The contribution of the paper is twofold. First, it focuses on the bandwidths used throughout the transform. These bandwidths operate as user-controlled scales in a multiscale analysis, which is of particular interest in the case of nonequispaced data. The paper presents both a likelihood-based optimal bandwidth selection and a fast, heuristic approach. The second contribution is the combination of local polynomial smoothing with orthogonal prefilters, similar to Daubechies' wavelet filters, but defined on irregularly spaced covariate values.

18.
Functional data analysis has emerged as a new area of statistical research with a wide range of applications. In this paper, we propose novel models based on wavelets for spatially correlated functional data. These models enable one to regularize curves observed over space and predict curves at unobserved sites. We compare the performance of these Bayesian models with several priors on the wavelet coefficients using the posterior predictive criterion. The proposed models are illustrated in the analysis of porosity data.

19.

In many clinical studies, patients are followed over time with their responses measured longitudinally. Using mixed model theory, one can characterize these data using a wide array of across-subject models. A state-space representation of the mixed effects model and use of the Kalman filter allow great flexibility in choosing the within-subject error correlation structure, even in the presence of missing or unequally spaced observations. Furthermore, the state-space approach avoids inverting large matrices, resulting in efficient computation, and allows detailed inference about the error correlation structure. We consider a bivariate situation where the longitudinal responses are unequally spaced and assume that the within-subject errors follow a continuous first-order autoregressive (CAR(1)) structure. Since a large number of nonlinear parameters need to be estimated, the modeling strategy and numerical techniques are critical in the process. We developed both a Visual Fortran® and a SAS® program for modeling such data. A simulation study was conducted to investigate the robustness of the model assumptions. We also use data from a psychiatric study to demonstrate our model fitting procedure.
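The state-space idea can be sketched for a univariate response: a CAR(1) error observed at arbitrary time points discretises exactly to an AR(1) with coefficient exp(-phi*dt), and the Kalman filter then delivers the Gaussian log-likelihood in one forward pass, with no large matrix inversion. A minimal numpy sketch, assuming a zero-mean state and an added measurement-noise variance `tau2`; the names and this simplified model are ours, not the paper's bivariate mixed model.

```python
import numpy as np

def car1_kalman_loglik(times, y, phi, sigma2, tau2):
    """Gaussian log-likelihood of unequally spaced observations whose
    errors follow a CAR(1) state observed with measurement noise tau2.

    phi > 0 is the CAR(1) decay rate (correlation over gap dt is
    exp(-phi*dt)); sigma2 is the stationary variance of the state.
    """
    x, P = 0.0, sigma2                        # stationary initial state
    ll = 0.0
    prev_t = times[0]
    first = True
    for t, obs in zip(times, y):
        if not first:
            a = np.exp(-phi * (t - prev_t))   # exact AR(1) coefficient
            x = a * x                         # predict state
            P = a * a * P + sigma2 * (1.0 - a * a)
        first = False
        prev_t = t
        S = P + tau2                          # innovation variance
        v = obs - x                           # innovation
        ll += -0.5 * (np.log(2 * np.pi * S) + v * v / S)
        K = P / S                             # Kalman gain
        x = x + K * v                         # update state
        P = (1.0 - K) * P
    return ll
```

The cost is linear in the number of observations, which is why the filter scales where direct inversion of the n-by-n covariance matrix does not.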

20.
The article studies a time-varying coefficient time series model in which some of the covariates are measured with additive errors. In order to overcome the bias in the estimator of the coefficient functions when measurement errors are ignored, we propose a modified least squares estimator based on wavelet procedures. The advantage of the wavelet method is that it avoids the restrictive smoothness requirements placed on the varying-coefficient functions by traditional smoothing approaches, such as kernel and local polynomial methods. The asymptotic properties of the proposed wavelet estimators are established under α-mixing conditions and without specifying the error distribution. These results can be used to make asymptotically valid statistical inference.
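A classical attenuation correction illustrates the bias issue the modified estimator addresses: with covariates observed as W = X + U, the cross-product E[W'W] inflates by n times the error covariance Σ_uu, so subtracting that inflation before solving the normal equations removes the bias. This is a generic corrected least squares sketch (with constant coefficients and known Σ_uu), not the paper's wavelet-based time-varying estimator.

```python
import numpy as np

def corrected_ls(W, y, sigma_uu):
    """Measurement-error-corrected least squares.

    W is the (n, p) matrix of error-prone covariates, y the response,
    sigma_uu the (p, p) covariance of the additive measurement errors.
    Since E[W.T @ W] = X.T @ X + n * sigma_uu, subtracting the inflation
    term restores consistency of the slope estimate.
    """
    W = np.asarray(W, dtype=float)
    y = np.asarray(y, dtype=float)
    n = W.shape[0]
    return np.linalg.solve(W.T @ W - n * sigma_uu, W.T @ y)
```

With `sigma_uu = 0` this reduces to ordinary least squares; ignoring a nonzero `sigma_uu` shrinks the estimated slopes toward zero, which is the bias the article's estimator is designed to avoid.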


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号