Similar Literature
20 similar records found (search time: 31 ms)
1.
A fund net asset value (NAV) forecasting model based on wavelet analysis is proposed. The model first denoises the NAV series with an improved wavelet threshold denoising method, and then fits a time-series autoregressive model from econometrics to the resulting smoother series for short-term forecasting. The study shows that this model captures short-term NAV trends well and that its forecasts outperform traditional NAV prediction models.
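The pipeline this abstract describes (wavelet threshold denoising followed by an autoregressive forecast) can be sketched in a few lines of pure Python. The one-level Haar transform, the fixed threshold `t`, and the intercept-free AR(1) fit below are deliberate simplifications, not the paper's improved thresholding method:

```python
import math

def haar_dwt(x):
    """One-level Haar transform: returns (approximation, detail) coefficients."""
    a = [(x[2*i] + x[2*i+1]) / math.sqrt(2) for i in range(len(x) // 2)]
    d = [(x[2*i] - x[2*i+1]) / math.sqrt(2) for i in range(len(x) // 2)]
    return a, d

def haar_idwt(a, d):
    """Inverse of the one-level Haar transform."""
    x = []
    for ai, di in zip(a, d):
        x.append((ai + di) / math.sqrt(2))
        x.append((ai - di) / math.sqrt(2))
    return x

def soft(c, t):
    """Soft-threshold a coefficient: shrink its magnitude toward zero by t."""
    return math.copysign(max(abs(c) - t, 0.0), c)

def denoise(x, t):
    """Denoise by soft-thresholding the detail coefficients only."""
    a, d = haar_dwt(x)
    return haar_idwt(a, [soft(di, t) for di in d])

def ar1_forecast(x):
    """One-step AR(1) forecast x_{t+1} = phi * x_t, phi by least squares (no intercept)."""
    num = sum(x[i] * x[i+1] for i in range(len(x) - 1))
    den = sum(xi * xi for xi in x[:-1])
    return (num / den) * x[-1]
```

In use, one would denoise the NAV series first and then feed the smoothed series to the autoregression, e.g. `ar1_forecast(denoise(nav, t))`.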

2.
We can use wavelet shrinkage to estimate a possibly multivariate regression function g under the general regression setup, y = g + ε. We propose an enhanced wavelet-based denoising methodology that combines Bayesian adaptive multiresolution shrinkage, an effective Bayesian shrinkage rule, with a semi-supervised learning mechanism. The Bayesian shrinkage rule is advanced by a semi-supervised learning method in which the neighboring structure of a wavelet coefficient is adopted and an appropriate decision function is derived. According to the decision function, each wavelet coefficient follows one of two prespecified Bayesian rules obtained using different related parameters. The decision for a wavelet coefficient thus depends not only on its magnitude, but also on the neighboring structure in which the coefficient is located. We discuss the theoretical properties of the suggested method and provide recommended parameter settings. Extensive experimentation shows that the proposed method is often superior to several existing wavelet denoising methods.
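A toy version of the neighborhood-dependent decision idea can be sketched as follows. The energy cut-off and the two soft thresholds are illustrative stand-ins for the paper's two Bayesian rules, chosen only to show how a coefficient's fate can depend on its neighbors:

```python
import math

def neighbour_energy(coeffs, i):
    """Sum of squared magnitudes over the coefficient and its immediate neighbours."""
    lo, hi = max(i - 1, 0), min(i + 2, len(coeffs))
    return sum(c * c for c in coeffs[lo:hi])

def shrink(coeffs, strong_t, weak_t, energy_cut):
    """Pick one of two soft-threshold rules per coefficient: a light threshold
    where the neighbourhood is energetic (likely signal), a heavy one where it
    is quiet (likely noise). Thresholds and cut-off are illustrative."""
    out = []
    for i, c in enumerate(coeffs):
        t = weak_t if neighbour_energy(coeffs, i) > energy_cut else strong_t
        out.append(math.copysign(max(abs(c) - t, 0.0), c))
    return out
```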

3.
We present theoretical results on the covariance structure of random wavelet coefficients. We use simple properties of the coefficients to derive a recursive way to compute the within- and across-scale covariances. We point out a useful link between the proposed algorithm and the two-dimensional discrete wavelet transform. We then focus on Bayesian wavelet shrinkage for estimating a function from noisy data. A prior distribution is imposed on the coefficients of the unknown function. We show how our findings on the covariance structure make it possible to specify priors that take into account the full correlation between coefficients through a parsimonious number of hyperparameters. We use Markov chain Monte Carlo methods to estimate the parameters and illustrate our method on benchmark simulated signals.

4.
This paper addresses, via thresholding, the estimation of a possibly sparse signal observed subject to Gaussian noise. Conceptually, the optimal threshold for such problems depends upon the strength of the underlying signal. We propose two new methods that aim to adapt to potential local variation in this signal strength and to select a variable threshold accordingly. Our methods are based upon an empirical Bayes approach with a smoothly varying mixing weight chosen via either spline- or kernel-based marginal maximum likelihood regression. We demonstrate the excellent performance of our methods in both one- and two-dimensional estimation when compared to various alternative techniques. In addition, we consider the application to wavelet denoising, where local adaptivity significantly improves reconstruction quality.
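The contrast between a fixed threshold and a locally varying one can be sketched like this. The kernel-smoothed rule below, which lowers the threshold where nearby coefficients look strong, is an illustrative substitute for the paper's marginal-maximum-likelihood mixing weight:

```python
import math

def hard(c, t):
    """Hard thresholding: keep the coefficient if it exceeds t, else zero it."""
    return c if abs(c) > t else 0.0

def soft(c, t):
    """Soft thresholding: shrink the coefficient's magnitude by t."""
    return math.copysign(max(abs(c) - t, 0.0), c)

def local_thresholds(coeffs, bandwidth=2.0, base=1.0):
    """Gaussian-kernel-smoothed local signal strength -> a smaller threshold
    where the signal looks strong, a larger one where it looks weak
    (an illustrative rule, not the paper's empirical Bayes weight)."""
    n = len(coeffs)
    thresholds = []
    for i in range(n):
        w_sum, s_sum = 0.0, 0.0
        for j in range(n):
            w = math.exp(-((i - j) / bandwidth) ** 2)
            w_sum += w
            s_sum += w * abs(coeffs[j])
        strength = s_sum / w_sum
        thresholds.append(base / (1.0 + strength))
    return thresholds
```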

5.
We propose a density-tempered marginalized sequential Monte Carlo (SMC) sampler, a new class of samplers for full Bayesian inference in general state-space models. The dynamic states are approximately marginalized out using a particle filter, and the parameters are sampled via a sequential Monte Carlo sampler over a density-tempered bridge between the prior and the posterior. Our approach delivers exact draws from the joint posterior of the parameters and the latent states for any given number of state particles, and is thus easily parallelizable in implementation. We also build into the proposed method a device that can automatically select a suitable number of state particles. Since the method incorporates sample information in a smooth fashion, it delivers good performance in the presence of outliers. We check the performance of the density-tempered SMC algorithm using simulated data based on a linear Gaussian state-space model, with and without misspecification. We also apply it to real stock prices using a GARCH-type model with microstructure noise.

6.
This article proposes a simple nonparametric method to estimate the jump characteristics of an asset price from noisy high-frequency data. We combine the pre-averaging approach and the threshold technique to identify jumps, and then propose pre-averaging threshold estimators for the number and sizes of the jumps that occurred. We further present the asymptotic properties of the proposed estimators. Monte Carlo simulation shows that the estimators are robust to microstructure noise and work very well, especially when the data frequency is ultra-high. Finally, an empirical example further demonstrates the power of the proposed method.
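The two-stage idea (pre-average to damp microstructure noise, then threshold to flag jumps) can be sketched as follows. The window length `k`, the MAD-based noise scale, and the constant `c` are illustrative choices, not the paper's tuning:

```python
def pre_average(returns, k):
    """Average k consecutive returns to damp microstructure noise."""
    return [sum(returns[i:i+k]) / k for i in range(len(returns) - k + 1)]

def detect_jumps(returns, k=2, c=4.0):
    """Flag pre-averaged returns exceeding c times a robust noise scale
    (median absolute deviation / 0.6745); returns (index, size) pairs."""
    bars = pre_average(returns, k)
    med = sorted(abs(b) for b in bars)[len(bars) // 2]
    scale = med / 0.6745 if med > 0 else 1e-12  # guard against an all-zero sample
    return [(i, b) for i, b in enumerate(bars) if abs(b) > c * scale]
```

Note that a single jump is smeared across the k-length pre-averaging window, so it is flagged at the k overlapping windows that contain it.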

7.
Variable selection is fundamental to high-dimensional multivariate generalized linear models. The smoothly clipped absolute deviation (SCAD) method can solve the problem of simultaneous variable selection and estimation. The choice of the tuning parameter in the SCAD method is critical, as it controls the complexity of the selected model. This article proposes a criterion to select the tuning parameter for the SCAD method in multivariate generalized linear models, which is shown to identify the true model consistently. Simulation studies are conducted to support the theoretical findings, and two real-data analyses are given to illustrate the proposed method.
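For reference, the SCAD penalty has a well-known three-piece form (due to Fan and Li, with the customary default a = 3.7); a minimal sketch, independent of this paper's tuning-parameter criterion:

```python
def scad_penalty(beta, lam, a=3.7):
    """SCAD penalty: linear (lasso-like) near zero, quadratic transition,
    then constant, so large coefficients are not over-shrunk."""
    b = abs(beta)
    if b <= lam:
        return lam * b
    if b <= a * lam:
        return (2 * a * lam * b - b * b - lam * lam) / (2 * (a - 1))
    return lam * lam * (a + 1) / 2
```

The three pieces join continuously at |beta| = lam and |beta| = a*lam, which is what makes the penalized objective well behaved.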

8.
An important aspect in the modelling of biological phenomena in living organisms, whether the measurements are of blood pressure, enzyme levels, biomechanical movements or heartbeats, etc., is time variation in the data. Thus, the recovery of a 'smooth' regression or trend function from noisy time-varying sampled data becomes a problem of particular interest. Here we use non-linear wavelet thresholding to estimate a regression or trend function in the presence of additive noise which, in contrast to most existing models, does not need to be stationary. (Here, non-stationarity means that the spectral behaviour of the noise is allowed to change slowly over time.) We develop a procedure to adapt existing threshold rules to such situations, e.g. that of a time-varying variance in the errors. Moreover, in the model of curve estimation for functions belonging to a Besov class with locally stationary errors, we derive a near-optimal rate for the L2-risk between the unknown function and our soft or hard threshold estimator, which holds in the general case of an error distribution with bounded cumulants. In the case of Gaussian errors, a lower bound on the asymptotic minimax rate in the wavelet coefficient domain is also obtained. It is also argued that a stronger adaptivity result is possible by the use of a particular location- and level-dependent threshold obtained by minimizing Stein's unbiased estimate of the risk. In this respect, our work generalizes previous results, which cover the situation of correlated, but stationary, errors. A natural application of our approach is the estimation of the trend function of non-stationary time series under the model of local stationarity. The method is illustrated on both an interesting simulated example and a biostatistical data set, measurements of sheep luteinizing hormone, which exhibits a clear non-stationarity in its variance.

9.
This work treats non-parametric estimation of multivariate probability mass functions using multivariate discrete associated kernels. We propose a local Bayesian approach to select the matrix of bandwidths, considering the multivariate Dirac Discrete Uniform kernel and the product of binomial kernels, and treating the bandwidths as a diagonal matrix of parameters with some prior distribution. The performance of this approach is compared with that of the cross-validation method using simulations and real count data sets. The results show that the local Bayesian method outperforms cross-validation in terms of integrated squared error.
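A univariate sketch of the Dirac Discrete Uniform kernel idea may clarify it: mass 1 - h sits at the target point and the remaining mass h is spread uniformly over the rest of the support. The multivariate, matrix-bandwidth version in the paper would use a product of such terms; the form below is an illustrative simplification:

```python
def ddu_kernel(x, y, h, support):
    """Dirac Discrete Uniform kernel: mass 1-h at the target x, and h spread
    uniformly over the remaining support points (illustrative univariate form)."""
    if y == x:
        return 1.0 - h
    return h / (len(support) - 1)

def pmf_estimate(data, h, support):
    """Smoothed pmf: average the kernel over the sample at each support point."""
    return {y: sum(ddu_kernel(x, y, h, support) for x in data) / len(data)
            for y in support}
```

With h = 0 the estimate reduces to the empirical frequencies; larger h smooths mass toward unobserved support points, and the estimate always sums to one.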

10.
Clustering algorithms are widely used in mining data streams because of their ability to handle unbounded data flows. Although these algorithms perform well at uncovering latent relationships in data streams, most of them lose cluster purity and become unstable when the input data streams contain too many noisy variables. In this article, we propose a clustering algorithm for data streams with noisy variables. Simulation results show that our proposed method improves on previous studies by adding a variable selection process as a component of the clustering algorithm. The results of two experiments indicate that clustering data streams with such a variable selection step is more stable and yields better purity than clustering without it. A further experiment on the KDD-CUP99 dataset also shows that our algorithm generates more stable results.

11.
In this article, a novel hybrid method to forecast stock prices is proposed. The hybrid method is based on the wavelet transform, wavelet denoising, linear models (the autoregressive integrated moving average (ARIMA) model and the exponential smoothing (ES) model), and nonlinear models (the BP neural network and the RBF neural network). The wavelet transform provides a set of better-behaved constitutive series than the raw stock series for prediction. Wavelet denoising is used to eliminate slight random fluctuations in the stock series. The ARIMA and ES models forecast the linear component of the denoised stock series, and the BP and RBF neural networks are then used as nonlinear pattern-recognition tools to correct the estimation error of the linear models' predictions. The proposed method is examined on the Shanghai and Shenzhen stock markets, and the results are compared with some of the most recent stock price forecasting methods. The results show that the proposed hybrid method considerably improves forecasting accuracy. The method can also be applied to reliability analysis and forecasting for products or systems, improving the accuracy of reliability engineering.

12.
Wavelet Threshold Estimators for Data with Correlated Noise
Wavelet threshold estimators for data with stationary correlated noise are constructed by applying a level-dependent soft threshold to the coefficients in the wavelet transform. A variety of threshold choices is proposed, including one based on an unbiased estimate of mean-squared error. The practical performance of the method is demonstrated on examples, including data from a neurophysiological context. The theoretical properties of the estimators are investigated by comparing them with an ideal but unattainable 'benchmark', which in the wavelet context can be considered as the risk obtained by ideal spatial adaptivity, and more generally is obtained by the use of an 'oracle' that provides information not actually available in the data. It is shown that the level-dependent threshold estimator performs well relative to the benchmark risk, and that its minimax behaviour cannot be improved on in order of magnitude by any other estimator. The wavelet-domain structure of both short- and long-range dependent noise is considered, and in both cases it is shown that the estimators have near-optimal behaviour simultaneously over a wide range of function classes, adapting automatically to the regularity properties of the underlying model. The proofs of the main results are obtained by considering a more general multivariate normal decision-theoretic problem.
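The key point for correlated noise is that each resolution level gets its own threshold, because correlation makes the noise variance level-dependent. A minimal sketch, using a multi-level Haar transform and the standard universal threshold with a per-level MAD noise estimate (standard choices, not necessarily the exact rules of this paper):

```python
import math

def haar_levels(x, levels):
    """Multi-level Haar transform: returns the final approximation plus a list
    of detail coefficients, one entry per level."""
    details = []
    a = list(x)
    for _ in range(levels):
        d = [(a[2*i] - a[2*i+1]) / math.sqrt(2) for i in range(len(a) // 2)]
        a = [(a[2*i] + a[2*i+1]) / math.sqrt(2) for i in range(len(a) // 2)]
        details.append(d)
    return a, details

def level_threshold(d):
    """Universal threshold sigma_j * sqrt(2 log n_j) with a level-specific MAD
    noise estimate, so each level j gets its own sigma_j."""
    med = sorted(abs(c) for c in d)[len(d) // 2]
    sigma = med / 0.6745
    return sigma * math.sqrt(2 * math.log(max(len(d), 2)))
```

Soft-thresholding the coefficients of level j by `level_threshold(details[j])` then gives the level-dependent estimator.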

13.
Some quality characteristics are well defined when treated as response variables whose relationship to some independent variables is identified. This relationship is called a profile. Parametric models, such as linear models, may be used to model profiles. However, due to the complexity of many processes in practical applications, it is often inappropriate to model the process parametrically; in these cases nonparametric methods are used. One of the most widely applicable nonparametric tools for modelling complicated profiles is the wavelet transform. Many authors have considered the use of the wavelet transformation only for monitoring processes in phase II; the problem of estimating the in-control profile in phase I using the wavelet transformation has not been deeply addressed. Usually, classical estimators are used in phase I to estimate in-control profiles, even when the wavelet transformation is used. These estimators are suitable if the data contain no outliers; when outliers exist, they cannot estimate the in-control profile properly. In this research, a robust method of estimating in-control profiles is proposed that is insensitive to the presence of outliers and can be applied when the wavelet transformation is used. The proposed estimator combines robust clustering with the S-estimator, and is compared with the classical estimator of the in-control profile in the presence of outliers. Results from a large simulation study show that, using the proposed method, one can estimate the in-control profile precisely when the data are contaminated either locally or globally.

14.
This article presents a Bayesian method to reconstruct the centerline in noisy data using B-spline curves. The method is illustrated on simulated two- and three-dimensional data and is applied to recover the centerline of the colon in single photon emission computed tomography images.

15.
In studies that involve censored time-to-event data, stratification is frequently encountered for different reasons, such as stratified sampling or model adjustment due to violation of model assumptions. Often, the main interest is not in the stratification variables, and the cluster-related parameters are treated as nuisance parameters. When inference concerns a parameter of interest in the presence of many nuisance parameters, standard likelihood methods often perform very poorly and may lead to severe bias. This problem is particularly evident in models for clustered data with cluster-specific nuisance parameters, when the number of clusters is high relative to the within-cluster size. However, it is still unclear how the presence of censoring affects this issue. We consider clustered failure time data with independent censoring, and propose frequentist inference based on an integrated likelihood. We then apply the proposed approach to a stratified Weibull model. Simulation studies show that appropriately defined integrated likelihoods provide very accurate inferential results in all circumstances, such as for highly clustered data or heavy censoring, even in extreme settings where standard likelihood procedures lead to strongly misleading results. We show that the proposed method generally performs as well as the frailty model, but is superior when the frailty distribution is seriously misspecified. An application concerning treatments for a common disease in late-stage HIV-infected people illustrates the proposed inferential method in Weibull regression models, and compares the inferential conclusions of alternative methods.

16.
Robust mixture modelling using the t distribution
Normal mixture models are increasingly used to model the distributions of a wide variety of random phenomena and to cluster sets of continuous multivariate data. However, for a set of data containing a group or groups of observations with longer than normal tails or atypical observations, the use of normal components may unduly affect the fit of the mixture model. In this paper, we consider a more robust approach by modelling the data with a mixture of t distributions. The use of the ECM algorithm to fit this t mixture model is described, and examples of its use are given in the context of clustering multivariate data in the presence of atypical observations in the form of background noise.
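The robustness mechanism of the t mixture is visible already in the E-step weights u = (nu + p) / (nu + delta^2), which downweight observations far from a component centre. The univariate, single-component sketch below (with a fixed degrees-of-freedom nu) is a deliberate simplification of the full ECM algorithm:

```python
def t_weight(delta_sq, nu, p=1):
    """E-step weight u = (nu + p) / (nu + delta^2): observations with a large
    squared Mahalanobis distance delta^2 get a small weight, which is what
    makes the t mixture robust to outliers (p = data dimension)."""
    return (nu + p) / (nu + delta_sq)

def weighted_mean(xs, mu, nu):
    """One CM-step update of a univariate component mean using the t weights."""
    ws = [t_weight((x - mu) ** 2, nu) for x in xs]
    return sum(w * x for w, x in zip(ws, xs)) / sum(ws)
```

Iterating `mu = weighted_mean(xs, mu, nu)` to convergence gives a robust location estimate; a gross outlier barely moves it, whereas the plain mean is dragged toward the outlier.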

17.
In this paper, we propose a penalized likelihood method to simultaneously select covariates and mixing components and to estimate parameters in localized mixture-of-experts models. We develop an expectation-maximization algorithm to solve the proposed penalized likelihood procedure, and introduce a data-driven procedure to select the tuning parameters. Extensive numerical studies compare the finite-sample performance of our proposed method with that of existing methods. Finally, we apply the proposed methodology to the Boston housing price data set and the baseball salaries data set.

18.
In recent years, wavelet shrinkage has become a very appealing method for data denoising and density function estimation. In particular, Bayesian modelling via hierarchical priors has introduced novel approaches to wavelet analysis that have become very popular and are very competitive with standard hard or soft thresholding rules. This paper proposes a hierarchical prior elicited on the model parameters describing the wavelet coefficients after applying a discrete wavelet transformation (DWT). In contrast to other approaches, the prior uses a multivariate normal distribution with a covariance matrix that allows for correlations among wavelet coefficients at the same level of detail. In addition, an extra scale parameter is incorporated that permits an additional level of shrinkage over the coefficients. The posterior distribution for this shrinkage procedure is not available in closed form, but it is easily sampled through Markov chain Monte Carlo (MCMC) methods. Applications to a set of test signals and two noisy signals are presented.

19.
The aim of this paper is to explore variable selection approaches in the partially linear proportional hazards model for multivariate failure time data. A new penalised pseudo-partial likelihood method is proposed to select important covariates. Under certain regularity conditions, we establish the rate of convergence and asymptotic normality of the resulting estimates. We further show that the proposed procedure can correctly select the true submodel, as if it were known in advance. Both simulated and real data examples are presented to illustrate the proposed methodology.

20.
Modelling udder infection data using copula models for quadruples
We study copula models for correlated infection times in the four udder quarters of dairy cows. Both a semi-parametric and a nonparametric approach are considered to estimate the marginal survival functions, taking into account the effect of a binary udder-quarter-level covariate. We use a two-stage estimation approach and briefly discuss the asymptotic behaviour of the estimators obtained in the first and second stages of the estimation. A pseudo-likelihood ratio test is used to select an appropriate copula from the power variance copula family that describes the association between the outcomes in a cluster. We propose a new bootstrap algorithm to obtain the p-value for this test. This bootstrap algorithm also provides estimates for the standard errors of the estimated parameters in the copula. The proposed methods are applied to the udder infection data. A small simulation study for a setting similar to that of the udder infection data gives evidence that the proposed method provides a valid approach to selecting an appropriate copula within the power variance copula family.
