Similar Articles
20 similar articles found.
1.
In this article, a novel hybrid method for forecasting stock prices is proposed. The method is based on the wavelet transform, wavelet denoising, linear models (the autoregressive integrated moving average (ARIMA) model and the exponential smoothing (ES) model), and nonlinear models (BP and RBF neural networks). The wavelet transform provides a set of constitutive series that are better behaved for prediction than the raw stock series, and wavelet denoising eliminates slight random fluctuations in the stock series. The ARIMA and ES models forecast the linear component of the denoised stock series, and the BP and RBF neural networks then serve as nonlinear pattern-recognition tools that correct the estimation errors of the linear models' predictions. The proposed method is examined on the Shanghai and Shenzhen stock markets, and the results are compared with some of the most recent stock price forecasting methods. The results show that the proposed hybrid method considerably improves forecasting accuracy. The method can also be applied to analyzing and forecasting the reliability of products or systems, improving the accuracy of reliability engineering.
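A minimal Python sketch of this pipeline, assuming simulated prices, a db4 wavelet with a universal soft threshold, an ARIMA(2,1,1) linear stage, and a small MLP as the error-correcting network; these settings are illustrative, not the paper's configuration:

```python
import numpy as np
import pywt
from statsmodels.tsa.arima.model import ARIMA
from sklearn.neural_network import MLPRegressor

def wavelet_denoise(x, wavelet="db4", level=3):
    """Soft-threshold detail coefficients with the universal threshold."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745   # noise scale from finest details
    thr = sigma * np.sqrt(2 * np.log(len(x)))
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(x)]

rng = np.random.default_rng(0)
price = np.cumsum(rng.normal(0, 1, 500)) + 100       # stand-in for a stock series

smooth = wavelet_denoise(price)
arima = ARIMA(smooth, order=(2, 1, 1)).fit()         # linear component
linear_fit = arima.predict(start=1, end=len(smooth) - 1)
resid = smooth[1:] - linear_fit                      # nonlinear component

# Lagged residuals feed the error-correcting neural network.
lags = 5
X = np.column_stack([resid[i : len(resid) - lags + i] for i in range(lags)])
y = resid[lags:]
net = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0).fit(X, y)

corrected = linear_fit[lags:] + net.predict(X)       # linear forecast + NN correction
```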

2.
Using wavelets for data generation
Wavelets are proposed as a non-parametric data generation tool. The idea behind the suggested method is to decompose the data into its details and later reconstruct it by randomly summing the details to generate new data. A Haar wavelet is used because of its simplicity. The method is applied to annual and monthly streamflow series from Turkey and the USA. It is found to give good results for non-skewed data, as well as in the presence of autocorrelation.
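A minimal sketch of the idea with a Haar wavelet: decompose, randomise the details, reconstruct. Permuting the detail coefficients within each level is one simple way to realise the random recombination the abstract describes; the abstract does not prescribe the exact randomisation.

```python
import numpy as np
import pywt

rng = np.random.default_rng(1)
flow = rng.lognormal(mean=3.0, sigma=0.4, size=256)   # stand-in for a streamflow series

coeffs = pywt.wavedec(flow, "haar", level=4)          # [cA4, cD4, cD3, cD2, cD1]
new = [coeffs[0]] + [rng.permutation(d) for d in coeffs[1:]]  # shuffle each detail level
synthetic = pywt.waverec(new, "haar")                 # new series, same detail content
```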

3.
This article shows how a non-decimated wavelet packet transform (NWPT) can be used to model a response time series, Y_t, in terms of an explanatory time series, X_t. The proposed computational technique transforms the explanatory time series into an NWPT representation and then uses standard statistical modelling methods to identify which wavelet packets are useful for modelling the response time series. We exhibit S-Plus functions from the freeware WaveThresh package that implement our methodology. The proposed modelling methodology is applied to an important problem from the wind energy industry: how to model wind speed at a target location using wind speed and direction from a reference location. Our method improves on existing target-site wind speed predictions produced by widely used industry-standard techniques. More importantly, our NWPT representation produces models that admit physical and scientific interpretations and, in the wind example, enable us to understand more about the transfer of wind energy from site to site.

4.
Wavelet shrinkage is an effective nonparametric regression technique, especially when the underlying curve has irregular features such as spikes or discontinuities. The basic idea is simple: take the discrete wavelet transform of data consisting of a signal corrupted by noise; shrink or remove the wavelet coefficients to remove the noise; then invert the discrete wavelet transform to form an estimate of the true underlying curve. Various researchers have proposed increasingly sophisticated methods of doing this by using real-valued wavelets. Complex-valued wavelets exist but are rarely used. We propose two new complex-valued wavelet shrinkage techniques: one based on multiwavelet style shrinkage and the other using Bayesian methods. Extensive simulations show that our methods almost always give significantly more accurate estimates than methods based on real-valued wavelets. Further, our multiwavelet style shrinkage method is both simpler and dramatically faster than its competitors. To understand the excellent performance of this method we present a new risk bound on its hard thresholded coefficients.
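A minimal sketch of the basic shrinkage recipe the abstract opens with (forward DWT, threshold, inverse DWT), using a real-valued db2 wavelet and the universal hard threshold as illustrative choices; the paper's complex-valued contribution is not reproduced here:

```python
import numpy as np
import pywt

rng = np.random.default_rng(2)
t = np.linspace(0, 1, 512)
signal = np.piecewise(t, [t < 0.4, t >= 0.4], [0.0, 1.0])   # a discontinuity
noisy = signal + rng.normal(0, 0.1, t.size)

coeffs = pywt.wavedec(noisy, "db2")
sigma = np.median(np.abs(coeffs[-1])) / 0.6745              # MAD noise estimate
thr = sigma * np.sqrt(2 * np.log(noisy.size))               # universal threshold
coeffs[1:] = [pywt.threshold(c, thr, mode="hard") for c in coeffs[1:]]
estimate = pywt.waverec(coeffs, "db2")                      # denoised curve estimate
```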

5.
In this investigation, extracted features of signals were analyzed for the recognition of arm movements. The short-time Fourier transform and the wavelet transform, compared using Euclidean distance, were applied to the recorded signals. The results show that the wavelet is the more useful and powerful tool for analyzing the signals, since its multiresolution property significantly reduces the computation time needed to eliminate resolution problems. Finally, a repeated factorial analysis of variance was applied to the experimentally recorded data to investigate class separability across multiple motions, with the aim of establishing the surface electromyogram–muscular force relationship.

6.
An improved forecasting model that merges two computational models for predicting future volatility is proposed. The model integrates wavelets with the EGARCH model: a pre-processing step based on the wavelet transform applies a denoising technique to eliminate noise in the observed signal. The denoised signal is then fed into the EGARCH model to forecast volatility. The predictive capability of the proposed model is compared with that of the existing EGARCH model. The results show that the hybrid model increases the accuracy of forecasting future volatility.
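A minimal sketch of the pipeline: wavelet-denoise a return series, then fit an EGARCH model with the `arch` package. The simulated returns and the db4/universal-threshold settings are assumptions for illustration, not the paper's configuration:

```python
import numpy as np
import pywt
from arch import arch_model

rng = np.random.default_rng(3)
returns = rng.standard_t(df=5, size=1000) * 0.01             # stand-in return series

# Wavelet denoising of the observed signal.
coeffs = pywt.wavedec(returns, "db4", level=3)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745
thr = sigma * np.sqrt(2 * np.log(returns.size))
coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, "db4")[: returns.size]

# Fit EGARCH(1,1) on the denoised returns (rescaled for optimizer stability).
model = arch_model(denoised * 100, vol="EGARCH", p=1, o=1, q=1)
fit = model.fit(disp="off")
forecast = fit.forecast(horizon=5).variance                  # 5-step volatility forecast
```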

7.
Abstract: The launch of Chinese stock index futures is imminent; while it gives traders a new investment instrument, it also brings new risks. Building accurate financial time series forecasting models is one way to pursue profit and hedge risk, and it has long been a focus of academic research. This study combines the wavelet transform with support vector regression (SVR) to propose a two-stage time series forecasting model. First, a discrete wavelet frame decomposes the predictor variable into multiple subseries at different scales, revealing information hidden within it; SVR models are then built using these subseries as predictors. An empirical study using the Nikkei 225 opening price as the forecast target and the futures opening price as the predictor shows that the model's forecasting performance is better than that of a pure SVR model and a random walk model. Future work could explore different basis functions.
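A minimal sketch of the two-stage idea, using the stationary wavelet transform as a stand-in for the discrete wavelet frame (it keeps every subseries at the original length, so subseries values align with time points); the simulated data and SVR settings are assumptions:

```python
import numpy as np
import pywt
from sklearn.svm import SVR

rng = np.random.default_rng(4)
futures_open = np.cumsum(rng.normal(0, 1, 512)) + 100   # stand-in predictor series
index_open = futures_open + rng.normal(0, 0.5, 512)     # stand-in forecast target

# Stage 1: decompose the predictor into multi-scale subseries.
coeffs = pywt.swt(futures_open, "db2", level=3)         # [(cA3, cD3), (cA2, cD2), (cA1, cD1)]
subseries = [coeffs[0][0]] + [d for _, d in coeffs]     # coarse approximation + details

# Stage 2: build an SVR model with the subseries as predictors.
X = np.column_stack(subseries)[:-1]                     # today's subseries values ...
y = index_open[1:]                                      # ... predict tomorrow's open
model = SVR(C=10.0, epsilon=0.01).fit(X, y)
next_open = model.predict(X[-1:])                       # one-step-ahead forecast
```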

8.
Wavelet shrinkage for unequally spaced data
Wavelet shrinkage (WaveShrink) is a relatively new technique for nonparametric function estimation that has been shown to have asymptotic near-optimality properties over a wide class of functions. As originally formulated by Donoho and Johnstone, WaveShrink assumes equally spaced data. Because so many statistical applications (e.g., scatterplot smoothing) naturally involve unequally spaced data, we investigate in this paper how WaveShrink can be adapted to handle such data. Focusing on the Haar wavelet, we propose four approaches that extend the Haar wavelet transform to the unequally spaced case. Each approach is formulated in terms of continuous wavelet basis functions applied to a piecewise constant interpolation of the observed data, and each approach leads to wavelet coefficients that can be computed via a matrix transform of the original data. For each approach, we propose a practical way of adapting WaveShrink. We compare the four approaches in a Monte Carlo study and find them to be quite comparable in performance. The computationally simplest approach (isometric wavelets) has an appealing justification in terms of a weighted mean square error criterion and readily generalizes to wavelets of higher order than the Haar.
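A minimal sketch of one way to realise this construction (not necessarily any of the paper's four approaches): treat the data as piecewise constant between unequally spaced design points and compute continuous Haar wavelet coefficients as exact integrals, i.e. as a matrix transform of the observations. The design points and the three-level depth are illustrative assumptions:

```python
import numpy as np

def haar_overlap(a, b, j, k):
    """Integral of the continuous Haar wavelet psi_{j,k} over [a, b]."""
    lo, mid, hi = k / 2**j, (k + 0.5) / 2**j, (k + 1) / 2**j
    pos = max(0.0, min(b, mid) - max(a, lo))      # overlap with the +1 half
    neg = max(0.0, min(b, hi) - max(a, mid))      # overlap with the -1 half
    return 2 ** (j / 2) * (pos - neg)

rng = np.random.default_rng(5)
t = np.sort(rng.uniform(0, 1, 32))                # unequally spaced design on [0, 1)
y = np.sin(2 * np.pi * t) + rng.normal(0, 0.1, 32)
edges = np.append(t, 1.0)                         # observation i covers [t_i, t_{i+1})

rows = [(j, k) for j in range(3) for k in range(2**j)]
W = np.array([[haar_overlap(edges[i], edges[i + 1], j, k)
               for i in range(len(y))] for (j, k) in rows])
d = W @ y                                         # wavelet coefficients of the interpolant
```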

9.
Some quality characteristics are well defined when treated as response variables whose relationship to a set of independent variables is identified. This relationship is called a profile. Parametric models, such as linear models, may be used to model profiles. However, owing to the complexity of many processes in practical applications, parametric models are often inappropriate, and nonparametric methods are used instead. One of the most widely applicable nonparametric methods for modelling complicated profiles is the wavelet transform. Many authors have considered the wavelet transform only for monitoring processes in phase II; the problem of estimating the in-control profile in phase I using the wavelet transform has not been deeply addressed. Classical estimators are usually used in phase I to estimate in-control profiles, even when the wavelet transform is used. These estimators are suitable if the data contain no outliers, but when outliers exist they cannot estimate the in-control profile properly. In this research, a robust method of estimating in-control profiles is proposed that is insensitive to the presence of outliers and can be applied when the wavelet transform is used. The proposed estimator combines robust clustering with the S-estimator, and it is compared with the classical estimator of the in-control profile in the presence of outliers. The results of a large simulation study show that the proposed method estimates the in-control profile precisely when the data are contaminated either locally or globally.

10.
Discrimination measures are well developed for stationary time series; however, a large number of phenomena involve long-term dependencies. In this article, we deal with the discrimination of fractionally integrated models. The Kullback–Leibler and Chernoff discrimination measures are approximated using the discrete wavelet transform (DWT) to discriminate between these classes of time series. A simulation study indicates a low misclassification rate for the approximated Kullback–Leibler and Chernoff discrimination measures. An application to the problem of classifying seismic data shows that our procedure performs as well as other procedures.

11.
Locally stationary wavelet (LSW) processes, built on non-decimated wavelets, can be used to analyse and forecast non-stationary time series, and they have proved useful in the analysis of financial data. In this paper, we first carry out a sensitivity analysis and then propose some practical guidelines for choosing the wavelet bases for these processes. The existing forecasting algorithm is found to be vulnerable to outliers, and a new algorithm is proposed to overcome this weakness. The new algorithm is shown to be stable and outperforms the existing algorithm when applied to real financial data. The volatility forecasting ability of LSW modelling based on our new algorithm is then discussed and shown to be competitive with traditional GARCH models.

12.
Wavelets are a commonly used tool in science and technology. Often, their use involves applying a wavelet transform to the data, thresholding the coefficients and applying the inverse transform to obtain an estimate of the desired quantities. In this paper, we argue that it is often possible to gain more insight into the data by producing not just one, but many wavelet reconstructions using a range of threshold values and analysing the resulting object, which we term the Time–Threshold Map (TTM) of the input data. We discuss elementary properties of the TTM, in its “basic” and “derivative” versions, using both Haar and Unbalanced Haar wavelet families. We then show how the TTM can help in solving two statistical problems in the signal + noise model: breakpoint detection, and estimating the longest interval of approximate stationarity. We illustrate both applications with examples involving volatility of financial returns. We also briefly discuss other possible uses of the TTM.
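A minimal sketch of a basic TTM, assuming a Haar wavelet, hard thresholding, and a simulated series with a volatility shift; the threshold grid and test signal are illustrative choices:

```python
import numpy as np
import pywt

rng = np.random.default_rng(6)
x = np.concatenate([rng.normal(0, 1, 256), rng.normal(0, 3, 256)])  # volatility shift

coeffs = pywt.wavedec(x, "haar")
thresholds = np.linspace(0, np.max(np.abs(np.concatenate(coeffs[1:]))), 50)

# One reconstruction per threshold value, stacked into a (threshold x time) array.
ttm = np.empty((thresholds.size, x.size))
for r, thr in enumerate(thresholds):
    shrunk = [coeffs[0]] + [pywt.threshold(c, thr, mode="hard") for c in coeffs[1:]]
    ttm[r] = pywt.waverec(shrunk, "haar")
# Rows of `ttm` range from the raw series (thr = 0) down to the coarse trend.
```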

13.
Classical nondecimated wavelet transforms are attractive for many applications. When the data come from complex or irregular designs, the use of second-generation wavelets in nonparametric regression has proved superior to that of classical wavelets. However, the construction of a nondecimated second-generation wavelet transform is not obvious. In this paper we propose a new ‘nondecimated’ lifting transform, based on the lifting algorithm that removes one coefficient at a time, and explore its behaviour. Our approach also allows adaptivity to be embedded in the transform, i.e. wavelet functions can be constructed whose smoothness adjusts to the local properties of the signal. We address the problem of nonparametric regression and propose an (averaged) estimator obtained by teaming our nondecimated lifting technique with empirical Bayes shrinkage. Simulations show that our proposed method outperforms competing techniques that can work on irregular data. Our construction also opens avenues for generating a ‘best’ representation, which we shall explore.

14.
In this paper, we develop an efficient wavelet-based regularized linear quantile regression framework for coefficient estimation, where the responses are scalars and the predictors include both scalars and functions. The framework consists of two important parts: wavelet transformation and regularized linear quantile regression. The wavelet transform approximates functional data by representing it with a finite number of wavelet coefficients, effectively capturing its local features. Quantile regression is robust to response outliers and heavy-tailed errors and, compared with other methods, provides a more complete picture of how responses change conditional on covariates. Meanwhile, regularization removes small wavelet coefficients to achieve sparsity and efficiency. An algorithm based on the Alternating Direction Method of Multipliers (ADMM) is derived to solve the optimization problems. We conduct numerical studies to investigate the finite-sample performance of our method and apply it to real data from ADHD studies.
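A minimal sketch of the framework's two parts: compress a functional predictor into wavelet coefficients, then fit an L1-regularized linear quantile regression on those coefficients. scikit-learn's QuantileRegressor (linear-programming based) stands in for the paper's ADMM solver, and the simulated data and settings are assumptions:

```python
import numpy as np
import pywt
from sklearn.linear_model import QuantileRegressor

rng = np.random.default_rng(7)
n, m = 200, 128                                    # subjects x grid points per curve
curves = np.cumsum(rng.normal(0, 1, (n, m)), axis=1)       # functional predictor
scalar = rng.normal(size=n)                                 # scalar predictor

# Wavelet-transform each curve; keep all coefficients and let the L1
# penalty decide which ones survive (sparsity).
wave = np.array([np.concatenate(pywt.wavedec(c, "db3")) for c in curves])
X = np.column_stack([scalar, wave])
y = 2 * scalar + curves[:, 40] + rng.standard_t(df=3, size=n)  # heavy-tailed errors

model = QuantileRegressor(quantile=0.5, alpha=0.1, solver="highs").fit(X, y)
n_active = np.sum(model.coef_ != 0)                # regularization induces sparsity
```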

15.
Single-cohort stage-frequency data are considered when the stage reached by individuals is assessed through destructive sampling. For this type of data, when all hazard rates are assumed constant and equal, Laplace transform methods have been applied in the past to estimate the parameters of each stage-duration distribution and the overall hazard rate. If the hazard rates are not all equal, estimating stage-duration parameters using Laplace transform methods becomes complex. In this paper, two new models are proposed for estimating stage-dependent maturation parameters using Laplace transform methods where non-trivial hazard rates apply. The first model encompasses hazard rates that are constant within each stage but vary between stages; the second encompasses time-dependent hazard rates within stages. Moreover, this paper introduces a method for estimating the hazard rate in each stage under the stage-wise constant hazard rates model. This work presents methods that could be used in specific types of laboratory studies, but the main motivation is to explore the relationships between stage maturation parameters that, in future work, could be exploited by Bayesian approaches. The application of the methodology under each model is evaluated using simulated data in order to illustrate the structure of these models.

16.
The diffusion process is a widely used statistical model for many natural dynamic phenomena, but its inference is complicated because complete data describing the diffusion sample path are not necessarily available. In addition, data are often collected with substantial uncertainty, and missing observations are not uncommon. Thus, the observed process is discrete over a finite time period, and the marginal likelihood given by such discrete data is not always available. In this paper, we consider a class of nonstationary diffusion process models with both measurement error and discretely time-varying parameters, the latter modeled via a state space model. Hierarchical Bayesian inference for such a diffusion process model with time-varying parameters is applied to financial data.

17.
Multi-stage time-evolving models are common statistical models for biological systems, especially insect populations. In stage-duration distribution models, parameter estimation typically uses the Laplace transform method. This method involves assumptions such as known constant shapes, known constant rates, or the same overall hazard rate for all stages; these assumptions are strong and restrictive. The main aim of this paper is to weaken them by using a Bayesian approach. In particular, a Metropolis-Hastings algorithm based on deterministic transformations is used to estimate the parameters. We use two models: one with no hazard rates, and one with stage-wise constant hazard rates. These methods are validated in simulation studies, followed by a case study of cattle parasites. The results show that the proposed methods estimate the parameters comparably well to the Laplace transform methods.

18.
We introduce a new goodness-of-fit test that can be applied to hypothesis testing about the marginal distribution of dependent data. We derive a new test for the equivalent hypothesis in the space of wavelet coefficients. Properties of the wavelet transform such as orthogonality, localisation, and sparsity make hypothesis testing easier in the wavelet domain than in the domain of distribution functions. We propose testing the null hypothesis separately at each wavelet decomposition level, both to overcome the bi-dimensionality of the wavelet indices and, when the null hypothesis is rejected, to identify the frequency at which the empirical distribution function differs from the null. We suggest a test statistic and state its asymptotic distribution under the null and under some alternative hypotheses.

19.
Long memory has been widely documented for realized financial market volatility. As a novelty, we consider daily realized asset correlations and investigate whether the observed persistence is (i) due to true long memory (i.e. fractional integration) or (ii) artificially generated by structural break processes. These two phenomena are difficult to distinguish in practice. Our empirical results strongly indicate that the hyperbolic decay of the autocorrelation functions of pairwise realized correlation series is not driven by a truly fractionally integrated process. This finding is robust to user-specific parameter choices in the applied test statistic and holds for all 15 time series considered. As a next step, we apply simple models with deterministic level shifts. When selecting the number of breaks, estimating the breakpoints, and fitting the corresponding structural break models, we find a substantial degree of co-movement between the realized correlation series, hinting at co-breaking. The estimated structural break models are interpreted in the light of historic economic and financial developments.

20.
Distance sampling and capture–recapture are the two most widely used wildlife abundance estimation methods. Capture–recapture methods have only recently incorporated models for spatial distribution, and there is an increasing tendency for distance sampling methods to incorporate spatial models rather than rely on partly design-based spatial inference. In this overview we show how spatial models are central to modern distance sampling and that spatial capture–recapture models arise as an extension of distance sampling methods. Depending on the type of data recorded, they can be viewed as particular kinds of hierarchical binary regression, Poisson regression, survival, or time-to-event models, with individuals’ locations as latent variables and a spatial model as the latent-variable distribution. Incorporating spatial models into these two methods provides new opportunities for drawing explicitly spatial inferences. Areas of likely future development include more sophisticated spatial and spatio-temporal modelling of individuals’ locations and movements, new methods for integrating spatial capture–recapture and other kinds of ecological survey data, and methods for dealing with the recapture uncertainty that often arises when “capture” consists of detection by a remote device like a camera trap or microphone.
