Similar Articles
Found 20 similar articles (search time: 953 ms)
1.
The polar plumes are very fine structures of the solar K-corona lying around the poles and visible during periods of minimum activity. These poorly known structures are linked with the solar magnetic field and with numerous coronal phenomena such as the fast solar wind and the coronal holes. The SOHO space mission has provided continuous observations at high cadence (one every 10 min). From these observations, images of the K-corona have been derived and preprocessed with an adapted anisotropic filtering. Then, a particular type of sinogram called the Time Intensity Diagram (TID) has been built, adapted to the evolution of polar plumes over time. A multiresolution wavelet analysis of the TID then revealed that the spatial distribution of the polar plumes, as well as their temporal evolution, is fractal. The present study consists of simulating polar plumes by forward modeling techniques in order to validate several assumptions concerning their nature and temporal evolution. Our work involves two main steps. The first concerns the simulation of polar plumes and the computation of their corresponding TIDs. The second concerns the estimation of analysis criteria for comparing the original TID with the simulated ones. Static and dynamic models were both used to confirm the fractal behavior of the temporal evolution of plumes. The most recent and promising model is based on a Hidden Markov Tree, which allows us to control the fractal parameters of the TID.

2.
Statistical inference in the wavelet domain remains a vibrant area of contemporary statistical research because of the desirable properties of wavelet representations and the need of the scientific community to process, explore, and summarize massive data sets. Prime examples are biomedical, geophysical, and internet-related data. We propose two new approaches to wavelet shrinkage/thresholding.

In the spirit of Efron and Tibshirani's recent work on local false discovery rate, we propose Bayesian Local False Discovery Rate (BLFDR), where the underlying model on wavelet coefficients does not assume known variances. This approach to wavelet shrinkage is shown to be connected with shrinkage based on Bayes factors. The second proposal, Bayesian False Discovery Rate (BaFDR), is based on the ordering of posterior probabilities of the hypotheses that the true wavelet coefficients are null, in Bayesian testing of multiple hypotheses.

We demonstrate that both approaches result in competitive shrinkage methods by contrasting them to some popular shrinkage techniques.
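The BaFDR idea of ordering posterior null probabilities can be illustrated with a small sketch. This is a generic Bayesian FDR selection rule applied to assumed posterior probabilities, not the authors' exact procedure:

```python
import numpy as np

def bafdr_select(p_null, q=0.05):
    # Order posterior null probabilities; keep coefficients while the
    # running mean (an estimate of the Bayesian FDR) stays at or below q.
    order = np.argsort(p_null)
    running = np.cumsum(p_null[order]) / np.arange(1, len(p_null) + 1)
    k = int(np.sum(running <= q))
    keep = np.zeros(len(p_null), dtype=bool)
    keep[order[:k]] = True
    return keep

# five coefficients: indices 0, 1, 4 look non-null, 2 and 3 look null
p_null = np.array([0.01, 0.02, 0.90, 0.80, 0.03])
keep = bafdr_select(p_null, q=0.05)
```

Ordering makes the running mean an estimate of the false discovery rate incurred by keeping the first k coefficients; the cut at q controls it.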

3.
Summary.  Wavelet shrinkage is an effective nonparametric regression technique, especially when the underlying curve has irregular features such as spikes or discontinuities. The basic idea is simple: take the discrete wavelet transform of data consisting of a signal corrupted by noise; shrink or remove the wavelet coefficients to remove the noise; then invert the discrete wavelet transform to form an estimate of the true underlying curve. Various researchers have proposed increasingly sophisticated methods of doing this by using real-valued wavelets. Complex-valued wavelets exist but are rarely used. We propose two new complex-valued wavelet shrinkage techniques: one based on multiwavelet style shrinkage and the other using Bayesian methods. Extensive simulations show that our methods almost always give significantly more accurate estimates than methods based on real-valued wavelets. Further, our multiwavelet style shrinkage method is both simpler and dramatically faster than its competitors. To understand the excellent performance of this method we present a new risk bound on its hard thresholded coefficients.
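The transform-shrink-invert recipe can be sketched with a hand-rolled one-level Haar transform and soft thresholding. This is a minimal real-valued illustration using the universal threshold; real implementations use multiple levels and richer wavelet families:

```python
import numpy as np

def haar_dwt(x):
    # one level of the orthonormal Haar discrete wavelet transform
    s = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # detail coefficients
    return s, d

def haar_idwt(s, d):
    x = np.empty(2 * len(s))
    x[0::2] = (s + d) / np.sqrt(2.0)
    x[1::2] = (s - d) / np.sqrt(2.0)
    return x

def soft(w, lam):
    # soft threshold: shrink toward zero, kill anything below lam
    return np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)

rng = np.random.default_rng(0)
truth = np.repeat([0.0, 5.0, 0.0, -3.0], 64)        # piecewise-constant curve
noisy = truth + rng.normal(0.0, 1.0, truth.size)

s, d = haar_dwt(noisy)
lam = np.sqrt(2.0 * np.log(truth.size))             # universal threshold
est = haar_idwt(s, soft(d, lam))

mse_noisy = np.mean((noisy - truth) ** 2)
mse_est = np.mean((est - truth) ** 2)
```

On a piecewise-constant signal the true detail coefficients are essentially zero, so thresholding removes mostly noise and the reconstruction error drops.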

4.
Classical nondecimated wavelet transforms are attractive for many applications. When the data comes from complex or irregular designs, the use of second generation wavelets in nonparametric regression has proved superior to that of classical wavelets. However, the construction of a nondecimated second generation wavelet transform is not obvious. In this paper we propose a new ‘nondecimated’ lifting transform, based on the lifting algorithm which removes one coefficient at a time, and explore its behavior. Our approach also allows for embedding adaptivity in the transform, i.e. wavelet functions can be constructed such that their smoothness adjusts to the local properties of the signal. We address the problem of nonparametric regression and propose an (averaged) estimator obtained by using our nondecimated lifting technique teamed with empirical Bayes shrinkage. Simulations show that our proposed method has higher performance than competing techniques able to work on irregular data. Our construction also opens avenues for generating a ‘best’ representation, which we shall explore.
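For reference, the classical (decimated) Haar transform written as lifting steps — split, predict, update — looks like this; the 'nondecimated' one-coefficient-at-a-time lifting in the paper generalizes this pattern, which is not reproduced here:

```python
import numpy as np

def haar_lifting(x):
    # Haar transform as lifting steps: split into even/odd samples,
    # predict the odd ones from the even ones, update to preserve means.
    even, odd = x[0::2], x[1::2]
    d = odd - even          # predict step: detail coefficients
    s = even + d / 2.0      # update step: scaling coefficients (pair means)
    return s, d

def haar_lifting_inv(s, d):
    even = s - d / 2.0      # undo update
    odd = d + even          # undo predict
    x = np.empty(2 * len(s))
    x[0::2], x[1::2] = even, odd
    return x

x = np.arange(8.0)
s, d = haar_lifting(x)
xr = haar_lifting_inv(s, d)
```

A key appeal of lifting is that inversion simply runs the same steps backwards with signs flipped, so perfect reconstruction is built in.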

5.
In this paper, the problem of estimating the mean vector under non-negative constraints on the location vector of the multivariate normal distribution is investigated. The value of the wavelet threshold based on Stein's unbiased risk estimator is calculated for the shrinkage estimator in the restricted parameter space. We suppose that the covariance matrix is unknown, and we find the dominant class of shrinkage estimators under the balanced loss function. The performance of the proposed class of estimators is evaluated through a simulation study using risk and average mean square error values.
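A minimal sketch of choosing a soft threshold by minimising Stein's unbiased risk estimate (SURE) over the observed coefficient magnitudes, assuming unit noise variance; the paper's restricted-parameter-space and unknown-covariance setting is not reproduced here:

```python
import numpy as np

def sure_threshold(w):
    # Minimise SURE(lam) = n - 2*#{|w_i| <= lam} + sum(min(w_i^2, lam^2))
    # over candidate thresholds lam = |w_(k)| (unit noise variance).
    n = len(w)
    w2 = np.sort(w ** 2)
    cum = np.cumsum(w2)
    k = np.arange(1, n + 1)
    risks = n - 2 * k + cum + (n - k) * w2
    return np.sqrt(w2[np.argmin(risks)])

rng = np.random.default_rng(0)
w = rng.normal(size=256)
w[:5] += 8.0                      # a few large "true" coefficients
lam = sure_threshold(w)
```

Because SURE is piecewise quadratic in the threshold, checking it only at the observed magnitudes suffices to find the minimiser.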

6.
We can use wavelet shrinkage to estimate a possibly multivariate regression function g under the general regression setup, y = g + ε. We propose an enhanced wavelet-based denoising methodology based on Bayesian adaptive multiresolution shrinkage, an effective Bayesian shrinkage rule combined with a semi-supervised learning mechanism. The Bayesian shrinkage rule is advanced by utilizing a semi-supervised learning method in which the neighboring structure of a wavelet coefficient is adopted and an appropriate decision function is derived. According to the decision function, wavelet coefficients follow one of two prespecified Bayesian rules obtained using varying related parameters. The decision for a wavelet coefficient depends not only on its magnitude, but also on the neighboring structure in which the coefficient is located. We discuss the theoretical properties of the suggested method and provide recommended parameter settings. We show that the proposed method is often superior to several existing wavelet denoising methods through extensive experimentation.
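The idea that a coefficient's fate depends on its neighbours as well as its own magnitude can be sketched with a neighbouring-coefficient shrinkage rule, in the spirit of NeighCoeff-type rules; the decision function and parameters here are illustrative, not the authors' semi-supervised rule:

```python
import numpy as np

def neigh_shrink(d, lam):
    # Shrink each detail coefficient by a factor driven by the energy of
    # the coefficient and its two immediate neighbours (edges replicated).
    padded = np.concatenate([d[:1], d, d[-1:]])
    s2 = padded[:-2] ** 2 + padded[1:-1] ** 2 + padded[2:] ** 2
    factor = np.maximum(1.0 - lam ** 2 / s2, 0.0)
    return factor * d

d = np.array([0.1, 5.0, 4.0, 0.1, 0.05])
out = neigh_shrink(d, lam=2.0)
```

A moderate coefficient sitting next to a large one is retained (its neighbourhood energy is high), while an isolated small coefficient is set to zero.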

7.
Fractals and their dimensions have been a subject of great mathematical interest since the publication of Mandelbrot's manifestoes (1977, 1982). This paper discusses some empirical results indicating the potential usefulness of the estimated fractal dimension in testing for white noise. These tests are applied to model identification in time series, and results for previously analyzed data are provided. A method for fractal interpolation of a continuous process from a finite number of observations is discussed, as well as some future research directions.
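A simple variogram-based estimator of fractal dimension shows why such an estimate can discriminate white noise from a correlated path; the lags and the estimator are illustrative, and the paper's exact test statistic may differ:

```python
import numpy as np

def fractal_dim(x, lags=(1, 2, 4, 8)):
    # Variogram method: gamma(k) ~ k^(2H) implies slope 2H on a log-log
    # plot, and the graph of the series has fractal dimension D = 2 - H.
    g = [np.mean((x[k:] - x[:-k]) ** 2) / 2.0 for k in lags]
    H = np.polyfit(np.log(lags), np.log(g), 1)[0] / 2.0
    return 2.0 - min(max(H, 0.0), 1.0)

rng = np.random.default_rng(1)
wn = rng.normal(size=4096)    # white noise: flat variogram, D near 2
bm = np.cumsum(wn)            # Brownian path: gamma(k) ~ k, D near 1.5
```

For white noise the variogram is flat in the lag, pushing the estimated dimension toward 2; a dimension well below 2 is evidence against the white-noise hypothesis.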

8.

In this article we derive the density and distribution functions of the stochastic shrinkage parameters of three well-known operational Ridge Regression (RR) estimators by assuming normality. The stochastic behavior of these parameters is likely to affect the properties of the resulting RR estimator; therefore such knowledge can be useful in the selection of the shrinkage rule. Some numerical calculations are carried out to illustrate the behavior of these distributions, throwing light on the performance of the different RR estimators.
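One such operational rule is the Hoerl-Kennard-Baldwin estimator k = p·σ̂²/(β̂'β̂), whose shrinkage parameter is stochastic because it depends on the data; the simulated design below is illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 100, 5
X = rng.normal(size=(n, p))
beta = np.array([1.0, 0.5, -0.5, 2.0, 0.0])
y = X @ beta + rng.normal(0.0, 1.0, n)

# OLS fit, then the Hoerl-Kennard-Baldwin operational shrinkage parameter,
# which is random because it depends on sigma2_hat and beta_hat
b_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
sigma2 = np.sum((y - X @ b_ols) ** 2) / (n - p)
k = p * sigma2 / (b_ols @ b_ols)

# ridge estimator with the data-driven k
b_ridge = np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)
```

Since k > 0 whenever the fit is imperfect, the ridge solution always has a smaller norm than the OLS solution; how much smaller varies from sample to sample, which is exactly the stochastic behavior the article studies.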

9.
Wavelet shrinkage for unequally spaced data
Wavelet shrinkage (WaveShrink) is a relatively new technique for nonparametric function estimation that has been shown to have asymptotic near-optimality properties over a wide class of functions. As originally formulated by Donoho and Johnstone, WaveShrink assumes equally spaced data. Because so many statistical applications (e.g., scatterplot smoothing) naturally involve unequally spaced data, we investigate in this paper how WaveShrink can be adapted to handle such data. Focusing on the Haar wavelet, we propose four approaches that extend the Haar wavelet transform to the unequally spaced case. Each approach is formulated in terms of continuous wavelet basis functions applied to a piecewise constant interpolation of the observed data, and each approach leads to wavelet coefficients that can be computed via a matrix transform of the original data. For each approach, we propose a practical way of adapting WaveShrink. We compare the four approaches in a Monte Carlo study and find them to be quite comparable in performance. The computationally simplest approach (isometric wavelets) has an appealing justification in terms of a weighted mean square error criterion and readily generalizes to wavelets of higher order than the Haar.

10.
We develop fractal methodology for data taking the form of surfaces. An advantage of fractal analysis is that it partitions roughness characteristics of a surface into a scale-free component (fractal dimension) and properties that depend purely on scale. Particular emphasis is given to anisotropy where we show that, for many surfaces, the fractal dimension of line transects across a surface must either be constant in every direction or be constant in each direction except one. This virtual direction invariance of fractal dimension provides another canonical feature of fractal analysis, complementing its scale invariance properties and enhancing its attractiveness as a method for summarizing properties of roughness. The dependence of roughness on direction may be explained in terms of scale rather than dimension and can vary with orientation. Scale may be described by a smooth periodic function and may be estimated nonparametrically. Our results and techniques are applied to analyse data on the surfaces of soil and plastic food wrapping. For the soil data, interest centres on the effect of surface roughness on retention of rain-water, and data are recorded as a series of digital images over time. Our analysis captures the way in which both the fractal dimension and the scale change with rainfall, or equivalently with time. The food wrapping data are on a much finer scale than the soil data and are particularly anisotropic. The analysis allows us to determine the manufacturing process which produces the smoothest wrapping, with least tendency for micro-organisms to adhere.

11.
A wavelet method is proposed for recovering damaged images. The proposed method combines wavelet shrinkage with preprocessing based on a binning process and an imputation procedure that is designed to extend the scope of wavelet shrinkage to data with missing values and perturbed locations. The proposed algorithm, termed the BTW algorithm, is simple to implement and efficient for recovering an image. Furthermore, this algorithm can be easily applied to wavelet regression for one-dimensional (1-D) signal estimation with irregularly spaced data. Simulation studies and real examples show that the proposed method can produce highly effective results.
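The binning-plus-imputation preprocessing step can be sketched as follows; a simple nearest-bin fill stands in for the paper's imputation procedure, and the grid size and data are illustrative:

```python
import numpy as np

def bin_to_grid(x, y, J=6):
    # Average irregularly spaced samples (x in [0, 1)) into 2**J dyadic
    # bins, then fill empty bins from a neighbouring filled bin so that
    # a standard dyadic wavelet transform can be applied afterwards.
    n_bins = 2 ** J
    idx = np.minimum((x * n_bins).astype(int), n_bins - 1)
    grid = np.full(n_bins, np.nan)
    for b in range(n_bins):
        hit = idx == b
        if hit.any():
            grid[b] = y[hit].mean()
    for b in range(1, n_bins):               # forward fill empty bins
        if np.isnan(grid[b]):
            grid[b] = grid[b - 1]
    for b in range(n_bins - 2, -1, -1):      # backward fill any leading gaps
        if np.isnan(grid[b]):
            grid[b] = grid[b + 1]
    return grid

rng = np.random.default_rng(7)
x = rng.uniform(0.0, 1.0, 200)
y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.1, 200)
g = bin_to_grid(x, y)
```

After this step the data sit on an equally spaced dyadic grid with no missing cells, so standard wavelet shrinkage applies directly.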

12.
This paper considers the problem of selecting a robust threshold for wavelet shrinkage. Previous approaches reported in the literature to handle the presence of outliers mainly focus on developing a robust procedure for a given threshold; this is related to solving a nontrivial optimization problem. The drawback of this approach is that the selection of a robust threshold, which is crucial for the resulting fit, is ignored. This paper points out that the best fit can be achieved by a robust wavelet shrinkage with a robust threshold. We propose data-driven selection methods for a robust threshold. These approaches are based on a coupling of classical wavelet thresholding rules with pseudo data. The concept of pseudo data has influenced the implementation of the proposed methods, and provides a fast and efficient algorithm. Results from a simulation study and a real example demonstrate the promising empirical properties of the proposed approaches.

13.
We introduce a new goodness-of-fit test which can be applied to hypothesis testing about the marginal distribution of dependent data. We derive a new test for the equivalent hypothesis in the space of wavelet coefficients. Such properties of the wavelet transform as orthogonality, localisation and sparsity make hypothesis testing in the wavelet domain easier than in the domain of distribution functions. We propose to test the null hypothesis separately at each wavelet decomposition level to overcome the problem of bi-dimensionality of wavelet indices and to be able to find the frequency where the empirical distribution function differs from the null in case the null hypothesis is rejected. We suggest a test statistic and state its asymptotic distribution under the null and under some of the alternative hypotheses.

14.
In recent years, wavelet shrinkage has become a very appealing method for data de-noising and density function estimation. In particular, Bayesian modelling via hierarchical priors has introduced novel approaches to wavelet analysis that have become very popular and are very competitive with standard hard or soft thresholding rules. In this vein, this paper proposes a hierarchical prior elicited on the model parameters describing the wavelet coefficients after applying a Discrete Wavelet Transformation (DWT). In contrast to other approaches, the prior is a multivariate normal distribution with a covariance matrix that allows for correlations among wavelet coefficients corresponding to the same level of detail. In addition, an extra scale parameter is incorporated that permits an additional level of shrinkage over the coefficients. The posterior distribution for this shrinkage procedure is not available in closed form, but it is easily sampled through Markov chain Monte Carlo (MCMC) methods. Applications to a set of test signals and two noisy signals are presented.

15.
Given that catastrophe losses have heavy-tailed distributions, we use the POT (peaks-over-threshold) extreme value model to estimate the marginal distributions of two insured objects, a bivariate copula function to capture the dependence between them, and Monte Carlo simulation to estimate the pure premium of catastrophe reinsurance. An empirical analysis of flood loss data shows that the Clayton copula captures the dependence structure between the two insured objects well, that the attachment point is a key driver of the pure premium, and that setting the attachment point at a conditional quantile is the better and more reasonable choice. The results provide a valuable reference for insurers developing catastrophe reinsurance covering multiple insured objects.
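The Monte Carlo pricing step can be sketched as follows, sampling a Clayton copula by the gamma-frailty construction and mapping to generalized Pareto (POT-style) marginals; the dependence parameter, GPD parameters, and attachment rule are illustrative, not the values fitted to the flood data:

```python
import numpy as np

rng = np.random.default_rng(3)
n, theta = 50_000, 2.0            # theta: Clayton dependence parameter

# bivariate Clayton copula via gamma frailty: U_i = (1 + E_i / W)^(-1/theta)
W = rng.gamma(1.0 / theta, 1.0, n)
E = rng.exponential(1.0, (n, 2))
U = (1.0 + E / W[:, None]) ** (-1.0 / theta)

# generalized Pareto loss marginals via the GPD quantile function
xi, beta_gpd = 0.3, 10.0
losses = beta_gpd / xi * ((1.0 - U) ** (-xi) - 1.0)

# pure premium of a stop-loss layer above a quantile attachment point
total = losses.sum(axis=1)
attach = np.quantile(total, 0.95)
pure_premium = np.mean(np.maximum(total - attach, 0.0))
```

The Clayton copula concentrates dependence in the lower tail of U, i.e. the upper tail of the losses here depends on how the marginals are attached; in practice the copula family, its parameter, and the GPD tails would all be estimated from the loss data.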

16.
This article introduces a fast cross-validation algorithm that performs wavelet shrinkage on data sets of arbitrary size and irregular design and also simultaneously selects good values of the primary resolution and number of vanishing moments. We demonstrate the utility of our method by suggesting alternative estimates of the conditional mean of the well-known Ethanol data set. Our alternative estimates outperform the Kovac-Silverman method with a global variance estimate by 25% because of the careful selection of the number of vanishing moments and primary resolution. Our alternative estimates are simpler than, and competitive with, results based on the Kovac-Silverman algorithm equipped with a local variance estimate. We include a detailed simulation study that illustrates how our cross-validation method successfully picks good values of the primary resolution and number of vanishing moments for unknown functions based on Walsh functions (to test the response to changing primary resolution) and piecewise polynomials with zero or one derivative (to test the response to function smoothness).

17.
Abstract.  A blockwise shrinkage is a popular adaptive procedure for non-parametric series estimates. It possesses an impressive range of asymptotic properties, and a vast pool of blocks and shrinkage procedures is in use. Traditionally these estimates are studied via upper bounds on their risks. This article suggests studying these adaptive estimates via non-asymptotic lower bounds established for a spike underlying function that plays a pivotal role in wavelet and minimax statistics. While upper-bound inequalities help the statistician to find sufficient conditions for desirable estimation, the non-asymptotic lower bounds yield necessary conditions and shed new light on the popular method of adaptation. The suggested method complements and knits together two traditional techniques used in the analysis of adaptive estimates: numerical study and asymptotic minimax inference.

18.
An empirical study of the fractal characteristics of the Shanghai stock market
Modern capital market theory, built on the efficient market hypothesis (EMH), has increasingly been shown to conflict with reality. Fractal theory, which accounts for the complexity of capital markets and the shortcomings of the EMH, takes a nonlinear paradigm as its analytical basis and explains many market phenomena that efficient market theory cannot, offering new ideas and methods for deeper analysis of capital markets. Taking the Shanghai stock market as an example, this article uses fractal theory to study the market's fractal characteristics; the results show that the Shanghai stock market exhibits clear fractal features.
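Fractality of market returns is often probed via the Hurst exponent; a classical rescaled-range (R/S) estimate is sketched below. The block sizes are illustrative, and this is not necessarily the estimator used in the article; H near 0.5 indicates no long memory, while H > 0.5 indicates persistence:

```python
import numpy as np

def hurst_rs(x, block_sizes=(16, 32, 64, 128, 256)):
    # Classical rescaled-range estimate: mean(R/S) over blocks of size m
    # grows like m**H; H is the slope of the log-log regression.
    avg_rs = []
    for m in block_sizes:
        ratios = []
        for i in range(0, len(x) - m + 1, m):
            block = x[i:i + m]
            z = np.cumsum(block - block.mean())   # cumulative deviations
            if block.std() > 0:
                ratios.append((z.max() - z.min()) / block.std())
        avg_rs.append(np.mean(ratios))
    return np.polyfit(np.log(block_sizes), np.log(avg_rs), 1)[0]

rng = np.random.default_rng(4)
returns = rng.normal(size=4096)   # i.i.d. "returns": H should be near 0.5
H = hurst_rs(returns)
```

Classical R/S is known to be biased upward in small samples, so an estimate slightly above 0.5 on i.i.d. data is expected; applied to real index returns, a markedly larger H is read as evidence of fractal (long-memory) structure.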

19.
Uniformly minimum-variance unbiased (UMVU) estimators of the total risk and the mean-squared-error (MSE) matrix of the Stein estimator for the multivariate normal mean with unknown covariance matrix are proposed. The estimated MSE matrix is helpful in identifying the components which contribute most to the total risk. It also contains information about the performance of the shrinkage estimator with respect to other quadratic loss functions.
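For intuition, the known-variance positive-part James-Stein estimator below exhibits the kind of risk reduction whose MSE the article proposes to estimate; unit covariance is assumed here, unlike the article's unknown-covariance setting:

```python
import numpy as np

rng = np.random.default_rng(5)
p, reps = 10, 2000
theta = np.zeros(p)                      # true mean vector
X = rng.normal(theta, 1.0, (reps, p))    # one observation per replicate

# positive-part James-Stein shrinkage toward the origin (unit covariance)
norm2 = np.sum(X ** 2, axis=1)
factor = np.maximum(1.0 - (p - 2) / norm2, 0.0)
js = factor[:, None] * X

mse_mle = np.mean(np.sum((X - theta) ** 2, axis=1))
mse_js = np.mean(np.sum((js - theta) ** 2, axis=1))
```

With the true mean at the origin the gain is dramatic (total risk near 2 instead of p = 10); the Stein estimator dominates the MLE for every true mean when p > 2.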

20.
In this paper we propose two shrinkage testimators for the reliability of the exponential distribution and study their properties. The optimum shrinkage coefficients for the shrinkage testimators are obtained based on a regret function and the minimax regret criterion. Shrinkage testimators are compared with a preliminary test estimator and with the usual estimator in terms of mean squared error. The proposed shrinkage testimators are shown to be preferable to the preliminary test estimator and the usual estimator when the prior value of mean life is close to the true mean life.
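A toy version of a shrinkage testimator for exponential reliability R(t) = exp(-t/θ): shrink the MLE toward a prior guess θ₀ when a crude preliminary test accepts it. The test, shrinkage coefficient, and parameters are illustrative, not the paper's minimax-regret-optimal choices:

```python
import numpy as np

rng = np.random.default_rng(6)
theta_true, theta0, t = 2.0, 2.0, 1.0   # prior guess equals the true mean life
n, reps, k = 10, 5000, 0.5              # k: shrinkage coefficient

R_true = np.exp(-t / theta_true)
mle = rng.exponential(theta_true, (reps, n)).mean(axis=1)

# shrink toward theta0 whenever a crude preliminary test accepts H0: theta = theta0
accept = np.abs(mle - theta0) < 2.0 * theta0 / np.sqrt(n)
theta_hat = np.where(accept, k * theta0 + (1.0 - k) * mle, mle)

mse_usual = np.mean((np.exp(-t / mle) - R_true) ** 2)
mse_testim = np.mean((np.exp(-t / theta_hat) - R_true) ** 2)
```

When the prior guess coincides with the true mean life, shrinking the accepted estimates toward it can only pull them closer to the truth, which is exactly the regime in which the paper reports the testimators to be preferable.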
