Similar Documents
20 similar documents found (search time: 15 ms)
1.
We present a new statistical framework for landmark curve-based image registration and surface reconstruction. The proposed method first elastically aligns geometric features (continuous, parameterized curves) to compute local deformations, and then uses a Gaussian random field model to estimate the full deformation vector field as a spatial stochastic process on the entire surface or image domain. The statistical estimation is performed using two different methods: maximum likelihood and Bayesian inference via Markov chain Monte Carlo sampling. The resulting deformations accurately match corresponding curve regions while also being sufficiently smooth over the entire domain. We present several qualitative and quantitative evaluations of the proposed method on both synthetic and real data. We apply our approach to two different tasks on real data: (1) multimodal medical image registration, and (2) anatomical and pottery surface reconstruction.
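The Gaussian random field step described above can be illustrated with a small kriging-style sketch: given displacements known only at landmark points, the posterior mean of a Gaussian process with an RBF kernel interpolates a smooth, dense deformation field over the whole domain. This is a simplified illustration under assumed choices (the RBF kernel, the length scale, and the `gp_deformation` helper are all hypothetical), not the authors' full ML/MCMC estimator.

```python
import numpy as np

def gp_deformation(landmarks, displacements, grid, length_scale=0.2, noise=1e-6):
    """Interpolate a dense deformation field from landmark displacements
    via the posterior mean of a Gaussian process (kriging) with an RBF
    kernel. A sketch of the Gaussian-random-field idea, not the paper's
    exact model."""
    def rbf(a, b):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * length_scale ** 2))

    K = rbf(landmarks, landmarks) + noise * np.eye(len(landmarks))
    Ks = rbf(grid, landmarks)
    # Posterior mean of the deformation at every grid point.
    return Ks @ np.linalg.solve(K, displacements)

# Toy example: 4 landmarks on the unit square, all shifted to the right.
lm = np.array([[0.2, 0.2], [0.8, 0.2], [0.2, 0.8], [0.8, 0.8]])
disp = np.tile([0.05, 0.0], (4, 1))
grid = np.stack(np.meshgrid(np.linspace(0, 1, 5),
                            np.linspace(0, 1, 5)), -1).reshape(-1, 2)
field = gp_deformation(lm, disp, grid)
```

The deformation matches the landmark displacements near the landmarks and decays smoothly elsewhere, which is the qualitative behaviour the abstract describes.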

2.
ABSTRACT

We present methods for modeling and estimation of a concurrent functional regression when the predictors and responses are two-dimensional functional datasets. The implementations use spline basis functions, and model fitting is based on smoothing penalties and mixed model estimation. The proposed methods are implemented in available statistical software, allow the construction of confidence intervals for the bivariate model parameters, and can be applied to completely or sparsely sampled responses. The methods are tested in simulations and show favorable results in practice. Their usefulness is illustrated in an application to environmental data.

3.
Recently, Zhang [Simultaneous confidence intervals for several inverse Gaussian populations. Stat Probab Lett. 2014;92:125–131] proposed simultaneous pairwise confidence intervals (SPCIs) based on the fiducial generalized pivotal quantity concept to make inferences about the inverse Gaussian means under heteroscedasticity. In this paper, we propose three new methods for constructing SPCIs to make inferences on the means of several inverse Gaussian distributions when scale parameters and sample sizes are unequal. One of the methods results in a set of classic SPCIs (in the sense that it is not simulation-based inference) and the two others are based on a parametric bootstrap approach. The advantages of our proposed methods over Zhang’s (2014) method are: (i) the simulation results show that the coverage probability of the proposed parametric bootstrap approaches is fairly close to the nominal confidence coefficient while the coverage probability of Zhang’s method is smaller than the nominal confidence coefficient when the number of groups and the variance of groups are large and (ii) the proposed set of classic SPCIs is conservative in contrast to Zhang’s method.

4.
S. Huet, Statistics, 2015, 49(2): 239–266
We propose a procedure to test that the expectation of a Gaussian vector is linear against a nonparametric alternative. We consider the case where the covariance matrix of the observations has a block diagonal structure. This framework encompasses regression models with autocorrelated errors, heteroscedastic regression models, mixed-effects models and growth curves. Our procedure does not depend on any prior information about the alternative. We prove that the test is asymptotically of the nominal level and consistent. We characterize the set of vectors on which the test is powerful and prove the classical √(log log n/n) convergence rate over directional alternatives. We propose a bootstrap version of the test as an alternative to the initial one and provide a simulation study in order to evaluate both procedures for small sample sizes when the purpose is to test goodness of fit in a Gaussian mixed-effects model. Finally, we illustrate the procedures using a real data set.

5.
This paper considers a hierarchical Bayesian analysis of regression models using a class of Gaussian scale mixtures. This class provides a robust alternative to the common use of the Gaussian distribution as a prior distribution in particular for estimating the regression function subject to uncertainty about the constraint. For this purpose, we use a family of rectangular screened multivariate scale mixtures of Gaussian distribution as a prior for the regression function, which is flexible enough to reflect the degrees of uncertainty about the functional constraint. Specifically, we propose a hierarchical Bayesian regression model for the constrained regression function with uncertainty on the basis of three stages of a prior hierarchy with Gaussian scale mixtures, referred to as a hierarchical screened scale mixture of Gaussian regression models (HSMGRM). We describe distributional properties of HSMGRM and an efficient Markov chain Monte Carlo algorithm for posterior inference, and apply the proposed model to real applications with constrained regression models subject to uncertainty.

6.
Autoregressive Forecasting of Some Functional Climatic Variations (cited 4 times: 0 self-citations, 4 by others)
Many variations such as the annual cycle in sea surface temperatures can be considered to be smooth functions and are appropriately described using methods from functional data analysis. This study defines a class of functional autoregressive (FAR) models which can be used as robust predictors for making forecasts of entire smooth functions in the future. The methods are illustrated and compared with pointwise predictors such as SARIMA by applying them to forecasting the entire annual cycle of the climatological El Niño–Southern Oscillation (ENSO) time series one year ahead. Forecasts for the period 1987–1996 suggest that the FAR functional predictors show some promising skill, compared to traditional scalar SARIMA forecasts, which perform poorly.

7.
In this article, we propose a novel approach to fit a functional linear regression in which both the response and the predictor are functions. We consider the case where the response and the predictor processes are both sparsely sampled at random time points and are contaminated with random errors. In addition, the random times are allowed to be different for the measurements of the predictor and the response functions. The aforementioned situation often occurs in longitudinal data settings. To estimate the covariance and the cross‐covariance functions, we use a regularization method over a reproducing kernel Hilbert space. The estimate of the cross‐covariance function is used to obtain estimates of the regression coefficient function and of the functional singular components. We derive the convergence rates of the proposed cross‐covariance, the regression coefficient, and the singular component function estimators. Furthermore, we show that, under some regularity conditions, the estimator of the coefficient function has a minimax optimal rate. We conduct a simulation study and demonstrate merits of the proposed method by comparing it to some other existing methods in the literature. We illustrate the method by an example of an application to a real‐world air quality dataset. The Canadian Journal of Statistics 47: 524–559; 2019 © 2019 Statistical Society of Canada

8.
Data in many experiments arise as curves, so it is natural to take the curve as the basic unit of analysis, as is done in functional data analysis (FDA). Functional curves are encountered when units are observed over time. Although the whole curve itself is not observed, a sufficiently large number of evaluations, as is common with modern recording equipment, is assumed to be available. In this article, we consider statistical inference for the mean functions in the two-sample problem for functional data: assuming the two groups of curves are observed without noise, we test whether they share the same mean function. L2-norm-based and bootstrap-based test statistics are proposed, and the methodology is shown to be flexible. A simulation study and real-data examples are used to illustrate our techniques.
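The L2-norm two-sample idea can be sketched in a few lines: compute the squared L2 distance between the two sample mean curves on a common grid, then calibrate it by resampling curves from the pooled, mean-centred sample. The function names and the bootstrap calibration below are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

def l2_two_sample_stat(X, Y):
    """Squared-L2 distance between the two sample mean curves.
    X, Y: arrays of shape (n_curves, n_gridpoints) on a common grid."""
    return ((X.mean(0) - Y.mean(0)) ** 2).mean()

def bootstrap_pvalue(X, Y, B=500):
    """Approximate the null distribution by resampling curves from the
    pooled, mean-centred sample (a simple curve bootstrap)."""
    t_obs = l2_two_sample_stat(X, Y)
    pooled = np.vstack([X - X.mean(0), Y - Y.mean(0)])
    n, m = len(X), len(Y)
    count = 0
    for _ in range(B):
        idx = rng.integers(0, n + m, n + m)
        if l2_two_sample_stat(pooled[idx[:n]], pooled[idx[n:]]) >= t_obs:
            count += 1
    return count / B

# Toy data: 30 curves per group on a 50-point grid; group 2 is shifted.
t = np.linspace(0, 1, 50)
X = np.sin(2 * np.pi * t) + rng.normal(0, 0.1, (30, 50))
Y = np.sin(2 * np.pi * t) + 0.5 + rng.normal(0, 0.1, (30, 50))
p = bootstrap_pvalue(X, Y)
```

With a clear mean shift, the bootstrap p-value is essentially zero; with equal means it is approximately uniform.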

9.
The threshold diffusion model assumes a piecewise linear drift term and a piecewise smooth diffusion term, which constitutes a rich model for analyzing nonlinear continuous-time processes. We consider the problem of testing for threshold nonlinearity in the drift term. We do this by developing a quasi-likelihood test derived under the working assumption of a constant diffusion term, which circumvents the problem of generally unknown functional form for the diffusion term. The test is first developed for testing for one threshold at which the drift term breaks into two linear functions. We show that under some mild regularity conditions, the asymptotic null distribution of the proposed test statistic is given by the distribution of certain functional of some centered Gaussian process. We develop a computationally efficient method for calibrating the p-value of the test statistic by bootstrapping its asymptotic null distribution. The local power function is also derived, which establishes the consistency of the proposed test. The test is then extended to testing for multiple thresholds. We demonstrate the efficacy of the proposed test by simulations. Using the proposed test, we examine the evidence of nonlinearity in the term structure of a long time series of U.S. interest rates.

10.
This empirical paper presents a number of functional modelling and forecasting methods for predicting very short-term (such as minute-by-minute) electricity demand. The proposed functional methods slice a seasonal univariate time series (TS) into a TS of curves and reduce the dimensionality of the curves by applying functional principal component analysis before using a univariate TS forecasting method and regression techniques. As data points in the daily electricity demand are sequentially observed, a forecast updating method can greatly improve the accuracy of point forecasts. Moreover, we present a non-parametric bootstrap approach to construct and update prediction intervals, and compare the point and interval forecast accuracy with some naive benchmark methods. The proposed methods are illustrated by the half-hourly electricity demand from Monday to Sunday in South Australia.
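The slice-then-reduce step can be sketched as follows: reshape the seasonal series into a matrix with one row per day, extract principal components via an SVD, and forecast the score series. For brevity the sketch uses a naive carry-forward forecast of the last day's scores; the paper uses proper univariate TS forecasters and regression-based updating, so everything named here is an illustrative assumption.

```python
import numpy as np

def fpca_forecast(series, period, n_pc=2):
    """One-period-ahead functional forecast: slice the series into a
    matrix of daily curves, reduce with principal components (via SVD),
    and extrapolate the scores with a naive last-value forecast.
    A minimal sketch of the slice-then-FPCA idea."""
    curves = series.reshape(-1, period)           # one row per day
    mu = curves.mean(0)
    U, s, Vt = np.linalg.svd(curves - mu, full_matrices=False)
    scores = U[:, :n_pc] * s[:n_pc]               # day-level PC scores
    next_scores = scores[-1]                      # naive: repeat last day
    return mu + next_scores @ Vt[:n_pc]

# Toy half-hourly demand: 14 days of 48 observations with a daily cycle.
t = np.arange(14 * 48)
demand = 100.0 + 10.0 * np.sin(2 * np.pi * (t % 48) / 48)
forecast = fpca_forecast(demand, period=48)       # next day's 48-point curve
```

Replacing the naive score forecast with, say, an ARIMA fit per score series recovers the structure of the methods the abstract describes.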

11.
Laplace motion is a Lévy process built upon Laplace distributions. Non-Gaussian stochastic fields that are integrals with respect to this process are considered, and methods for fitting such models are discussed. The proposed procedures allow for inference about the parameters of the underlying Laplace distributions; fitting the dependence structure is also addressed. The importance of a convenient parameterization that admits natural and consistent estimation for this class of models is emphasized. Several parameterizations are introduced and their relative advantages discussed. The proposed estimation method targets the standard characteristics: mean, variance, skewness and kurtosis. Their sample equivalents are matched as closely as the natural constraints within this class allow. A simulation study and an example of potential applications conclude the article.

12.
The most common forecasting methods in business are based on exponential smoothing, and the most common time series in business are inherently non‐negative. Therefore it is of interest to consider the properties of the potential stochastic models underlying exponential smoothing when applied to non‐negative data. We explore exponential smoothing state space models for non‐negative data under various assumptions about the innovations, or error, process. We first demonstrate that prediction distributions from some commonly used state space models may have an infinite variance beyond a certain forecasting horizon. For multiplicative error models that do not have this flaw, we show that sample paths will converge almost surely to zero even when the error distribution is non‐Gaussian. We propose a new model with similar properties to exponential smoothing, but which does not have these problems, and we develop some distributional properties for our new model. We then explore the implications of our results for inference, and compare the short‐term forecasting performance of the various models using data on the weekly sales of over 300 items of costume jewelry. The main findings of the research are that the Gaussian approximation is adequate for estimation and one‐step‐ahead forecasting. However, as the forecasting horizon increases, the approximate prediction intervals become increasingly problematic. When the model is to be used for simulation purposes, a suitably specified scheme must be employed.
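The multiplicative-error state space model behind simple exponential smoothing can be simulated directly, which is a convenient way to explore the sample-path behaviour discussed above. The sketch below simulates a standard ETS(M,N,N) recursion; the truncation of the errors at −1 (to keep the path non-negative) is an assumption for illustration, not the authors' specification.

```python
import numpy as np

def simulate_ets_mnn(l0, alpha, sigma, horizon, rng):
    """Simulate one sample path of the multiplicative-error, no-trend,
    no-season state space model ETS(M,N,N):
        y_t = l_{t-1} * (1 + e_t),   l_t = l_{t-1} * (1 + alpha * e_t),
    with e_t ~ N(0, sigma^2) truncated below at -1 so that the path
    stays positive. A sketch for exploring path behaviour, not the
    paper's estimation scheme."""
    level, path = l0, []
    for _ in range(horizon):
        e = max(rng.normal(0.0, sigma), -0.999)
        path.append(level * (1 + e))       # observation
        level *= 1 + alpha * e             # level update
    return np.array(path)

rng = np.random.default_rng(1)
path = simulate_ets_mnn(l0=100.0, alpha=0.3, sigma=0.2, horizon=2000, rng=rng)
```

Long simulated paths from this model drift toward zero on the log scale, in line with the almost-sure convergence result quoted in the abstract.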

13.
Many fields of research need to classify individual systems based on one or more data series, which are obtained by sampling an unknown continuous curve with noise. In other words, the underlying process is an unknown function which the observed variables represent only imperfectly. Although functional logistic regression has many attractive features for this classification problem, the method is applicable only when the number of individuals to be classified (or available to estimate the model) is large compared to the number of curves sampled per individual. To overcome this limitation, we use penalized optimal scoring to construct a new method for the classification of multi-dimensional functional data. The proposed method consists of two stages. First, the series of observed discrete values available for each individual are expressed as a set of continuous curves. Next, the penalized optimal scoring model is estimated on the basis of these curves. A similar penalized optimal scoring method was described in my previous work, but that model is not suitable for the analysis of continuous functions. In this paper, we adopt a Gaussian kernel approach to extend the previous model. The high accuracy of the new method is demonstrated in Monte Carlo simulations, and the method is then used to predict defaulting firms on the Japanese Stock Exchange.

14.
Single-index conditional quantile regression is proposed in order to overcome the dimensionality problem in nonparametric quantile regression. In the proposed method, the Bayesian elastic net is employed in single-index quantile regression for estimation and variable selection. A Gaussian process prior is placed on the unknown link function, and a Gibbs sampler is adopted for posterior inference. The results of the simulation studies and a numerical example indicate that our proposed method, BENSIQReg, offers substantial improvements over two existing methods, SIQReg and BSIQReg: it consistently shows good convergence and attains the smallest median of mean absolute deviations and the smallest standard deviations of the three methods.

15.
We propose a density-tempered marginalized sequential Monte Carlo (SMC) sampler, a new class of samplers for full Bayesian inference of general state-space models. The dynamic states are approximately marginalized out using a particle filter, and the parameters are sampled via a sequential Monte Carlo sampler over a density-tempered bridge between the prior and the posterior. Our approach delivers exact draws from the joint posterior of the parameters and the latent states for any given number of state particles and is thus easily parallelizable in implementation. We also build into the proposed method a device that can automatically select a suitable number of state particles. Since the method incorporates sample information in a smooth fashion, it delivers good performance in the presence of outliers. We check the performance of the density-tempered SMC algorithm using simulated data based on a linear Gaussian state-space model with and without misspecification. We also apply it on real stock prices using a GARCH-type model with microstructure noise.

16.
In this paper, functional coefficient autoregressive (FAR) models proposed by Chen and Tsay (1993) are considered. We propose a diagnostic statistic for FAR models constructed by comparing parametric and nonparametric estimators of the functional form of the FAR model. We derive the asymptotic properties of our statistic and show that it can be applied to estimating the delay parameter and specifying the functional form of FAR models.

17.
Abstract. We review and extend some statistical tools that have proved useful for analysing functional data. Functional data analysis primarily is designed for the analysis of random trajectories and infinite‐dimensional data, and there exists a need for the development of adequate statistical estimation and inference techniques. While this field is in flux, some methods have proven useful. These include warping methods, functional principal component analysis, and conditioning under Gaussian assumptions for the case of sparse data. The latter is a recent development that may provide a bridge between functional and more classical longitudinal data analysis. Besides presenting a brief review of functional principal components and functional regression, we develop some concepts for estimating functional principal component scores in the sparse situation. An extension of the so‐called generalized functional linear model to the case of sparse longitudinal predictors is proposed. This extension includes functional binary regression models for longitudinal data and is illustrated with data on primary biliary cirrhosis.

18.
Summary.  The retrieval of wind vectors from satellite scatterometer observations is a non-linear inverse problem. A common approach to solving inverse problems is to adopt a Bayesian framework and to infer the posterior distribution of the parameters of interest given the observations by using a likelihood model relating the observations to the parameters, and a prior distribution over the parameters. We show how Gaussian process priors can be used efficiently with a variety of likelihood models, using local forward (observation) models and direct inverse models for the scatterometer. We present an enhanced Markov chain Monte Carlo method to sample from the resulting multimodal posterior distribution. We go on to show how the computational complexity of the inference can be controlled by using a sparse, sequential Bayes algorithm for estimation with Gaussian processes. This helps to overcome the most serious barrier to the use of probabilistic, Gaussian process methods in remote sensing inverse problems, which is the prohibitively large size of the data sets. We contrast the sampling results with the approximations that are found by using the sparse, sequential Bayes algorithm.

19.
Restrictions on the risk-pricing in dynamic term structure models (DTSMs) tighten the link between cross-sectional and time-series variation of interest rates, and make absence of arbitrage useful for inference about expectations. This article presents a new econometric framework for estimation of affine Gaussian DTSMs under restrictions on risk prices, which addresses the issues of a large model space and of model uncertainty using a Bayesian approach. A simulation study demonstrates the good performance of the proposed method. Data on U.S. Treasury yields call for tight restrictions on risk pricing: only level risk is priced, and only changes in the slope affect term premia. Incorporating the restrictions changes the model-implied short-rate expectations and term premia. Interest rate persistence is higher than in a maximally flexible model, hence expectations of future short rates are more variable; restrictions on risk prices help resolve the puzzle of implausibly stable short-rate expectations in this literature. Consistent with survey evidence and conventional macro wisdom, restricted models attribute a large share of the secular decline in long-term interest rates to expectations of future nominal short rates. Supplementary materials for this article are available online.

20.
Structural econometric auction models with explicit game-theoretic modeling of bidding strategies have been quite a challenge from a methodological perspective, especially within the common value framework. We develop a Bayesian analysis of the hierarchical Gaussian common value model with stochastic entry introduced by Bajari and Hortaçsu. A key component of our approach is an accurate and easily interpretable analytical approximation of the equilibrium bid function, resulting in a fast and numerically stable evaluation of the likelihood function. We extend the analysis to situations with positive valuations using a hierarchical gamma model. We use a Bayesian variable selection algorithm that simultaneously samples the posterior distribution of the model parameters and does inference on the choice of covariates. The methodology is applied to simulated data and to a newly collected dataset from eBay with bids and covariates from 1000 coin auctions. We demonstrate that the Bayesian algorithm is very efficient and that the approximation error in the bid function has virtually no effect on the model inference. Both models fit the data well, but the Gaussian model outperforms the gamma model in an out-of-sample forecasting evaluation of auction prices. This article has supplementary material online.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号