Similar Articles
A total of 20 similar articles were retrieved.
1.
In this paper, we discuss an approach to the pattern synthesis and analysis of biological shapes which show a lot of variability but, at the same time, a characteristic structure. This structure is captured by way of ‘shape classes’ which are constructed via a deformable template. Examples of pattern analysis in both two and three dimensions are presented. We define what we call intrinsic and extrinsic understanding of images and apply this to the detection of abnormalities.

2.
There are many well-known methods for classification of linear data with both known and unknown distributions. Here, we deal with classification involving data on the torus and the cylinder. A new method involving a generalized likelihood ratio test is developed for classification into two populations using directional data. The approach assumes that one of the probabilities of misclassification is known. The procedure is constructed by applying the Gibbs sampler to the conditionally specified distribution. A parametric bootstrap approach is also presented. An application to data involving linear and circular measurements on human skulls from two tribal populations is given.

3.
A nonasymptotic Bayesian approach is developed for analysis of data from threshold autoregressive processes with two regimes. Using the conditional likelihood function, the marginal posterior distribution for each of the parameters is derived along with posterior means and variances. A test for linear functions of the autoregressive coefficients is presented. The approach presented uses a posterior p-value averaged over the values of the threshold. The one-step ahead predictive distribution is derived along with the predictive mean and variance. In addition, equivalent results are derived conditional upon a value of the threshold. A numerical example is presented to illustrate the approach.

4.
We investigate a Bayesian method for the segmentation of muscle fibre images. The images are reasonably well approximated by a Dirichlet tessellation, and so we use a deformable template model based on Voronoi polygons to represent the segmented image. We consider various prior distributions for the parameters and suggest an appropriate likelihood. Following the Bayesian paradigm, the mathematical form of the posterior distribution is obtained (up to an integrating constant). We introduce a Metropolis-Hastings algorithm and a reversible jump Markov chain Monte Carlo (RJMCMC) algorithm for simulation from the posterior when the number of polygons is fixed or unknown. The particular moves in the RJMCMC algorithm are birth, death and position/colour changes of the point process which determines the location of the polygons. Segmentation of the true image is carried out using the estimated posterior mode and posterior mean. A simulation study is presented which is helpful for tuning the hyperparameters and assessing the accuracy. The algorithms work well on a real image of a muscle fibre cross-section, and an additional parameter, which models the boundaries of the muscle fibres, is included in the final model.
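A deterministic building block of such a Voronoi-polygon template is the nearest-generator rule: each pixel takes the colour of the closest generating point of the point process. A minimal sketch of that assignment step (plain Python; the priors, likelihood and RJMCMC moves of the abstract are not reproduced here, and the (x, y, colour) generator format is our assumption):

```python
def voronoi_labels(width, height, generators):
    """Colour each pixel by the Voronoi (Dirichlet) cell it falls in:
    assign it the colour of its nearest generating point.
    `generators` is a list of (x, y, colour) triples."""
    labels = [[None] * width for _ in range(height)]
    for py in range(height):
        for px in range(width):
            # squared Euclidean distance is enough for a nearest-point test
            best = min(generators,
                       key=lambda g: (g[0] - px) ** 2 + (g[1] - py) ** 2)
            labels[py][px] = best[2]
    return labels
```

An RJMCMC birth or position move would change `generators` and re-run this assignment to evaluate the likelihood of the proposed tessellation.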

5.
We propose a mixture modelling framework for both identifying and exploring the nature of genotype-trait associations. This framework extends the classical mixed effects modelling approach for this setting by incorporating a Gaussian mixture distribution for random genotype effects. The primary advantages of this paradigm over existing approaches are that the mixture modelling framework addresses the degrees-of-freedom challenge inherent in application of the usual fixed effects analysis of covariance, relaxes the restrictive single normal distribution assumption of the classical mixed effects models, and offers an exploratory framework for discovery of underlying structure across multiple genetic loci. An application to data arising from a study of antiretroviral-associated dyslipidaemia in human immunodeficiency virus infection is presented. Extensive simulation studies are also implemented to investigate the performance of this approach.

6.
This paper is about object deformations observed throughout a sequence of images. We present a statistical framework in which the observed images are defined as noisy realizations of a randomly deformed template image. In this framework, we focus on the problem of estimating parameters related to the template and deformations. Our main motivation is the construction of an estimation framework and algorithm which can be applied to short sequences of complex, high-dimensional images. The originality of our approach lies in the representations of the template and deformations, which are defined on a common triangulated domain adapted to the geometry of the observed images. In this way, we have joint representations of the template and deformations which are compact and parsimonious. Using such representations, we are able to drastically reduce the number of parameters in the model. In addition, we adapt to our framework the Stochastic Approximation EM algorithm combined with a Markov chain Monte Carlo procedure, proposed in 2004 by Kuhn and Lavielle. Our implementation of this algorithm takes advantage of some properties which are specific to our framework. More precisely, we use the Markovian properties of deformations to build an efficient simulation strategy based on a Metropolis-Hastings-within-Gibbs sampler. Finally, we present some experiments on sequences of medical images and synthetic data.

7.
In some situations, the distribution of the error terms of a multivariate linear regression model may depart from normality. This problem has been addressed, for example, by specifying a different parametric distribution family for the error terms, such as multivariate skewed and/or heavy-tailed distributions. A new solution is proposed, which is obtained by modelling the error term distribution through a finite mixture of multi-dimensional Gaussian components. The multivariate linear regression model is studied under this assumption. Identifiability conditions are proved and maximum likelihood estimation of the model parameters is performed using the EM algorithm. The number of mixture components is chosen through model selection criteria; when this number is equal to one, the proposal reduces to the classical approach. The performance of the proposed approach is evaluated through Monte Carlo experiments and compared with that of other approaches. Finally, the results obtained from the analysis of a real dataset are presented.
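As a simplified illustration of the EM machinery involved, the sketch below fits a univariate finite Gaussian mixture by EM. The paper's setting is a multivariate regression error term, so this is a stand-in on a single variable, and the quantile-based initialisation of the means is our assumption, not taken from the paper:

```python
import math

def normal_pdf(x, mu, var):
    """Density of N(mu, var) at x."""
    return math.exp(-(x - mu) ** 2 / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

def em_gaussian_mixture(x, k=2, iters=100):
    """Fit a k-component univariate Gaussian mixture to the sample x by EM,
    initialising the means at evenly spaced sample quantiles.
    Returns (weights, means, variances)."""
    n = len(x)
    xs = sorted(x)
    mu = [xs[int((j + 0.5) * n / k)] for j in range(k)]
    xbar = sum(x) / n
    var = [sum((xi - xbar) ** 2 for xi in x) / n] * k
    w = [1.0 / k] * k
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point
        resp = []
        for xi in x:
            p = [w[j] * normal_pdf(xi, mu[j], var[j]) for j in range(k)]
            s = sum(p)
            resp.append([pj / s for pj in p])
        # M-step: responsibility-weighted updates of weights, means, variances
        for j in range(k):
            nj = sum(r[j] for r in resp)
            w[j] = nj / n
            mu[j] = sum(r[j] * xi for r, xi in zip(resp, x)) / nj
            var[j] = max(sum(r[j] * (xi - mu[j]) ** 2
                             for r, xi in zip(resp, x)) / nj, 1e-9)
    return w, mu, var
```

In the regression version described in the abstract, the E- and M-steps would act on the residual vectors rather than on raw observations, with the regression coefficients updated alongside the mixture parameters.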

8.
We investigate estimation and testing procedures for the k-sample problem where each of the populations is subject to random truncation by possibly different but known truncation functions. Particular attention is focused on the two-sample case, which is motivated by the following important application. Neutrinos were detected from Supernova 1987A at two sites: the IMB detector in Ohio (eight neutrinos observed) and the Kamiokande II detector in Japan (twelve observed). Each detector has different "trigger efficiencies": the chance of observing the flash of light produced by the neutrino knocking an electron loose from an atom. Thus, we have two independent samples of randomly truncated data. We assume a normal model for some power transformation of the data, with the same power for each sample. We estimate the parameters of this distribution by maximum likelihood and find confidence regions for the parameters. A Monte Carlo study investigates the properties of the maximum likelihood estimators for this neutrino example. The simulations show that approximate likelihood-based confidence regions provide coverage much closer to the nominal level than regions based on asymptotic normal theory.

9.
A new model combining parametric and semi-parametric approaches, following the lines of a semi-Markov model, is developed for multi-stage processes. A bivariate sojourn time distribution derived from the bivariate exponential distribution of Marshall & Olkin (1967) is adopted. The results compare favourably with the usual semi-parametric approaches that have been in use. Our approach also has several advantages over the models in use, including its amenability to statistical inference. For example, tests for symmetry and for independence of the marginals of the sojourn time distributions, which were not available earlier, can now be conveniently derived in elegant forms. A unified goodness-of-fit test procedure for our proposed model is also presented. An application to human resource planning involving real-life data from the University of Nigeria is given.

10.
The folded normal distribution arises as the distribution of the modulus of a normal variable. In the present article, we formulate the cumulative distribution function (cdf) of a folded normal distribution in terms of the standard normal cdf and the parameters of the mother normal distribution. Although cdf values of the folded normal distribution were tabulated earlier in the literature, we show that those values are valid only in very particular situations. We also provide a simple approach to obtain the values of the parameters of the mother normal distribution from those of the folded normal distribution. These results find ample application in practice, for example, in obtaining the so-called upper and lower α-points of the folded normal distribution, which, in turn, are useful in testing hypotheses relating to the folded normal distribution and in designing control charts for some process capability indices. A thorough study has been made to compare the performance of the newly developed theory with existing results. Some simulated as well as real-life examples are discussed to supplement the theory developed in this article. R code for the theory developed here is also presented for ease of application.
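The cdf relation described here has a standard closed form: if X ~ N(μ, σ²) and Y = |X|, then F_Y(y) = Φ((y−μ)/σ) + Φ((y+μ)/σ) − 1 for y ≥ 0, where Φ is the standard normal cdf. A small Python sketch of that formula (the article itself supplies R code; this translation is ours):

```python
import math

def norm_cdf(z):
    """Standard normal cdf via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def folded_normal_cdf(y, mu, sigma):
    """cdf of Y = |X|, X ~ N(mu, sigma^2), expressed through the standard
    normal cdf and the parameters of the mother normal distribution:
    F(y) = Phi((y - mu)/sigma) + Phi((y + mu)/sigma) - 1,  y >= 0."""
    if y < 0:
        return 0.0
    return norm_cdf((y - mu) / sigma) + norm_cdf((y + mu) / sigma) - 1.0
```

Note the formula is symmetric in the sign of μ, consistent with folding, and for μ = 0 it reduces to the half-normal cdf 2Φ(y/σ) − 1. An upper α-point can then be found by numerically inverting this cdf.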

11.
In this paper, measurements from experiments and results of a finite element analysis (FEA) are combined in order to compute accurate empirical models for the temperature distribution before a thermomechanically coupled forming process. To accomplish this, Design and Analysis of Computer Experiments (DACE) is used to separately compute models for the measurements and the functional output of the FEA. Based on a hierarchical approach, a combined model of the process is computed. In this combined modelling approach, the model for the FEA is corrected by taking into account the systematic deviations from the experimental measurements. The large number of observations based on the functional output hinders the direct computation of the DACE models due to the internal inversion of the correlation matrix. Thus, different techniques for identifying a relevant subset of the observations are proposed. The application of the resulting procedure is presented, and a statistical validation of the empirical models is performed.

12.
The authors derive empirical likelihood confidence regions for the comparison distribution of two populations whose distributions are to be tested for equality using random samples. Another application they consider is to ROC curves, which are used to compare measurements of a diagnostic test from two populations. The authors investigate the smoothed empirical likelihood method for estimation in this context, and empirical-likelihood-based confidence intervals are obtained by means of the Wilks theorem. A bootstrap approach allows for the construction of confidence bands. The method is illustrated with a data analysis and a simulation study.

13.
Measurement error, the difference between the measured (observed) value of a quantity and its true value, is perceived as a possible source of estimation bias in many surveys. To correct for such bias, a validation sample can be used in addition to the original sample for adjustment of measurement error. Depending on the type of validation sample, we can use either the internal calibration approach or the external calibration approach. Motivated by the Korean Longitudinal Study of Aging (KLoSA), we propose a novel application of fractional imputation to correct for measurement error in the analysis of survey data. The proposed method creates imputed values of the unobserved true variables, which are mismeasured in the main study, by using the validation subsample. Furthermore, the proposed method is directly applicable when the measurement error model is a mixture distribution. Variance estimation using Taylor linearization is developed. Results from a limited simulation study are also presented.

14.
In this article, we present various distributional properties of the Govindarajulu distribution and its application to reliability analysis. A quantile-based analysis is performed, as the distribution function is not analytically tractable. Properties of the distribution such as percentiles, L-moments, L-skewness and kurtosis, and order statistics are presented. Various reliability characteristics are derived, along with some characterization theorems based on relationships between reliability measures. We also make a comparative study with other competing models with reference to real data.
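A quantile-based analysis works directly from the quantile function Q(u). For the Govindarajulu distribution, the form usually given in the quantile-function literature is Q(u) = θ + σ((β+1)u^β − βu^(β+1)); that form, and the symbols θ, σ, β, are an assumption drawn from that literature rather than from the abstract itself. A sketch of reading percentiles off Q:

```python
def govindarajulu_quantile(u, sigma, beta, theta=0.0):
    """Quantile function Q(u) = theta + sigma*((beta + 1)*u**beta
    - beta*u**(beta + 1)), 0 <= u <= 1 (form assumed from the
    quantile-function literature). The distribution is handled through Q
    because its cdf has no closed analytical form."""
    return theta + sigma * ((beta + 1.0) * u ** beta
                            - beta * u ** (beta + 1.0))

def percentile(p, sigma, beta, theta=0.0):
    """100p-th percentile, read directly off the quantile function."""
    return govindarajulu_quantile(p, sigma, beta, theta)
```

Under this form the support is [θ, θ + σ], since Q(0) = θ and Q(1) = θ + σ, and Q is strictly increasing on (0, 1) for β > 0.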

15.
We suggest pivotal methods for constructing simultaneous bootstrap confidence bands in regression. Most attention is given to the problem of simple linear regression, but our techniques admit trivial extension to other cases, including polynomial regression. The advantages of our bootstrap approach are twofold. Firstly, the bootstrap allows a very general distribution for the errors, and secondly, it admits a wide variety of shapes for the confidence band. In our technique the shape of each envelope of the band is determined by a general template, chosen by the experimenter, and bootstrap methods are used to select the scale of the template.
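A minimal residual-bootstrap sketch of the template-scaling idea for simple linear regression. The template function and evaluation grid are the experimenter's choice, as the abstract says; the specific details below (residual resampling and a max-deviation statistic) are our assumption about one natural implementation, not the authors' exact algorithm:

```python
import random

def ols(x, y):
    """Ordinary least squares fit y ~ a + b*x; returns (a, b)."""
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    b = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) \
        / sum((xi - xbar) ** 2 for xi in x)
    return ybar - b * xbar, b

def bootstrap_band_scale(x, y, template, grid, B=500, level=0.95, seed=1):
    """Choose the scale c so that fitted(x) +/- c*template(x) covers the
    regression line simultaneously over `grid` with approximate
    probability `level`, using a residual bootstrap."""
    rng = random.Random(seed)
    a, b = ols(x, y)
    res = [yi - (a + b * xi) for xi, yi in zip(x, y)]
    stats = []
    for _ in range(B):
        # resample residuals onto the fitted line, refit, and record the
        # largest template-relative deviation over the grid
        ystar = [a + b * xi + rng.choice(res) for xi in x]
        astar, bstar = ols(x, ystar)
        stats.append(max(abs((astar + bstar * g) - (a + b * g)) / template(g)
                         for g in grid))
    stats.sort()
    return stats[int(level * B)]
```

A constant template gives a band of uniform width; a template proportional to the pointwise standard error of the fit gives the familiar hourglass shape.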

16.
Estimating the parameters of multivariate mixed Poisson models is an important problem in image processing applications, especially for active imaging or astronomy. The classical maximum likelihood approach cannot be used for these models, since the corresponding probability masses cannot be expressed in a simple closed form. This paper studies a maximum pairwise likelihood approach to estimating the parameters of multivariate mixed Poisson models when the mixing distribution is a multivariate Gamma distribution. The consistency and asymptotic normality of this estimator are derived. Simulations conducted on synthetic data illustrate these results and show that the proposed estimator outperforms classical estimators based on the method of moments. An application to change detection in low-flux images is also investigated.

17.
This paper describes a technique for building compact models of the shape and appearance of flexible objects seen in two-dimensional images. The models are derived from the statistics of sets of images of example objects with ‘landmark’ points labelled on each object. Each model consists of a flexible shape template, describing how the landmark points can vary, and a statistical model of the expected grey levels in regions around each point. Such models have proved useful in a wide variety of applications. We describe how the models can be used in local image search and give examples of their application.

19.
This paper presents three methods for estimating Weibull distribution parameters for the case of irregular-interval grouped failure data with unknown failure times. The methods are based on the concepts of the piecewise linear distribution function (PLDF), an average interval failure rate (AIFR) and sequential updating of the distribution function (SUDF), and use an analytical approach similar to that of Ackoff and Sasieni for regular-interval grouped data. Results from a large number of simulated case problems generated with specified values of the Weibull distribution parameters are presented, which clearly indicate that the SUDF method produces near-perfect parameter estimates for all types of failure pattern. The performance of the PLDF and AIFR methods has been evaluated by goodness-of-fit testing and statistical confidence limits on the shape parameter. It has been found that, while the PLDF method produces acceptable parameter estimates, the AIFR method may fail for low and high shape parameter values, which represent the cases of random and wear-out failures. A real-life application of the proposed methods is also presented, analysing failures of hydrogen make-up compressor valves in a petroleum refinery.
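As a rough companion to the grouped-data setting (this is not the PLDF, AIFR or SUDF estimator itself), Weibull parameters can be fitted by least squares on the Weibull plot of the piecewise-linear empirical cdf built from interval counts:

```python
import math

def weibull_plot_fit(edges, counts):
    """Least-squares fit of Weibull shape (beta) and scale (eta) on the
    Weibull plot ln(-ln(1 - F)) = beta*ln(t) - beta*ln(eta), where F is the
    piecewise-linear empirical cdf evaluated at the upper edge of each
    (possibly irregular) failure interval."""
    n = float(sum(counts))
    pts = []
    cum = 0
    for t, c in zip(edges, counts):
        cum += c
        F = cum / n
        if 0.0 < F < 1.0:  # endpoints are undefined on the Weibull plot
            pts.append((math.log(t), math.log(-math.log(1.0 - F))))
    m = len(pts)
    xbar = sum(p[0] for p in pts) / m
    ybar = sum(p[1] for p in pts) / m
    beta = sum((a - xbar) * (b - ybar) for a, b in pts) \
        / sum((a - xbar) ** 2 for a, _ in pts)
    eta = math.exp(xbar - ybar / beta)  # intercept is -beta*ln(eta)
    return beta, eta
```

The linearised form follows from the Weibull cdf F(t) = 1 − exp(−(t/η)^β), so points from grouped data that truly follow a Weibull law fall near a straight line of slope β.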

20.
Traditional techniques for calculating control limits for processes with discrete responses are based on the Poisson distribution. However, for many processes, the assumption of a Poisson distribution is violated. In such cases, use of traditional Poisson control limits may result in an inflated risk of Type I error. The negative binomial distribution is a natural extension of the Poisson distribution and allows for over-dispersion relative to the Poisson distribution. A simple approach to calculating exact and approximate control limits for count data based on the negative binomial distribution is described. The approach is illustrated by application to water bacteria count data taken from a water purification system. Copyright © 2003 John Wiley & Sons, Ltd.
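The exact-limit calculation amounts to inverting the negative binomial cdf at the two tail probabilities. A sketch for integer r, using the parameterisation "number of failures k before the r-th success, success probability p" (the article's parameterisation may differ):

```python
import math

def nb_pmf(k, r, p):
    """Negative binomial pmf: probability of k failures before the r-th
    success, success probability p (integer r in this sketch)."""
    return math.comb(k + r - 1, k) * p ** r * (1 - p) ** k

def nb_control_limits(r, p, alpha=0.0027):
    """Exact two-sided control limits for an NB(r, p) count:
    LCL is the largest value with P(X < LCL) <= alpha/2, and
    UCL is the smallest value with P(X <= UCL) >= 1 - alpha/2."""
    target_low, target_high = alpha / 2, 1 - alpha / 2
    cdf, k, lcl = 0.0, 0, 0
    while True:
        cdf += nb_pmf(k, r, p)
        if cdf <= target_low:
            lcl = k + 1
        if cdf >= target_high:
            return lcl, k
        k += 1
```

For small means the lower limit is often 0, i.e. no usable lower control limit exists, which mirrors the behaviour of exact Poisson limits.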
