Similar Articles
20 similar articles found (search time: 15 ms)
1.
This paper describes a technique for building compact models of the shape and appearance of flexible objects seen in two-dimensional images. The models are derived from the statistics of sets of images of example objects with 'landmark' points labelled on each object. Each model consists of a flexible shape template, describing how the landmark points can vary, and a statistical model of the expected grey levels in regions around each point. Such models have proved useful in a wide variety of applications. We describe how the models can be used in local image search and give examples of their application.
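The flexible-shape-template idea can be sketched as a principal-component model of landmark vectors. The training shapes and the number of retained modes below are made up for illustration, and the grey-level model and Procrustes alignment steps of the actual method are omitted:

```python
import numpy as np

rng = np.random.default_rng(0)

# 30 hypothetical training shapes, each with 10 (x, y) landmarks,
# flattened to 20-vectors (real data would be aligned first).
shapes = rng.normal(size=(30, 20))

mean_shape = shapes.mean(axis=0)
cov = np.cov(shapes - mean_shape, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)

# Keep the modes explaining most variance (eigh returns ascending order).
order = np.argsort(eigvals)[::-1]
P = eigvecs[:, order[:5]]                      # first 5 modes of variation
b = rng.normal(size=5) * np.sqrt(eigvals[order[:5]])

# A new plausible shape: mean plus weighted modes, x = x_bar + P b.
new_shape = mean_shape + P @ b
print(new_shape.shape)  # (20,)
```

Varying the coefficients `b` within a few standard deviations of each mode generates the family of shapes the template can represent.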

2.
3.
This paper describes methodology for carrying out local transformations of a set of landmark points describing facial features, so that they provide a good match to a given set of facial landmark points: the landmark points are divided into groups of 'similar' points that take facial symmetry into account, and the coordinate values of the points within each group are differentially stretched and contracted. The methodology is illustrated by application to sets of 195 facial landmark points.
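A much-simplified sketch of the stretch-and-contract step: one group of source landmarks is rescaled per axis so its spread matches a target group's. The coordinates are invented, and the paper's grouping scheme and symmetry constraints are not reproduced:

```python
import numpy as np

# Hypothetical source and target landmark groups (x, y per row).
source = np.array([[0., 0.], [1., 0.2], [2., -0.1]])
target = np.array([[0., 0.], [1.5, 0.3], [3., -0.2]])

centred = source - source.mean(axis=0)
scale = target.std(axis=0) / source.std(axis=0)   # per-axis stretch factors
adjusted = centred * scale + target.mean(axis=0)  # stretch, then re-centre

print(np.allclose(adjusted.std(axis=0), target.std(axis=0)))  # True
```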

4.
Data for studies of biological shape often consist of the locations of individually named points ('landmarks') considered to be homologous (to correspond biologically) from form to form. In 1917 D'Arcy Thompson introduced an elegant model of homology as deformation: the configuration of landmark locations for any one form is viewed as a finite sample from a smooth mapping representing its biological relationship to any other form of the data set. For data in two dimensions, multivariate statistical analysis of landmark locations may proceed unambiguously in terms of complex-valued shape coordinates ζ = (C − A)/(B − A) for sets of landmark triangles ABC. These are the coordinates of one vertex/landmark after scaling so that the remaining two vertices are at (0,0) and (1,0). Expressed in this fashion, the biological interpretation of the statistical analysis as a homology mapping would appear to depend on the triangulation. This paper introduces an analysis of landmark data and homology mappings using a hierarchy of geometric components of shape difference or shape change. Each component is a smooth deformation taking the form of a bivariate polynomial in the shape coordinates and is estimated in a manner nearly invariant with respect to the choice of a triangulation.
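The shape coordinate ζ = (C − A)/(B − A) is easy to verify numerically: treating landmarks as complex numbers, the coordinate is unchanged by any similarity transform (translation, rotation, scaling). The triangle below is made up:

```python
import numpy as np

# A hypothetical landmark triangle as complex numbers.
A, B, C = 1 + 2j, 4 + 2j, 2 + 5j

# Map A -> 0 and B -> 1; the image of C is the shape coordinate.
zeta = (C - A) / (B - A)

# Apply a similarity transform (rotate, scale, translate) to all three
# landmarks: the shape coordinate is unchanged.
s = 0.5 * np.exp(1j * 0.7)                 # rotation + scaling factor
A2, B2, C2 = s * A + 3j, s * B + 3j, s * C + 3j
zeta2 = (C2 - A2) / (B2 - A2)

print(np.isclose(zeta, zeta2))  # True
```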

5.
One of the important topics in morphometry that has received considerable attention recently is the longitudinal analysis of shape variation. According to Kendall's definition of shape, the shape of an object lies in a non-Euclidean space, which makes the longitudinal study of configurations somewhat difficult. To simplify this task, this paper pursues triangulation of the objects and then the construction of a non-parametric regression-type model on the unit sphere. The prediction of the configurations at some time instances is done using both properties of the triangulation and the sizes of the great baselines. Moreover, minimizing a Euclidean risk function is proposed to select feasible weights when constructing smoother functions in a non-parametric smoothing manner. These provide proper shape growth models for analysing objects varying in time. The proposed models are applied to the analysis of two real-life data sets.

6.
Hidden Markov models form an extension of mixture models which provides a flexible class of models exhibiting dependence and a possibly large degree of variability. We show how reversible jump Markov chain Monte Carlo techniques can be used to estimate the parameters as well as the number of components of a hidden Markov model in a Bayesian framework. We employ a mixture of zero-mean normal distributions as our main example and apply this model to three sets of data from finance, meteorology and geomagnetism.
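The generative model being estimated can be sketched by simulating a two-state hidden Markov chain with zero-mean normal emissions. The transition matrix and standard deviations are arbitrary illustrative values; the reversible jump MCMC estimation itself is not reproduced:

```python
import numpy as np

rng = np.random.default_rng(4)
P = np.array([[0.95, 0.05],
              [0.10, 0.90]])           # made-up transition matrix
sigmas = np.array([0.5, 2.0])          # state-dependent std devs (zero mean)

T = 500
states = np.zeros(T, dtype=int)
for t in range(1, T):
    states[t] = rng.choice(2, p=P[states[t - 1]])

# Observations: zero-mean normal with the current state's std dev.
y = rng.normal(0.0, sigmas[states])
print(y.shape)  # (500,)
```

The dependence in `states` is what distinguishes this from an ordinary two-component mixture.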

7.
Survival models have been extensively used to analyse time-until-event data. There is a range of extended models that incorporate different aspects, such as overdispersion/frailty, mixtures, and flexible response functions through semi-parametric models. In this work, we show how a useful tool to assess goodness-of-fit, the half-normal plot of residuals with a simulated envelope, implemented in the hnp package in R, can be used on a location-scale modelling context. We fitted a range of survival models to time-until-event data, where the event was an insect predator attacking a larva in a biological control experiment. We started with the Weibull model and then fitted the exponentiated-Weibull location-scale model with regressors both for the location and scale parameters. We performed variable selection for each model and, by producing half-normal plots with simulated envelopes for the deviance residuals of the model fits, we found that the exponentiated-Weibull fitted the data better. We then included a random effect in the exponentiated-Weibull model to accommodate correlated observations. Finally, we discuss possible implications of the results found in the case study.
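The simulated-envelope idea behind a half-normal plot can be sketched as follows. This is an illustrative Python analogue (the paper uses the hnp package in R) with plain normal residuals standing in for the deviance residuals of a fitted survival model:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, n_sim = 50, 99

# Stand-in residuals from the "fitted model".
observed = np.sort(np.abs(rng.normal(size=n)))

# Simulate n_sim residual sets under the fitted model; sort each set.
sims = np.sort(np.abs(rng.normal(size=(n_sim, n))), axis=1)
lower, upper = sims.min(axis=0), sims.max(axis=0)   # the envelope

# Half-normal plotting positions (x-axis of the plot, approx. expected
# order statistics).
q = stats.halfnorm.ppf((np.arange(1, n + 1) - 0.5) / n)

# Observed points outside [lower, upper] would flag lack of fit.
outside = np.sum((observed < lower) | (observed > upper))
print(outside)
```

Plotting `observed` against `q` together with the envelope bands reproduces the usual hnp-style diagnostic.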

8.
9.
We examine the use of Confocal Laser Tomographic images for detecting glaucoma. From the clinical aspect, the optic nerve head's (ONH) area contains all the relevant information on glaucoma. The shape of the ONH is approximately a skewed cup. We summarize its shape by three biological landmarks on the neural rim and a fourth landmark at the point of maximum depth, which is approximately the point where the optic nerve enters this eye cup. These four landmarks are extracted from images of some Rhesus monkeys before and after inducing glaucoma. Previous analysis of Bookstein shape coordinates of these four landmarks revealed only marginally significant findings. From clinical experience, it is believed that the ratio of depth to diameter of the eye cup provides a useful measure of the shape change. We consider the bootstrap distribution of this normalized 'depth' (G) and give evidence that it provides an appropriate measure of the shape change. This measure G is labelled the glaucoma index. Further experiments are in progress to validate its use for glaucoma in humans.
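The bootstrap distribution of a normalized-depth index G = depth / diameter can be sketched as below. The depth and diameter values are made-up illustrative numbers, not the Rhesus monkey measurements:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical eye-cup measurements (same units).
depth = np.array([0.9, 1.1, 1.0, 1.3, 0.8, 1.2])
diameter = np.array([1.6, 1.7, 1.5, 1.9, 1.6, 1.8])
g = depth / diameter                     # glaucoma index per eye

# Bootstrap the mean of G: resample with replacement, recompute the mean.
boot = np.array([rng.choice(g, size=g.size, replace=True).mean()
                 for _ in range(2000)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(round(lo, 3), round(hi, 3))        # bootstrap 95% interval for mean G
```

Comparing such intervals before and after inducing glaucoma is the kind of evidence the index is meant to provide.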

10.
One method of expressing coarse information about the shape of an object is to describe the shape by its landmarks, which can be taken as meaningful points on the outline of an object. We consider a situation in which we want to classify shapes into known populations based on their landmarks, invariant to the location, scale and rotation of the shapes. A neural network method for transformation-invariant classification of landmark data is presented. The method is compared with the (non-transformation-invariant) complex Bingham rule; the two techniques are tested on two sets of simulated data, and on data that arise from mice vertebrae. Despite the obvious advantage of the complex Bingham rule because of information about rotation, the neural network method compares favourably.
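One standard way to make landmark configurations invariant to location, scale and rotation before classification is a Procrustes-style normalization in complex coordinates. This sketch shows only that preprocessing step, with an invented triangle; the network architecture itself is not reproduced:

```python
import numpy as np

def normalize(landmarks):
    z = landmarks[:, 0] + 1j * landmarks[:, 1]   # complex representation
    z = z - z.mean()                             # remove location
    z = z / np.linalg.norm(z)                    # remove scale
    # Remove rotation: rotate so the first landmark lies on the real axis.
    return z * np.exp(-1j * np.angle(z[0]))

shape = np.array([[0., 0.], [2., 0.], [1., 2.]])
theta = 0.9
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
moved = 3.0 * shape @ R.T + np.array([5., -2.])  # rotate, scale, translate

# Both configurations normalize to the same invariant representation.
print(np.allclose(normalize(shape), normalize(moved)))  # True
```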

11.
We introduce a new flexible distribution to deal with variables on the unit interval based on a transformation of the sinh–arcsinh distribution, which accommodates different degrees of skewness and kurtosis and becomes an interesting alternative to model this type of data. We also include this new distribution in the generalised additive models for location, scale and shape (GAMLSS) framework in order to develop and fit its regression model. For different parameter settings, some simulations are performed to investigate the behaviour of the estimators. The potential of the new regression model is illustrated by means of a real dataset related to the points rate of football teams at the end of a championship from the four most important leagues in the world: Barclays Premier League (England), Bundesliga (Germany), Serie A (Italy) and BBVA league (Spain) during three seasons (2011–2012, 2012–2013 and 2013–2014).
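The sinh–arcsinh mechanism can be sketched as follows: a standard normal variate is transformed with skewness and tail-weight parameters, then squashed to (0, 1). The squashing step here is a hypothetical logistic choice for illustration; the paper's exact transformation to the unit interval may differ, and the parameter values are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(5)
eps, delta = 0.5, 1.2            # skewness and tail-weight parameters

# Jones-Pewsey sinh-arcsinh variate from a standard normal.
z = rng.normal(size=1000)
x = np.sinh((np.arcsinh(z) + eps) / delta)

# Hypothetical logistic map onto the unit interval.
u = 1.0 / (1.0 + np.exp(-x))
print((u > 0).all() and (u < 1).all())  # True
```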

12.
Previous approaches to model-based object recognition from images have required information about the position and orientation (pose) of the object relative to the camera. Object recognition has been carried out in parallel with pose estimation, leading to algorithms which are error prone and computationally expensive. Object recognition can be decoupled from pose estimation by using geometrical properties of the object that are unchanged, or invariant, under projection to the image. Thus, the computational cost of object recognition is drastically reduced. A number of invariants of current interest in computer vision are described. The simplest and most fundamental projective invariant, the cross ratio of four collinear points, is then investigated in detail. A simple system is defined for recognizing objects on the basis of the cross ratio alone. The system has a database of models, each model being a single cross ratio value. The performance of the system is characterized by the probability R of rejection, the probability P of misclassification and the probability F of a false alarm. Formulae for R, P and F are stated. The probability density function p(t) for the cross ratio t of four collinear points with independent, identical Gaussian distributions is stated. Experiments have been carried out to see how well the formulae for R, F and p(t) apply in practice. The results are extremely encouraging. The cumulative distribution function for p(t) is closely matched by the cumulative distribution function estimated from natural images. The experimental estimates of R agree well with the theoretical predictions. However, the experimental estimates of F are below the theoretical predictions. Two possible reasons for the discrepancy are suggested: (1) it is due to the finite resolution of the corner detector; and (2) it is due to deviations from the Gaussian distributions assumed in the theoretical calculations.
The experimental investigation of R has led to a new, simple and theoretically well-founded way of estimating the accuracy of corner detectors.
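The projective invariance of the cross ratio is easy to check numerically. Taking the standard form t = (AC·BD)/(AD·BC) with signed coordinates along the line, the value survives any fractional linear (projective) map of the line; the points and the map below are arbitrary:

```python
import numpy as np

def cross_ratio(a, b, c, d):
    # Cross ratio of four collinear points given by 1-D coordinates.
    return ((c - a) * (d - b)) / ((d - a) * (c - b))

pts = np.array([0.0, 1.0, 3.0, 7.0])
t = cross_ratio(*pts)

# Apply a projective map x -> (2x + 1) / (x + 4) to all four points.
mapped = (2 * pts + 1) / (pts + 4)
t2 = cross_ratio(*mapped)

print(np.isclose(t, t2))  # True
```

A recognition database as described above would store one such value t per model and match new measurements against it.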

13.
Asymmetry is a feature of shape which is of particular interest in a variety of applications. With landmark data, the essential information on asymmetry is contained in the degree to which there is a mismatch between a landmark configuration and its relabelled and matched reflection. This idea is explored in the context of a study of facial shape in infants, where particular interest lies in identifying changes over time and in assessing residual deformity in children who have had corrective surgery for a cleft lip or cleft lip and palate. Interest lies not in whether the mean shape is asymmetric but in comparing the degrees of asymmetry in different populations. A decomposition of the asymmetry score into components that are attributable to particular features of the face is proposed. A further decomposition allows different sources of asymmetry due to position, orientation or intrinsic asymmetry to be identified for each feature. The methods are also extended to data representing anatomical curves across the face.
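The core mismatch idea can be sketched with a toy asymmetry score: reflect the configuration, relabel paired landmarks (left ↔ right), and measure the distance to the original. The landmarks and pairing below are made up, and the alignment and decomposition steps of the actual method are omitted:

```python
import numpy as np

# Hypothetical facial landmarks: left/right eye corners and a midline point.
pts = np.array([[-1.0, 0.0], [1.1, 0.1],
                [0.0, -1.0]])
pairing = [1, 0, 2]                      # relabelling after reflection

reflected = pts * np.array([-1.0, 1.0])  # reflect in the y-axis (midline)
relabelled = reflected[pairing]

# A perfectly symmetric face would give score 0.
score = np.sqrt(((pts - relabelled) ** 2).sum())
print(round(float(score), 3))  # 0.2
```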

14.
Recent work on a new two-parameter generalized skew-normal model has received a lot of attention. This paper presents a new generalized Balakrishnan-type skew-normal distribution by introducing two shape parameters. We also provide some useful results for this new generalization. It is also shown that our proposed skew-normal model subsumes the original Balakrishnan skew-normal model (2002), as well as other well-known skew-normal models, as special cases. The resulting flexible model can be expected to fit a wider variety of data structures than either of the models involving a single skewing mechanism. For illustrative purposes, a well-known data set on IQ scores is used to exhibit the efficacy of the proposed model.
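The original Balakrishnan (2002) skew-normal that this model subsumes has density proportional to φ(x)·Φ(λx)ⁿ. A numerical sketch with arbitrary parameter choices (n = 2, λ = 1.5; the paper's two-shape-parameter generalization is not reproduced):

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

lam, n = 1.5, 2   # illustrative shape choices

def kernel(x):
    # Unnormalized Balakrishnan skew-normal density: phi(x) * Phi(lam*x)^n.
    return stats.norm.pdf(x) * stats.norm.cdf(lam * x) ** n

# Normalizing constant, computed numerically.
c, _ = quad(kernel, -np.inf, np.inf)
density = lambda x: kernel(x) / c

# Sanity check: the normalized density integrates to 1.
area, _ = quad(density, -np.inf, np.inf)
print(round(area, 6))  # 1.0
```

With n = 1 the kernel reduces to Azzalini's classical skew-normal, which is the sense in which such models nest each other.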

15.
16.
Using a Yamaguchi‐type generalized gamma failure‐time mixture model, we analyse the data from a study of autologous and allogeneic bone marrow transplantation in the treatment of high‐risk refractory acute lymphoblastic leukaemia, focusing on the time to recurrence of disease. We develop maximum likelihood techniques for the joint estimation of the surviving fractions and the survivor functions. This includes an approximation to the derivative of the survivor function with respect to the shape parameter. We obtain the maximum likelihood estimates of the model parameters. We also compute the variance‐covariance matrix of the parameter estimators. The extended family of generalized gamma failure‐time mixture models is flexible enough to include many commonly used failure‐time distributions as special cases. Yet these models are not used in practice because of computational difficulties. We claim that we have overcome this problem. The proposed approximation to the derivative of the survivor function with respect to the shape parameter can be used in any statistical package. We also address the issue of lack of identifiability. We point out that there can be a substantial advantage to using the gamma failure‐time mixture models over nonparametric methods. Copyright © 2003 John Wiley & Sons, Ltd.  相似文献
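The quantity being approximated, the derivative of a generalized-gamma survivor function with respect to a shape parameter, can be checked with a simple finite difference. The parameter values below are arbitrary, and this central difference stands in for (not reproduces) the paper's analytical approximation:

```python
import numpy as np
from scipy import stats

def surv(t, a, c=1.5, scale=2.0):
    # Generalized gamma survivor function S(t; a, c, scale).
    return stats.gengamma.sf(t, a, c, scale=scale)

t, a, h = 1.0, 0.8, 1e-5

# Central finite-difference derivative of S with respect to shape a.
dS_da = (surv(t, a + h) - surv(t, a - h)) / (2 * h)
print(round(float(dS_da), 4))
```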

17.
We propose models that allow us to capture the evolution of objects over time and more importantly, we provide forecasts to describe an object at future unobserved states utilizing information from the current state along with covariate information. We view objects as random sets and proceed to model them in a hierarchical Bayesian framework and estimate the model parameters using a Markov chain Monte Carlo scheme. We illustrate the methodology with an application to nowcasting of severe weather precipitation fields as obtained from weather radar images, where the severe storm cells are treated as random sets and the wind velocity is used to inform the distributions of the model parameters.
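A toy version of the nowcasting step: advect a binary "storm cell" field one step by a wind-velocity shift. The field and velocity are made up, and this deterministic shift is only a stand-in for the hierarchical Bayesian random-set model:

```python
import numpy as np

# A hypothetical radar grid with one storm cell marked as 1s.
field = np.zeros((8, 8), dtype=int)
field[2:4, 2:5] = 1

# Wind displacement in grid cells per time step (covariate information).
vx, vy = 1, 2
forecast = np.roll(field, shift=(vy, vx), axis=(0, 1))

# The cell moved with the wind but kept its area.
print(field.sum() == forecast.sum())  # True
```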

18.
Longitudinal data frequently arise in various fields of applied science where individuals are measured according to some ordered variable, e.g. time. A common approach used to model such data is based on mixed models for repeated measures. This model provides an eminently flexible approach to the modelling of a wide range of mean and covariance structures. However, such models are forced into a rigidly defined class of mathematical formulas which may not be well supported by the data within the whole sequence of observations. A possible non-parametric alternative is a cubic smoothing spline, which is highly flexible and has useful smoothing properties. It can be shown that under the normality assumption, the solution of the penalized log-likelihood equation is the cubic smoothing spline, and this solution can be further expressed as the solution of a linear mixed model. It is shown here how cubic smoothing splines can be easily used in the analysis of complete and balanced data. Analysis can be greatly simplified by using the unweighted estimator studied in the paper. It is shown that if the covariance structure of the random errors belongs to a certain class of matrices, the unweighted estimator is the solution to the penalized log-likelihood function. This result is new in the smoothing spline context and is not confined to growth curve settings. The connection to mixed models is used in developing a rough test of group profiles. Numerical examples are presented to illustrate the techniques proposed.
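A cubic smoothing spline fit to noisy longitudinal-style data can be sketched with SciPy. The data are simulated, the smoothing level is an arbitrary choice, and the paper's mixed-model representation and unweighted estimator are not reproduced:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(3)

# Simulated observations over an ordered "time" variable.
t = np.linspace(0, 10, 40)
y = np.sin(t) + rng.normal(scale=0.2, size=t.size)

# Cubic (k=3) smoothing spline; s trades roughness against fidelity,
# playing the role of the penalty in the penalized log-likelihood.
spline = UnivariateSpline(t, y, k=3, s=t.size * 0.04)
fitted = spline(t)
print(fitted.shape)  # (40,)
```

Setting `s` larger forces a smoother curve; `s=0` interpolates the data exactly, mirroring the penalty-parameter role in the penalized log-likelihood formulation.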

19.
We propose a flexible method to approximate the subjective cumulative distribution function of an economic agent about the future realization of a continuous random variable. The method can closely approximate a wide variety of distributions while maintaining weak assumptions on the shape of distribution functions. We show how moments and quantiles of general functions of the random variable can be computed analytically and/or numerically. We illustrate the method by revisiting the determinants of income expectations in the United States. A Monte Carlo analysis suggests that a quantile-based flexible approach can be used to successfully deal with censoring and possible rounding levels present in the data. Finally, our analysis suggests that the performance of our flexible approach matches that of a correctly specified parametric approach and is clearly better than that of a misspecified parametric approach.
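A minimal stand-in for the quantile-based idea: build a subjective CDF from a few elicited (value, probability) points by monotone linear interpolation, then invert it for quantiles. The elicited points are invented, and the paper's flexible approximation is richer than this:

```python
import numpy as np

# Hypothetical elicited points: P(income <= value) = prob.
values = np.array([10.0, 20.0, 35.0, 60.0])
probs = np.array([0.10, 0.40, 0.80, 0.95])

def cdf(x):
    # Piecewise-linear CDF; 0 below the first point, 1 above the last.
    return np.interp(x, values, probs, left=0.0, right=1.0)

# Quantiles: invert by interpolating the other way round.
median = np.interp(0.5, probs, values)
print(round(float(median), 2))  # 23.75
```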

20.
Preference decisions will usually depend on the characteristics of both the judges and the objects being judged. In the analysis of paired comparison data concerning European universities and students' characteristics, it is demonstrated how to incorporate subject-specific information into Bradley–Terry-type models. Using this information it is shown that preferences for universities and therefore university rankings are dramatically different for different groups of students. A log-linear representation of a generalized Bradley–Terry model is specified which allows simultaneous modelling of subject- and object-specific covariates and interactions between them. A further advantage of this approach is that standard software for fitting log-linear models, such as GLIM, can be used.
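The basic Bradley–Terry fit underneath such models can be sketched with the standard MM/iterative algorithm on made-up pairwise-win counts for three objects. The log-linear GLIM formulation with subject covariates described in the paper is much richer than this:

```python
import numpy as np

# wins[i, j] = number of times object i was preferred over object j
# (invented counts for three objects).
wins = np.array([[0, 7, 9],
                 [3, 0, 6],
                 [1, 4, 0]])

n = wins + wins.T                  # comparisons per pair
p = np.ones(3)                     # "worth" parameters, pi_i

# MM updates: p_i <- W_i / sum_j n_ij / (p_i + p_j).
for _ in range(200):
    for i in range(3):
        num = wins[i].sum()
        den = sum(n[i, j] / (p[i] + p[j]) for j in range(3) if j != i)
        p[i] = num / den
    p /= p.sum()                   # identifiability constraint

order = np.argsort(-p)
print(order)  # [0 1 2]: ranking from most to least preferred
```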


Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)