Similar Documents

20 similar documents retrieved.
1.
Bayesian Multivariate Spatial Interpolation with Data Missing by Design
In a network of s_g sites, responses such as levels of airborne pollutant concentrations may be monitored over time. The sites need not all measure the same set of response items, and unmeasured items are treated as data missing by design. We propose a hierarchical Bayesian approach to interpolate the levels of, say, k responses at s_u other locations, called ungauged sites, as well as the unmeasured levels of the k responses at the gauged sites. Our method involves two steps. First, assuming all hyperparameters are known, a predictive distribution is derived; in turn, an interpolator, its variance, and a simultaneous interpolation region are obtained. In step two, we propose an empirical Bayes approach to estimate the hyperparameters through an EM algorithm. We base our theory on a linear Gaussian model and the relationship between the multivariate normal and matrix T-distributions. Our theory allows us to pool data from several existing networks that measure different subsets of response items for interpolation.

2.
Bayesian hierarchical models typically involve specifying prior distributions for one or more variance components. This is rather removed from the observed data, so specification based on expert knowledge can be difficult. While there are suggestions for “default” priors in the literature, often a conditionally conjugate inverse‐gamma specification is used, despite documented drawbacks of this choice. The authors suggest “conservative” prior distributions for variance components, which deliberately give more weight to smaller values. These are appropriate for investigators who are skeptical about the presence of variability in the second‐stage parameters (random effects) and want to particularly guard against inferring more structure than is really present. The suggested priors readily adapt to various hierarchical modelling settings, such as fitting smooth curves, modelling spatial variation and combining data from multiple sites.
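As a rough numerical illustration of the issue this abstract raises, the sketch below compares how much prior probability two common choices place on small variance values. The hyperparameters (IG(0.001, 0.001) on the variance versus a unit half-normal on the standard deviation) are illustrative assumptions, not the priors developed in the paper.

```python
import numpy as np
from scipy import stats

# Illustrative comparison (hyperparameters are assumptions, not the
# paper's): how much prior probability lands on small variances
# (sigma^2 < 0.1) under two common specifications.
ig = stats.invgamma(a=0.001, scale=0.001)   # "default" IG(0.001, 0.001) on sigma^2
hn = stats.halfnorm(scale=1.0)              # half-normal prior on sigma

p_ig = ig.cdf(0.1)                          # P(sigma^2 < 0.1) under the IG prior
p_hn = hn.cdf(np.sqrt(0.1))                 # P(sigma < sqrt(0.1)), the same event
print(f"inverse-gamma: P(var < 0.1) = {p_ig:.3f}")
print(f"half-normal:   P(var < 0.1) = {p_hn:.3f}")
```

The half-normal choice places substantially more mass near zero, which is the direction a "conservative" prior in the authors' sense should lean.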

3.
This article handles the prediction of hourly concentrations of non-methane hydrocarbon (NMHC) pollutants at 15 unmonitored sites in Kuwait using the data recorded from 6 monitored stations at successive time points. The trend model depends on hourly meteorological variables and seasonal effects. The stochastic component of the trend model, which has spatiotemporal features, is modeled as an autoregressive temporal process. A spatial predictive distribution for residuals of the AR model is developed for the unmonitored sites. By transforming the predicted residuals back to the original data scales, we impute Kuwait’s hourly NMHC field.
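A minimal stand-in for the temporal part of such a model is a least-squares AR(1) fit to the residual series. The simulated series and the true coefficient below are invented for illustration; the article's full trend-plus-residual specification is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(7)
phi_true = 0.7
e = np.zeros(1000)
for t in range(1, 1000):
    # simulated residual series with AR(1) dependence (illustrative)
    e[t] = phi_true * e[t - 1] + rng.standard_normal()

# least-squares AR(1) estimate: regress e_t on e_{t-1}
phi_hat = (e[:-1] @ e[1:]) / (e[:-1] @ e[:-1])
print(f"estimated AR(1) coefficient: {phi_hat:.3f}")
```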

4.
Recently, there has been great interest in estimating the decline in cognitive ability in patients with Alzheimer's disease. Measuring decline is not straightforward, since one must consider the choice of scale to measure cognitive ability, possible floor and ceiling effects, between-patient variability, and the unobserved age of onset. The authors demonstrate how to account for the above features by modeling decline in scores on the Mini-Mental State Exam in two different data sets. To this end, they use hierarchical Bayesian models with change points, for which posterior distributions are calculated using the Gibbs sampler. They make comparisons between several such models using both prior and posterior Bayes factors, and compare the results from the models suggested by these two model selection criteria.
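A toy version of the change-point idea can be computed by direct enumeration rather than Gibbs sampling: with a discrete uniform prior on the change time and a known residual standard deviation, the posterior over change points is available in closed form. The MMSE-like scores, the mean shift, and the known standard deviation below are all illustrative assumptions, not the paper's hierarchical models.

```python
import numpy as np

# Toy single change point in a Gaussian mean (illustrative values):
# 20 observations near 28 followed by 15 near 22, sd = 1.5 assumed known.
rng = np.random.default_rng(8)
y = np.concatenate([rng.normal(28, 1.5, 20), rng.normal(22, 1.5, 15)])
n = y.size

logpost = np.full(n - 1, -np.inf)
for k in range(1, n):                      # change after observation k
    a, b = y[:k], y[k:]
    # profile log-likelihood with segment means plugged in
    rss = np.sum((a - a.mean()) ** 2) + np.sum((b - b.mean()) ** 2)
    logpost[k - 1] = -rss / (2 * 1.5 ** 2)

post = np.exp(logpost - logpost.max())
post /= post.sum()                         # posterior over change times
print("MAP change point:", np.argmax(post) + 1)
```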

5.
This article is concerned with modifications of both maximum likelihood and moment estimators for parameters of the three-parameter gamma distribution. Modifications employed here are essentially the same as those previously considered by the authors (1980, 1981) in connection with the lognormal distribution. Sampling behavior of the estimates is indicated by a Monte Carlo simulation. For certain combinations of parameter values, these new estimators appear better than both maximum likelihood and moment estimators with respect to bias, variance and/or ease of calculation.
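For context, the classical (unmodified) moment estimators for the three-parameter gamma follow from matching the sample mean, variance, and skewness. The sketch below applies them to simulated data; the true parameter values are invented, and the article's modified estimators are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated three-parameter gamma sample (shape=3, scale=2, threshold=10);
# these true values are illustrative, not taken from the article.
x = rng.gamma(shape=3.0, scale=2.0, size=5000) + 10.0

m, s = x.mean(), x.std(ddof=1)
g1 = np.mean(((x - m) / s) ** 3)           # sample skewness

# Classical moment estimators; the article's *modified* versions adjust
# these, and only the plain versions are sketched here.
shape_hat = 4.0 / g1 ** 2                  # from skewness = 2 / sqrt(shape)
scale_hat = s * g1 / 2.0                   # from var = shape * scale^2
loc_hat = m - shape_hat * scale_hat        # threshold = mean - shape * scale
print(shape_hat, scale_hat, loc_hat)
```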

6.
The paper describes Bayesian analysis for agricultural field experiments, a topic that has received very little previous attention, despite a vast frequentist literature. Adoption of the Bayesian paradigm simplifies the interpretation of the results, especially in ranking and selection. Also, complex formulations can be analysed with comparative ease, by using Markov chain Monte Carlo methods. A key ingredient in the approach is the need for spatial representations of the unobserved fertility patterns. This is discussed in detail. Problems caused by outliers and by jumps in fertility are tackled via hierarchical t formulations that may find use in other contexts. The paper includes three analyses of variety trials for yield and one example involving binary data; none is entirely straightforward. Some comparisons with frequentist analyses are made.

7.
Motivated by the problem of predicting chemical deposition in eastern USA at weekly, seasonal and annual scales, the paper develops a framework for joint modelling of point- and grid-referenced spatiotemporal data in this context. The hierarchical model proposed can provide accurate spatial interpolation and temporal aggregation by combining information from observed point-referenced monitoring data and gridded output from a numerical simulation model known as the 'community multi-scale air quality model'. The technique avoids the change-of-support problem which arises in other hierarchical models for data fusion settings to combine point- and grid-referenced data. The hierarchical space–time model is fitted to weekly wet sulphate and nitrate deposition data over eastern USA. The model is validated with set-aside data from a number of monitoring sites. Predictive Bayesian methods are developed and illustrated for inference on aggregated summaries such as quarterly and annual sulphate and nitrate deposition maps. The highest wet sulphate deposition occurs near major emissions sources such as fossil-fuelled power plants whereas lower values occur near background monitoring sites.

8.
A two-stage hierarchical model for analysis of discrete data with extra-Poisson variation is examined. The model consists of a Poisson distribution with a mixing lognormal distribution for the mean. A method of approximate maximum likelihood estimation of the parameters is proposed. The method uses the EM algorithm and approximations to facilitate its implementation are derived. Approximate standard errors of the estimates are provided and a numerical example is used to illustrate the method.
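The Poisson-lognormal marginal likelihood has no closed form, which is why the paper resorts to EM with approximations. As an illustrative alternative (not the paper's method), the marginal likelihood can be evaluated by Gauss-Hermite quadrature and maximized directly; the simulated data and true parameters below are assumptions.

```python
import numpy as np
from scipy import optimize, special

t_nodes, w = np.polynomial.hermite.hermgauss(40)   # physicists' Gauss-Hermite

def neg_loglik(params, y):
    """Negative marginal Poisson-lognormal log-likelihood by quadrature."""
    mu, log_sigma = params
    sigma = np.exp(log_sigma)
    lam = np.exp(mu + np.sqrt(2.0) * sigma * t_nodes)      # node-wise means
    logpmf = y[:, None] * np.log(lam) - lam - special.gammaln(y + 1)[:, None]
    pmf = (w / np.sqrt(np.pi)) * np.exp(logpmf)            # weighted Poisson pmf
    return -np.sum(np.log(pmf.sum(axis=1)))

rng = np.random.default_rng(1)
theta = rng.lognormal(mean=1.0, sigma=0.5, size=2000)      # true mu=1, sigma=0.5
y = rng.poisson(theta)

res = optimize.minimize(neg_loglik, x0=np.array([0.0, 0.0]),
                        args=(y,), method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(mu_hat, sigma_hat)
```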

9.
A new parametric (three-parameter) survival distribution, the lognormal–power function distribution, with flexible behaviour is introduced. Its hazard rate function can be either unimodal, monotonically decreasing, or can exhibit a bathtub shape. Special cases include the lognormal distribution and the power function distribution, with finite support. Regions of parameter space where the various forms of the hazard rate function prevail are established analytically. The distribution lends itself readily to accelerated life regression modelling. Applications to five data sets taken from the literature are given. It is also shown how the distribution can behave like a Weibull distribution (with negative aging) for certain parameter values.

10.
The lognormal distribution is used extensively to describe the distribution of positive random variables, especially in occupational health and other biological applications. One particular use of such data is statistical inference with regard to the mean. Zou et al. (2009) proposed procedures based on the so-called method of variance estimates recovery (MOVER), while an alternative simulation-based approach is the generalized confidence interval, discussed by Krishnamoorthy and Mathew (2003). In this paper we compare the performance of the MOVER-based confidence interval estimates and the generalized confidence interval procedure with the coverage of credibility intervals obtained from Bayesian methodology under a variety of prior distributions. An extensive simulation study is conducted to evaluate the coverage accuracy and interval width of the proposed methods. For the Bayesian approach, both equal-tail and highest posterior density (HPD) credibility intervals are presented. Various priors (the independence Jeffreys prior; the Jeffreys-rule prior, namely the square root of the determinant of the Fisher information matrix; reference priors; and probability-matching priors) are evaluated and compared to determine which gives the best coverage with the most efficient interval width. The simulation studies show that the constructed Bayesian credibility intervals have satisfactory coverage probabilities and in some cases outperform the MOVER and generalized confidence interval results. The Bayesian inference procedures (hypothesis tests and confidence intervals) are also extended to the difference between two lognormal means, to the case of zero-valued observations, and to confidence intervals for the lognormal variance.
In the last section of this paper the bivariate lognormal distribution is discussed, and Bayesian confidence intervals are obtained for the difference between two correlated lognormal means as well as for the ratio of lognormal variances, using nine different priors.
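The generalized confidence interval of Krishnamoorthy and Mathew (2003) is straightforward to sketch: simulate the generalized pivotal quantity for the lognormal mean exp(mu + sigma^2/2) and take its percentiles. The simulated data and sample size below are illustrative assumptions.

```python
import numpy as np

# Generalized pivotal quantity for the lognormal mean exp(mu + sigma^2/2),
# in the spirit of Krishnamoorthy and Mathew (2003); data are simulated.
rng = np.random.default_rng(2)
x = rng.lognormal(mean=1.0, sigma=0.6, size=30)
logx = np.log(x)
n = logx.size
ybar, s2 = logx.mean(), logx.var(ddof=1)

B = 50_000
Z = rng.standard_normal(B)
U2 = rng.chisquare(n - 1, B)                # distribution of (n-1)S^2/sigma^2
G_sigma2 = (n - 1) * s2 / U2                # pivot for sigma^2
G_mu = ybar - Z * np.sqrt(G_sigma2 / n)     # pivot for mu
G = np.exp(G_mu + G_sigma2 / 2.0)           # pivot for the lognormal mean

lo, hi = np.percentile(G, [2.5, 97.5])
print(f"95% generalized CI for the mean: ({lo:.3f}, {hi:.3f})")
```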

11.
In this article, the proportional hazard model with Weibull frailty, which is outside the range of the exponential family, is used for analysing the right-censored longitudinal survival data. Complex multidimensional integrals are avoided by using hierarchical likelihood to estimate the regression parameters and to predict the realizations of random effects. The adjusted profile hierarchical likelihood is adopted to estimate the parameters in frailty distribution, during which the first- and second-order methods are used. The simulation studies indicate that the regression-parameter estimates in the Weibull frailty model are accurate, which is similar to the gamma frailty and lognormal frailty models. Two published data sets are used for illustration.

12.
This paper considers the problem of mapping spatial variation of yield in a field using data from a yield monitoring system on a combine harvester. The unobserved yield is assumed to be a Gaussian random field and the yield monitoring system data is modelled as a convolution of the yield and an impulse response function. This results in an unusual spatial covariance structure (depending on the driving pattern of the combine harvester) for the yield monitoring system data. Parameters of the impulse response function and the spatial covariance function of the yield are estimated using maximum likelihood methods. The fitted model is assessed using certain empirical directional covariograms and the yield is finally predicted using the inferred statistical model.
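A one-dimensional toy version of this measurement model is easy to simulate: generate a smooth "true yield" along the harvester's path and convolve it with a causal impulse response. The AR(1) surrogate field and the exponential response are assumptions for illustration, not the paper's fitted model.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 500
# Smooth "true yield" along the driving path (AR(1) surrogate field).
yield_true = np.zeros(n)
for i in range(1, n):
    yield_true[i] = 0.95 * yield_true[i - 1] + rng.standard_normal()

h = np.exp(-np.arange(10) / 3.0)          # illustrative impulse response
h /= h.sum()                              # normalize to preserve the level
# Monitor readings: smeared yield plus measurement noise.
observed = np.convolve(yield_true, h, mode="same") + 0.1 * rng.standard_normal(n)
print(observed[:5])
```

The smearing induced by `h` is what produces the unusual, driving-pattern-dependent covariance structure the abstract describes.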

13.
On Block Updating in Markov Random Field Models for Disease Mapping
Gaussian Markov random field (GMRF) models are commonly used to model spatial correlation in disease mapping applications. For Bayesian inference by MCMC, so far mainly single-site updating algorithms have been considered. However, convergence and mixing properties of such algorithms can be extremely poor due to strong dependencies of parameters in the posterior distribution. In this paper, we propose various block sampling algorithms in order to improve the MCMC performance. The methodology is rather general, allows for non-standard full conditionals, and can be applied in a modular fashion in a large number of different scenarios. For illustration we consider three different applications: two formulations for spatial modelling of a single disease (with and without additional unstructured parameters respectively), and one formulation for the joint analysis of two diseases. The results indicate that the largest benefits are obtained if parameters and the corresponding hyperparameter are updated jointly in one large block. Implementation of such block algorithms is relatively easy using methods for fast sampling of Gaussian Markov random fields (Rue, 2001). By comparison, Monte Carlo estimates based on single-site updating can be rather misleading, even for very long runs. Our results may have wider relevance for efficient MCMC simulation in hierarchical models with Markov random field components.
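The core primitive behind such block updates is drawing an entire GMRF in one shot via the Cholesky factor of its precision matrix, as in Rue (2001). The sketch below uses a dense first-order random-walk precision with a small diagonal jitter (an assumption to make Q proper); real implementations exploit sparsity.

```python
import numpy as np

# One-shot ("block") draw from x ~ N(Q^{-1} b, Q^{-1}) via the Cholesky
# factor of the precision matrix Q. The RW1-plus-jitter Q is illustrative.
n = 200
Q = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)  # RW1 structure
Q += 0.01 * np.eye(n)                                   # jitter: makes Q proper
b = np.zeros(n)

rng = np.random.default_rng(3)
L = np.linalg.cholesky(Q)                               # Q = L L^T
mu = np.linalg.solve(L.T, np.linalg.solve(L, b))        # mean = Q^{-1} b
z = rng.standard_normal(n)
x = mu + np.linalg.solve(L.T, z)                        # x - mu ~ N(0, Q^{-1})
print(x[:5])
```

Solving `L.T @ v = z` gives a vector with covariance `(L L^T)^{-1} = Q^{-1}`, which is exactly the joint draw that single-site updating approximates only slowly.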

14.
In many engineering problems it is necessary to draw statistical inferences on the mean of a lognormal distribution based on a complete sample of observations. Statistical demonstration of mean time to repair (MTTR) is one example. Although optimum confidence intervals and hypothesis tests for the lognormal mean have been developed, they are difficult to use, requiring extensive tables and/or a computer. In this paper, simplified conservative methods for calculating confidence intervals or hypothesis tests for the lognormal mean are presented. In this paper, “conservative” refers to confidence intervals (hypothesis tests) whose infimum coverage probability (supremum probability of rejecting the null hypothesis taken over parameter values under the null hypothesis) equals the nominal level. The term “conservative” has obvious implications to confidence intervals (they are “wider” in some sense than their optimum or exact counterparts). Applying the term “conservative” to hypothesis tests should not be confusing if it is remembered that this implies that their equivalent confidence intervals are conservative. No implication of optimality is intended for these conservative procedures. It is emphasized that these are direct statistical inference methods for the lognormal mean, as opposed to the already well-known methods for the parameters of the underlying normal distribution. The method currently employed in MIL-STD-471A for statistical demonstration of MTTR is analyzed and compared to the new method in terms of asymptotic relative efficiency. The new methods are also compared to the optimum methods derived by Land (1971, 1973).
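For comparison, one widely known simple (approximate, not conservative in the paper's sense) interval for the lognormal mean is Cox's method, based on the normal approximation to ybar + s^2/2. The paper's conservative procedures are not reproduced here, and the MTTR-style data below are simulated.

```python
import numpy as np

# Cox's approximate 95% interval for the lognormal mean exp(mu + sigma^2/2).
# This is a standard textbook method, *not* the conservative procedure
# developed in the paper; the data and parameters are illustrative.
rng = np.random.default_rng(4)
x = rng.lognormal(mean=2.0, sigma=0.5, size=40)
y = np.log(x)
n, ybar, s2 = y.size, y.mean(), y.var(ddof=1)

z = 1.96                                    # approx. 97.5% normal quantile
half = z * np.sqrt(s2 / n + s2 ** 2 / (2 * (n - 1)))
lo = np.exp(ybar + s2 / 2 - half)
hi = np.exp(ybar + s2 / 2 + half)
print(f"approximate 95% CI for an MTTR-type mean: ({lo:.2f}, {hi:.2f})")
```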

15.
16.
The Black-Scholes formula has been widely used to price financial instruments. The derivation of this formula is based on the assumption of lognormally distributed returns, which is often in poor agreement with actual data. An option pricing formula based on the generalized beta distribution of the second kind (GB2) is presented. This formula includes the Black-Scholes formula as a special case and accommodates a wide variety of non-lognormally distributed returns. The sensitivity of option values to departures from the skewness and kurtosis associated with the lognormal distribution is investigated.
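The lognormal special case that the GB2 formula nests is the standard Black-Scholes call price, sketched below; the GB2 pricing formula itself is not reproduced, and the parameter values are illustrative.

```python
import math

def bs_call(S, K, r, sigma, T):
    """Black-Scholes price of a European call (the lognormal special case)."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    Phi = lambda u: 0.5 * (1.0 + math.erf(u / math.sqrt(2.0)))  # standard normal CDF
    return S * Phi(d1) - K * math.exp(-r * T) * Phi(d2)

# Illustrative at-the-money example: spot 100, strike 100, 5% rate,
# 20% volatility, one year to expiry.
price = bs_call(S=100.0, K=100.0, r=0.05, sigma=0.2, T=1.0)
print(f"{price:.4f}")
```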

17.
The problem of goodness of fit of a lognormal distribution is usually reduced to testing goodness of fit of the logarithmic data to a normal distribution. In this paper, new goodness-of-fit tests for a lognormal distribution are proposed. The new procedures make use of a characterization property of the lognormal distribution which states that the Kullback–Leibler measure of divergence between a probability density function (p.d.f.) and its r-size weighted p.d.f. is symmetric only for the lognormal distribution [Tzavelas G, Economou P. Characterization properties of the log-normal distribution obtained with the help of divergence measures. Stat Probab Lett. 2012;82(10):1837–1840]. A simulation study examines the performance of the new procedures in comparison with existing goodness-of-fit tests for the lognormal distribution. Finally, two well-known data sets are used to illustrate the methods developed.
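The classical reduction the abstract opens with is simple to sketch: test the logged data for normality, here with the Shapiro-Wilk test. The paper's new divergence-based tests are not reproduced, and both samples below are simulated for illustration.

```python
import numpy as np
from scipy import stats

# Lognormality via normality of the logs (the classical reduction).
rng = np.random.default_rng(5)
samples = {
    "lognormal": rng.lognormal(mean=0.0, sigma=1.0, size=200),  # null holds
    "gamma": rng.gamma(shape=2.0, scale=1.0, size=200),         # null fails
}
ps = {}
for name, x in samples.items():
    stat, ps[name] = stats.shapiro(np.log(x))                   # test log-data
    print(f"{name}: Shapiro-Wilk p = {ps[name]:.4f}")
```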

18.
Generalized linear mixed models are widely used for describing overdispersed and correlated data. Such data arise frequently in studies involving clustered and hierarchical designs. A more flexible class of models has been developed here through the Dirichlet process mixture. An additional advantage of using such mixture models is that the observations can be grouped together on the basis of the overdispersion present in the data. This paper proposes a partial empirical Bayes method for estimating all the model parameters by adopting a version of the EM algorithm. An augmented model that helps to implement an efficient Gibbs sampling scheme, under the non‐conjugate Dirichlet process generalized linear model, generates observations from the conditional predictive distribution of unobserved random effects and provides an estimate of the average number of mixing components in the Dirichlet process mixture. A simulation study has been carried out to demonstrate the consistency of the proposed method. The approach is also applied to a study on outdoor bacteria concentration in the air and to data from 14 retrospective lung‐cancer studies.

19.
We develop and apply an approach to the spatial interpolation of a vector-valued random response field. The Bayesian approach we adopt enables uncertainty about the underlying models to be represented in expressing the accuracy of the resulting interpolants. The methodology is particularly relevant in environmetrics, where vector-valued responses are observed only at designated sites at successive time points. The theory allows space-time modelling at the second level of the hierarchical prior model, so that uncertainty about the model parameters is fully expressed at the first level. In this way, we avoid unduly optimistic estimates of inferential accuracy. Moreover, the prior model can be upgraded with any available new data, while past data can be used in a systematic way to fit model parameters. The theory is based on the multivariate normal and related joint distributions. Our hierarchical prior models lead to posterior distributions which are robust with respect to the choice of the prior (hyperparameters). We illustrate our theory with an example involving monitoring stations in southern Ontario, where monthly average levels of ozone, sulphate, and nitrate are available and between-station response triplets are interpolated. In this example we use a recently developed method for interpolating spatial correlation fields.

20.
The characteristic function of the lognormal distribution is of interest in a number of scientific fields yet an analytic solution remains elusive, making reliable and efficient approximations necessary. In this article, we build on the results of N. C. Beaulieu and A. Saberali in ‘New approximations to the lognormal characteristic function’, by introducing a Taylor- and Bessel function-based partial expansion of the integrand and a Chebyshev quadrature approach. Through computer simulations we show that the Taylor expansion remains accurate and efficient for all commonly computed values, and specify the range of values for which the other two approaches show a significantly stronger performance.
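A plain Gauss-Hermite quadrature gives a serviceable baseline approximation of the lognormal characteristic function E[exp(itX)] with X = exp(mu + sigma*Z); it is not the Taylor/Bessel or Chebyshev approach of the article, and the parameter values are illustrative.

```python
import numpy as np

def lognormal_cf(t, mu=0.0, sigma=0.25, order=100):
    """Gauss-Hermite approximation of the lognormal characteristic function."""
    z, w = np.polynomial.hermite.hermgauss(order)       # nodes, weights
    x = np.exp(mu + np.sqrt(2.0) * sigma * z)           # lognormal at the nodes
    return np.sum(w * np.exp(1j * t * x)) / np.sqrt(np.pi)

print(lognormal_cf(0.0))     # any characteristic function equals 1 at t = 0
print(lognormal_cf(1.0))
```

For small sigma this plain quadrature is already very accurate; the specialized expansions in the article matter most for large sigma*t, where the integrand oscillates rapidly.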


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号