Similar Articles
20 similar articles found
1.
Bayesian Statistical Inference and Its Major Advances
As an important component of modern statistical analysis, Bayesian statistical inference has played a landmark role in the development of statistical theory, and a thorough summary of the major advances in its research is of real practical significance. Drawing on key academic sources from China and abroad, this paper surveys Bayesian statistical inference from three perspectives: its underlying ideas, a comparison with the reasoning of classical statistics, and the major advances in its research, aiming to provide an overview of Bayesian statistical inference and the current state of its study.

2.
Bayesian statistical inference relies on the posterior distribution. Depending on the model, the posterior can be more or less difficult to derive. In recent years, there has been a lot of interest in complex settings where the likelihood is analytically intractable. In such situations, approximate Bayesian computation (ABC) provides an attractive way of carrying out Bayesian inference. For obtaining reliable posterior estimates, however, it is important to keep the approximation errors in ABC small. The choice of an appropriate set of summary statistics plays a crucial role in this effort. Here, we report the development of a new algorithm, based on least angle regression, for choosing summary statistics. In two population genetic examples, the performance of the new algorithm is better than that of a previously proposed approach that uses partial least squares.
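The basic ABC rejection scheme underlying this line of work can be sketched as follows; the toy normal model, flat prior, and tolerance `eps` are illustrative assumptions, not the paper's setup (which additionally selects among candidate summaries via least angle regression):

```python
import numpy as np

rng = np.random.default_rng(0)

# "Observed" data from a toy normal model: unknown mean, known sd = 1
obs = rng.normal(3.0, 1.0, size=100)
s_obs = obs.mean()  # a single (here sufficient) summary statistic

def abc_rejection(n_draws=20000, eps=0.05):
    """Basic ABC rejection: keep prior draws whose simulated summary
    lands within eps of the observed summary."""
    theta = rng.uniform(0.0, 10.0, size=n_draws)          # flat prior on the mean
    sims = rng.normal(theta[:, None], 1.0, size=(n_draws, 100))
    s_sim = sims.mean(axis=1)
    return theta[np.abs(s_sim - s_obs) < eps]

post = abc_rejection()
```

With a sufficient summary, as here, the accepted draws approximate the true posterior; with many candidate summaries, selection methods such as the least-angle-regression approach of the abstract become important.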

3.
We consider the context of probabilistic inference of model parameters given error bars or confidence intervals on model output values, when the data themselves are unavailable. We introduce a class of algorithms in a Bayesian framework, relying on maximum entropy arguments and approximate Bayesian computation methods, to generate data consistent with the given summary statistics. Once we obtain consistent data sets, we pool the respective posteriors to arrive at a single, averaged density on the parameters. This approach allows us to perform accurate forward uncertainty propagation consistent with the reported statistics.
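The data-generation step can be illustrated with a minimal sketch; the reported intervals, the sample size, and the normal sampling model (the maximum entropy choice under mean and variance constraints) are assumed for illustration only:

```python
import numpy as np

rng = np.random.default_rng(1)

# Only summaries are reported, no raw data:
# mean 10.0 +/- 0.2 and sd 2.0 +/- 0.2 for n = 50 (hypothetical numbers)
mean_lo, mean_hi = 9.8, 10.2
sd_lo, sd_hi = 1.8, 2.2

# Maximum entropy under mean/sd constraints suggests sampling candidate
# data sets from a normal model, then keeping those consistent with the
# reported intervals (an ABC-style accept/reject step).
consistent = []
for _ in range(5000):
    x = rng.normal(10.0, 2.0, size=50)
    if mean_lo <= x.mean() <= mean_hi and sd_lo <= x.std(ddof=1) <= sd_hi:
        consistent.append(x)
```

Each retained data set can then be analysed with a standard Bayesian model, and the resulting posteriors pooled as the abstract describes.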

4.
Deterministic Bayesian inference for latent Gaussian models has recently become available using integrated nested Laplace approximations (INLA). Applying the INLA methodology, marginal estimates for elements of the latent field can be computed efficiently, providing relevant summary statistics like posterior means, variances and pointwise credible intervals. In this article, we extend the use of INLA to joint inference and present an algorithm to derive analytical simultaneous credible bands for subsets of the latent field. The algorithm is based on approximating the joint distribution of the subsets by multivariate Gaussian mixtures. Additionally, we present a saddlepoint approximation to compute Bayesian contour probabilities, representing the posterior support of fixed parameter vectors of interest. We perform a simulation study and apply the given methods to two real examples.

5.
This article mainly considers interval estimation of the scale and shape parameters of the generalized exponential (GE) distribution. We adopt the generalized fiducial method to construct a new kind of confidence interval for the parameters of interest and compare it with the frequentist and Bayesian methods. In addition, we compare point estimation based on the frequentist, generalized fiducial and Bayesian methods. Simulation results show that the new procedure based on generalized fiducial inference is more applicable than the non-fiducial methods for point and interval estimation of the GE distribution. Finally, two lifetime data sets are used to illustrate the application of our new procedure.

6.
Synthetic likelihood is an attractive approach to likelihood-free inference when an approximately Gaussian summary statistic for the data, informative for inference about the parameters, is available. The synthetic likelihood method derives an approximate likelihood function from a plug-in normal density estimate for the summary statistic, with the plug-in mean and covariance matrix obtained by Monte Carlo simulation from the model. In this article, we develop alternatives to Markov chain Monte Carlo implementations of Bayesian synthetic likelihood with reduced computational overhead. Our approach uses stochastic gradient variational inference methods for posterior approximation in the synthetic likelihood context, employing unbiased estimates of the log likelihood. We compare the new method with a related likelihood-free variational inference technique in the literature, while at the same time improving the implementation of that approach in a number of ways. These new algorithms are feasible to implement in situations which are challenging for conventional approximate Bayesian computation methods in terms of the dimensionality of the parameter and summary statistic.
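The plug-in Gaussian estimate at the heart of synthetic likelihood can be sketched as follows, assuming a toy normal model and a two-dimensional summary (sample mean and log standard deviation); the variational machinery of the article is not shown:

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(1)

# "Observed" data and its summary-statistic vector
y_obs = rng.normal(2.0, 1.5, size=200)
s_obs = np.array([y_obs.mean(), np.log(y_obs.std())])

def synthetic_loglik(theta, m=500, n=200):
    """Plug-in Gaussian estimate of the summary-statistic density at theta:
    simulate m data sets, compute their summaries, fit mean and covariance,
    and evaluate the Gaussian log-density at the observed summary."""
    mu, sigma = theta
    sims = rng.normal(mu, sigma, size=(m, n))
    s = np.column_stack([sims.mean(axis=1), np.log(sims.std(axis=1))])
    mu_hat = s.mean(axis=0)
    cov_hat = np.cov(s, rowvar=False)
    return multivariate_normal.logpdf(s_obs, mean=mu_hat, cov=cov_hat)

# The synthetic log-likelihood favours parameters near the truth
ll_true = synthetic_loglik((2.0, 1.5))
ll_far = synthetic_loglik((5.0, 0.5))
```

This noisy log-likelihood estimate is what an MCMC or, as in the article, a stochastic gradient variational scheme would plug into a Bayesian posterior approximation.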

7.
For nearly any challenging scientific problem, evaluation of the likelihood is problematic if not impossible. Approximate Bayesian computation (ABC) allows us to employ the whole Bayesian formalism for problems where we can simulate from a model but cannot evaluate the likelihood directly. When summary statistics of real and simulated data are compared, rather than the data directly, information is lost unless the summary statistics are sufficient. Sufficient statistics are, however, rarely available, and without them ABC inferences must be treated with caution. Other authors have previously attempted to combine different statistics in order to construct (approximately) sufficient statistics using search and information heuristics. Here we employ an information-theoretical framework that can be used to construct appropriate (approximately sufficient) statistics by combining different statistics until the loss of information is minimized. We start from a potentially large number of different statistics and choose the smallest set that captures (nearly) the same information as the complete set. We then demonstrate that such sets of statistics can be constructed for both parameter estimation and model selection problems, and we apply our approach to a range of illustrative and real-world model selection problems.

8.
Modelling of HIV dynamics in AIDS research has greatly improved our understanding of the pathogenesis of HIV-1 infection and has guided the treatment of AIDS patients and the evaluation of antiretroviral therapies. Some of the model parameters have practical meanings and prior knowledge available, but others do not. Incorporating priors can improve the statistical inference. Although there have been extensive Bayesian and frequentist estimation methods for viral dynamic models, little work has been done on making simultaneous inference about the Bayesian and frequentist parameters. In this article, we propose a hybrid Bayesian inference approach for viral dynamic nonlinear mixed-effects models using the Bayesian frequentist hybrid theory developed in Yuan [Bayesian frequentist hybrid inference, Ann. Statist. 37 (2009), pp. 2458–2501]. Compared with frequentist inference in a real example and two simulation examples, the hybrid Bayesian approach is able to improve the inference accuracy without increasing the computational load.

9.

Approximate Bayesian computation (ABC) has become one of the major tools of likelihood-free statistical inference in complex mathematical models. Simultaneously, stochastic differential equations (SDEs) have developed into an established tool for modelling time-dependent, real-world phenomena with underlying random effects. When applying ABC to stochastic models, two major difficulties arise: first, the derivation of effective summary statistics and proper distances is particularly challenging, since simulations from the stochastic process under the same parameter configuration result in different trajectories; second, exact simulation schemes to generate trajectories from the stochastic model are rarely available, requiring the derivation of suitable numerical methods for the synthetic data generation. To obtain summaries that are less sensitive to the intrinsic stochasticity of the model, we propose to build the statistical method (e.g. the choice of the summary statistics) on the underlying structural properties of the model. Here, we focus on the existence of an invariant measure and map the data to their estimated invariant density and invariant spectral density. Then, to ensure that these model properties are kept in the synthetic data generation, we adopt measure-preserving numerical splitting schemes. The derived property-based and measure-preserving ABC method is illustrated on the broad class of partially observed Hamiltonian-type SDEs, both with simulated data and with real electroencephalography data. The derived summaries are particularly robust to the model simulation, and this fact, combined with the proposed reliable numerical scheme, yields accurate ABC inference. In contrast, the inference returned using standard numerical methods (Euler–Maruyama discretisation) fails.
The proposed ingredients can be incorporated into any type of ABC algorithm and directly applied to all SDEs that are characterised by an invariant distribution and for which a measure-preserving numerical method can be derived.
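A minimal sketch of mapping trajectories to an invariant-density summary, using an Ornstein–Uhlenbeck process as a stand-in model; note that for simplicity this sketch uses an Euler scheme, whereas the article argues for measure-preserving splitting schemes precisely because naive discretisations can distort the invariant law:

```python
import numpy as np

rng = np.random.default_rng(3)

def ou_trajectory(theta, sigma, n=20000, dt=0.01):
    """Euler simulation of dX = -theta*X dt + sigma dW (illustrative only)."""
    x = np.empty(n)
    x[0] = 0.0
    for i in range(1, n):
        x[i] = x[i - 1] - theta * x[i - 1] * dt + sigma * np.sqrt(dt) * rng.normal()
    return x

def invariant_density_summary(x, grid=np.linspace(-4, 4, 40)):
    """Histogram estimate of the invariant density, used as the ABC summary."""
    hist, _ = np.histogram(x, bins=grid, density=True)
    return hist

s_obs = invariant_density_summary(ou_trajectory(1.0, 1.0))
s_sim = invariant_density_summary(ou_trajectory(1.0, 1.0))   # same parameters
s_far = invariant_density_summary(ou_trajectory(1.0, 3.0))   # different sigma

# Re-simulations at the same parameters give much closer summaries than
# simulations at different parameters, despite different trajectories.
d_same = np.linalg.norm(s_obs - s_sim)
d_far = np.linalg.norm(s_obs - s_far)
```

This robustness of the summary to the trajectory-level stochasticity is the point of mapping data to the invariant density before comparison in ABC.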


10.
Both approximate Bayesian computation (ABC) and composite likelihood methods are useful for Bayesian and frequentist inference, respectively, when the likelihood function is intractable. We propose to use composite likelihood score functions as summary statistics in ABC in order to obtain accurate approximations to the posterior distribution. This is motivated by the use of the score function of the full likelihood, and extended to general unbiased estimating functions in complex models. Moreover, we show that if the composite score is suitably standardised, the resulting ABC procedure is invariant to reparameterisations and automatically adjusts the curvature of the composite likelihood and of the corresponding posterior distribution. The method is illustrated through examples with simulated data, and an application to modelling of spatial extreme rainfall data is discussed.

11.
The likelihood function is often used for parameter estimation. Its use, however, may cause difficulties in specific situations. To circumvent these difficulties, we propose a parameter estimation method based on replacing the likelihood in the formula of the Bayesian posterior distribution by a function which depends on a contrast measuring the discrepancy between observed data and a parametric model. The properties of the contrast-based (CB) posterior distribution are studied to understand the consequences of incorporating a contrast into the Bayes formula. We show that the CB-posterior distribution can be used to make frequentist inference and to assess the asymptotic variance matrix of the estimator with limited analytical calculation compared to the classical contrast approach. Even though the primary focus of this paper is on frequentist estimation, it is shown that for specific contrasts the CB-posterior distribution can be used to make inference in the Bayesian way. The method was used to estimate the parameters of a variogram (simulated data), a Markovian model (simulated data) and a cylinder-based autosimilar model describing soil roughness (real data). Although the method is presented from the spatial statistics perspective, it can be applied to non-spatial data.

12.
Geometric Anisotropic Spatial Point Pattern Analysis and Cox Processes
We consider spatial point processes with a pair correlation function that depends only on the lag vector between a pair of points. Our interest is in statistical models with a special kind of 'structured' anisotropy: the pair correlation function is geometric anisotropic if it is elliptical but not spherical. In particular, we study Cox process models with an elliptical pair correlation function, including shot noise Cox processes and log Gaussian Cox processes, and we develop estimation procedures using summary statistics and Bayesian methods. Our methodology is illustrated on real and synthetic datasets of spatial point patterns.
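A geometric anisotropic pair correlation function can be sketched by composing an isotropic template with a rotate-and-rescale transform of the lag vector; the Gaussian-shaped template and the parameter values below are illustrative assumptions, not the fitted models of the paper:

```python
import numpy as np

def elliptical_pcf(h, theta_deg=30.0, ratio=0.5, scale=1.0):
    """Geometric anisotropic pair correlation: an isotropic template
    evaluated after rotating the lag onto the ellipse axes and stretching
    the minor axis (template and parameters are illustrative)."""
    t = np.deg2rad(theta_deg)
    R = np.array([[np.cos(t), np.sin(t)],
                  [-np.sin(t), np.cos(t)]])   # rotate major axis onto x-axis
    S = np.diag([1.0, 1.0 / ratio])           # stretch minor axis
    r = np.linalg.norm(S @ (R @ np.asarray(h, dtype=float)))
    return 1.0 + np.exp(-(r / scale) ** 2)    # isotropic template g0(r)

t = np.deg2rad(30.0)
g_major = elliptical_pcf([np.cos(t), np.sin(t)])    # unit lag along major axis
g_minor = elliptical_pcf([-np.sin(t), np.cos(t)])   # same length, minor axis
```

Correlation decays more slowly along the major axis than along the minor axis at the same lag length, which is exactly the elliptical (non-spherical) behaviour the paper models.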

13.
In this paper, we consider statistical inference for the success probability in start-up demonstration tests in which a unit is rejected when a pre-fixed number of failures is observed before the required number of consecutive successes is achieved for acceptance of the unit. Since the expected value of the stopping time is not a monotone function of the unknown parameter, the method of moments is not useful in this situation. Therefore, we discuss two estimation methods for the success probability: (1) maximum likelihood estimation (MLE) via the expectation-maximization (EM) algorithm and (2) Bayesian estimation with a beta prior. We examine the small-sample properties of the MLE and the Bayesian estimator. Finally, we present an example to illustrate the method of inference discussed here.
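The beta-prior update can be sketched directly, assuming the full success/failure sequence is recorded; since the stopping rule enters the likelihood only through the observed numbers of successes s and failures f, the beta prior remains conjugate (the sequence below is hypothetical):

```python
from scipy.stats import beta

# Hypothetical start-up test record: 1 = success, 0 = failure.
# The Bernoulli likelihood is p**s * (1-p)**f whatever the stopping rule,
# so a Beta(a, b) prior on the success probability stays conjugate.
sequence = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1, 1, 1]
s = sum(sequence)
f = len(sequence) - s

a0, b0 = 1.0, 1.0                        # flat Beta(1, 1) prior
a_post, b_post = a0 + s, b0 + f          # conjugate posterior update
post_mean = a_post / (a_post + b_post)
lo, hi = beta.interval(0.95, a_post, b_post)   # 95% credible interval
```

The MLE branch of the abstract needs EM because in the censored/rejection setting the complete sequence may not be observed; the conjugate update above applies when it is.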

14.
Spatial statistics is the discipline devoted to spatial problems and a rapidly developing branch of applied mathematics. Although traditional data analysis offers many effective methods, they cannot be applied wholesale to spatial data. Estimation of spatial models depends not only on the assumptions underlying the various regression forms but also on the characteristics of spatial correlation and spatial heterogeneity. This paper reviews the development and future trends of spatial data analysis from the perspectives of spatial models and inference, adaptive estimation, nonparametric regression, and tests of spatial independence.

15.
In this article, we investigate various properties and methods of estimation of the Weighted Exponential distribution. Although our main focus is on estimation (from both the frequentist and Bayesian points of view), the stochastic ordering, the Bonferroni and Lorenz curves, various entropies and order statistics are derived for the first time for this distribution. Different types of loss functions are considered for Bayesian estimation. Furthermore, the Bayes estimators and their respective posterior risks are computed and compared using Gibbs sampling. Different reliability characteristics, including the hazard function, stress-strength analysis and the mean residual life function, are also derived. Monte Carlo simulations are performed to compare the performance of the proposed methods of estimation, and two real data sets are analysed for illustrative purposes.

16.
The Bayesian estimation and prediction problems for the linear hazard rate distribution under general progressively Type-II censored samples are considered in this article. The conventional Bayesian framework, as well as the Markov chain Monte Carlo (MCMC) method to generate the Bayesian conditional probabilities of interest, is discussed. Sensitivity to the prior of the model is also examined. The flood data on the Fox River, Wisconsin, from 1918 to 1950 are used to illustrate all the methods of inference discussed in this article.

17.
Multiple-membership logit models with random effects are models for clustered binary data where each statistical unit can belong to more than one group. The likelihood function of these models is analytically intractable. We propose two different approaches for parameter estimation: indirect inference and data cloning (DC). The former is a non-likelihood-based method which uses an auxiliary model to select reasonable estimates. We propose an auxiliary model whose parameter space has the same dimension as that of the target model, which is particularly convenient for reaching good estimates quickly. The latter method computes maximum likelihood estimates through the posterior distribution of an adequate Bayesian model fitted to cloned data. We implement a DC algorithm specifically for multiple-membership models. A Monte Carlo experiment compares the two methods on simulated data. For further comparison, we also report Bayesian posterior mean and integrated nested Laplace approximation hybrid DC estimates. Simulations show a negligible loss of efficiency for the indirect inference estimator, compensated by a relevant computational gain. The approaches are then illustrated with two real examples on matched paired data.
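The data-cloning idea can be illustrated in a conjugate toy model (a Bernoulli success probability with a beta prior, hypothetical data); cloning the data K times drives the posterior mean to the maximum likelihood estimate, which is what a DC algorithm exploits in the intractable multiple-membership setting:

```python
import numpy as np

# Hypothetical binary outcomes; the MLE of the success probability is s/n.
data = np.array([1, 0, 1, 1, 0, 1, 1, 1])
s = int(data.sum())
f = len(data) - s
mle = s / len(data)

a0 = b0 = 1.0                       # Beta(1, 1) prior
post_means = {}
for K in (1, 10, 100):
    # Posterior after replicating ("cloning") the data K times:
    # Beta(a0 + K*s, b0 + K*f); the prior's influence vanishes as K grows.
    a, b = a0 + K * s, b0 + K * f
    post_means[K] = a / (a + b)
```

As K increases the posterior also contracts, and its scaled covariance estimates the inverse Fisher information, giving frequentist standard errors from Bayesian output.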

18.
In the case of exponential families, it is straightforward to approximate a density function by use of summary statistics; however, an appropriate approach to such approximation is far less clear when an exponential family is not assumed. In this paper, a maximin argument based on information theory is used to derive a new approach to density approximation from summary statistics which is not restricted by the assumption of validity of an underlying exponential family. Information-theoretic criteria are developed to assess the loss of predictive power of summary statistics under such minimal knowledge. Under these criteria, optimal density approximations in the maximin sense are obtained and shown to be related to exponential families. Conditions for the existence of optimal density approximations are developed. Applications of the proposed approach are illustrated, and methods for estimation of densities are provided in the case of simple random sampling. Large-sample theory for the estimates is developed.

19.
We consider the problem of deriving Bayesian inference procedures via the concept of relative surprise. The mathematical concept of surprise was developed by I.J. Good in a long sequence of papers. We make a modification to this development that permits the avoidance of a serious defect, namely the change-of-variable problem. We apply relative surprise to the development of estimation, hypothesis testing and model checking procedures. Important advantages of the relative surprise approach to inference include the lack of dependence on a particular loss function and complete freedom for the statistician in the choice of prior for hypothesis testing problems. Links are established with common Bayesian inference procedures such as highest posterior density regions, modal estimates and Bayes factors. From a practical perspective, new inference procedures arise that possess good properties.
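One common reading of relative surprise is as the posterior-to-prior (relative belief) ratio, which is invariant under reparameterisation; a discrete toy computation with made-up numbers:

```python
import numpy as np

# Three hypotheses with a prior and the likelihood of the observed data
# (all numbers hypothetical). Relative surprise ranks hypotheses by how
# much the data increase belief: the posterior-to-prior ratio.
prior = np.array([0.5, 0.3, 0.2])
lik = np.array([0.10, 0.40, 0.25])

post = prior * lik / np.sum(prior * lik)   # Bayes' rule
rel_belief = post / prior                  # relative belief / surprise ratio
```

Because the ratio is proportional to the likelihood, it is unchanged by relabelling or transforming the hypothesis space, which is the change-of-variable defect the paper's modification avoids.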

20.
We present a new method to describe shape change and shape differences in curves by constructing a deformation function in terms of a wavelet decomposition. Wavelets form an orthonormal basis which allows representations at multiple resolutions. The deformation function is estimated, in a fully Bayesian framework, using a Markov chain Monte Carlo (MCMC) algorithm. This Bayesian formulation incorporates prior information about the wavelets and the deformation function. The flexibility of the MCMC approach allows estimation of complex but clinically important summary statistics, such as curvature in our case, as well as estimates of deformation functions with variance estimates, and allows thorough investigation of the posterior distribution. This work is motivated by multi-disciplinary research involving a large-scale longitudinal study of idiopathic scoliosis in UK children. This paper provides novel statistical tools to study this spinal deformity, from which 5% of UK children suffer. Using the data, we consider statistical inference for shape differences between normal subjects, scoliotics and developers of scoliosis, in particular for spinal curvature, and look at longitudinal deformations to describe shape changes over time.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号