Similar articles
16 similar articles found (search time: 15 ms)
1.
We extend the Bayesian Model Averaging (BMA) framework to dynamic panel data models with endogenous regressors using a Limited Information Bayesian Model Averaging (LIBMA) methodology. Monte Carlo simulations confirm the asymptotic performance of our methodology in both model averaging and model selection, with high posterior inclusion probabilities for all relevant regressors and parameter estimates very close to their true values. In addition, we illustrate the use of LIBMA by estimating a dynamic gravity model for bilateral trade. Once model uncertainty, dynamics, and endogeneity are accounted for, we find several factors that are robustly correlated with bilateral trade. We also find that applying methodologies that do not account for either dynamics or endogeneity (or both) results in different sets of robust determinants.
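As a hedged illustration of the model-averaging step described above, the sketch below runs plain BMA on a static linear regression: it enumerates regressor subsets, approximates each model's marginal likelihood by BIC, and reports posterior inclusion probabilities (PIPs). The data-generating values, variable names, and equal model priors are illustrative assumptions; LIBMA itself additionally handles dynamics and endogenous regressors through limited-information (moment-based) likelihoods, which this sketch does not attempt.

```python
# A minimal BMA sketch (not LIBMA): enumerate regressor subsets, approximate
# each model's marginal likelihood by BIC, and compute posterior inclusion
# probabilities under equal prior model probabilities.
from itertools import combinations
import numpy as np

rng = np.random.default_rng(0)
n, names = 500, ["x1", "x2", "x3", "x4"]
X = rng.normal(size=(n, 4))
y = 1.0 + 0.8 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(size=n)   # x3, x4 irrelevant

def bic(cols):
    """BIC of the Gaussian regression of y on an intercept plus the given columns."""
    Z = np.column_stack([np.ones(n)] + [X[:, c] for c in cols])
    resid = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return n * np.log(resid @ resid / n) + Z.shape[1] * np.log(n)

models = [cols for r in range(5) for cols in combinations(range(4), r)]
logml = np.array([-0.5 * bic(m) for m in models])               # BIC approx. of log marginal likelihood
post = np.exp(logml - logml.max()); post /= post.sum()          # posterior model probabilities

pip = {names[j]: sum(p for m, p in zip(models, post) if j in m) for j in range(4)}
print(pip)
```

With the assumed coefficients, x1 and x2 should receive PIPs close to one and the irrelevant regressors PIPs close to zero, mirroring the behaviour the simulations above report for relevant regressors.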

2.
In this paper we obtain Bayes forecasts of future observations on the dependent variable in the linear regression model when the regression coefficients have an Edgeworth series prior distribution. Furthermore, we consider the effect of departure from normality of the prior distribution of the regression coefficients on the Bayes forecasts.

3.
This paper studies mixtures of a class of probability density functions under a type-I censoring scheme. We model a heterogeneous population by means of a two-component mixture of this class of probability density functions. The parameters of the mixture density are estimated and compared using Bayes estimates under the squared-error and precautionary loss functions. A censored mixture dataset is simulated by probabilistic mixing for computational purposes, considering the particular case of the Maxwell distribution. Closed-form expressions for the Bayes estimators, along with their posterior risks, are derived for censored as well as complete samples. Some interesting comparisons and properties of the estimates are presented. A real dataset is also analyzed for illustration.
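A minimal sketch of the data-generation step described above, assuming illustrative values for the mixing weight, the two Maxwell scale parameters, and the fixed type-I censoring time; it is not the paper's estimation code.

```python
# Probabilistic mixing of a two-component Maxwell mixture with type-I censoring
# at a fixed time T. Mixing weight p, scale parameters a1, a2, and T are
# illustrative assumptions.
import numpy as np
from scipy.stats import maxwell

rng = np.random.default_rng(42)
n, p, a1, a2, T = 500, 0.4, 1.0, 2.5, 3.0

labels = rng.random(n) < p                      # component membership
x = np.where(labels, maxwell.rvs(scale=a1, size=n, random_state=rng),
                     maxwell.rvs(scale=a2, size=n, random_state=rng))
censored = x > T                                # type-I censoring indicator
obs = np.minimum(x, T)                          # observed (possibly censored) values

print(f"censoring rate: {censored.mean():.2%}")
# Under squared-error loss the Bayes estimator of a parameter is its posterior
# mean; under the commonly used precautionary loss L(d, t) = (d - t)^2 / d it
# is sqrt(E[t^2 | data]).
```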

4.
The purpose of the present investigation is to observe the effect of departure from normality of the prior distribution of regression parameters on the Bayesian analysis of a linear regression model. Assuming an Edgeworth series prior distribution for the regression coefficients and a gamma prior for the disturbance precision, the expressions for the posterior distribution, posterior mean and Bayes risk under a quadratic loss function are obtained. The results of a numerical evaluation are also analyzed.
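As a hedged illustration of the prior family involved (a truncated Edgeworth expansion around the normal, not necessarily the exact form used in the paper), an Edgeworth series prior for a standardized regression coefficient can be written as

$$
\pi(\beta) \;=\; \phi(\beta)\left[1 + \frac{\lambda_3}{6}\,H_3(\beta) + \frac{\lambda_4}{24}\,H_4(\beta)\right],
\qquad
H_3(\beta) = \beta^3 - 3\beta,\quad
H_4(\beta) = \beta^4 - 6\beta^2 + 3,
$$

where $\phi$ is the standard normal density and $\lambda_3$, $\lambda_4$ capture the prior's skewness and excess kurtosis; setting $\lambda_3=\lambda_4=0$ recovers the normal prior, which is what makes the comparison of posterior means and Bayes risks against the normal-prior baseline meaningful.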

5.
6.
This article presents an explicit and detailed theoretical and empirical Bayesian analysis of the well-known Poisson regression model for count data with unobserved individual effects based on the lognormal, rather than the popular negative binomial, distribution. Although the negative binomial distribution leads to analytical expressions for the likelihood function, a Poisson-lognormal model is closer to the concept of regression with normally distributed innovations, and accounts for excess zeros as well. Such models have been considered widely in the literature (Winkelmann, 2008). The article also provides the necessary theoretical results regarding the posterior distribution of the model. Given that the likelihood function involves integrals with respect to the latent variables, numerical methods organized around Gibbs sampling with data augmentation are proposed for likelihood analysis of the model. The methods are applied to the patent-R&D relationship of 70 US pharmaceutical and biomedical companies, and the Poisson-lognormal model is found to perform better than Poisson regression or negative binomial regression models.
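A hedged sketch of the count model described above, y_i | eps_i ~ Poisson(exp(x_i'beta + eps_i)) with eps_i ~ N(0, sigma^2): the latent normal effect has no closed-form integral, which is exactly why data-augmentation MCMC is needed. The sketch below simply simulates the model and approximates the integrated likelihood by Monte Carlo; the design, true parameter values, and number of draws are assumptions, and this is not the article's Gibbs sampler.

```python
# Poisson-lognormal count model: simulate data and evaluate a simulated
# log-likelihood by integrating out the latent normal effect with Monte Carlo.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(0)
n, beta, sigma = 200, np.array([0.5, 0.8]), 0.7        # assumed true values
X = np.column_stack([np.ones(n), rng.normal(size=n)])
eps = rng.normal(0.0, sigma, size=n)
y = rng.poisson(np.exp(X @ beta + eps))

def loglik(beta, sigma, draws=2000):
    """Simulated log-likelihood, integrating out the latent normal effect."""
    e = rng.normal(0.0, sigma, size=(draws, 1))         # shared Monte Carlo draws
    lam = np.exp(X @ beta + e)                          # (draws x n) Poisson intensities
    return np.sum(np.log(poisson.pmf(y, lam).mean(axis=0)))

print(loglik(beta, sigma))
```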

7.
In this paper the generalized compound Rayleigh model, exhibiting a flexible hazard rate, is highlighted. This makes it attractive for modelling survival times of patients showing characteristics of a random hazard rate. The Bayes estimators are derived for the parameters of this model and some survival time parameters from a right censored sample. This is done with respect to conjugate and discrete priors on the parameters of this model, under the squared error loss function, Varian's asymmetric linear-exponential (linex) loss function and a weighted linex loss function. The future survival time of a patient is estimated under these loss functions. A Monte Carlo simulation procedure is used where closed form expressions of the estimators cannot be obtained. An example illustrates the proposed estimators for this model.
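The linex Bayes rule mentioned above has a well-known closed form: under L(d, theta) = b[exp(a(d - theta)) - a(d - theta) - 1], the Bayes estimator is -(1/a) log E[exp(-a*theta) | data]. The sketch below evaluates it from generic posterior draws; the gamma posterior and the value of a are placeholders, not the compound Rayleigh posterior of the paper.

```python
# Bayes point estimation under Varian's linex loss from posterior draws.
import numpy as np

rng = np.random.default_rng(1)
theta_draws = rng.gamma(shape=3.0, scale=0.5, size=20_000)   # placeholder posterior draws

a = 1.5                                    # linex asymmetry parameter (assumed)
bayes_sq_error = theta_draws.mean()        # Bayes rule under squared-error loss
bayes_linex = -np.log(np.mean(np.exp(-a * theta_draws))) / a  # -(1/a) log E[e^{-a*theta} | data]

print(f"squared-error estimate: {bayes_sq_error:.3f}")
print(f"linex (a={a}) estimate: {bayes_linex:.3f}")   # pulled below the mean when a > 0
```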

8.
Highly skewed and non-negative data can often be modeled by the delta-lognormal distribution in fisheries research. However, the coverage probabilities of existing interval estimation procedures are unsatisfactory for small sample sizes and highly skewed data. We propose a heuristic method for estimating confidence intervals for the mean of the delta-lognormal distribution, based on an asymptotic generalized pivotal quantity used to construct a generalized confidence interval for that mean. Simulation results show that the proposed interval estimation procedure yields satisfactory coverage probabilities, expected interval lengths and reasonable relative biases. Finally, the proposed method is applied to red cod density data for demonstration.
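A hedged sketch of a generalized-pivotal-quantity construction for the delta-lognormal mean E[X] = (1 - delta) * exp(mu + sigma^2/2): it uses standard chi-square and normal pivots for mu and sigma^2 and a Jeffreys-style beta pivot for the non-zero proportion. This is one generic GCI variant for illustration, not necessarily the authors' asymptotic GPQ; the simulated sample and parameter values are assumptions.

```python
# Generalized confidence interval for the delta-lognormal mean via simulated pivots.
import numpy as np

rng = np.random.default_rng(7)

# Simulated delta-lognormal sample: zeros with probability delta, else lognormal.
n, delta, mu, sigma = 50, 0.3, 1.0, 0.8
x = np.where(rng.random(n) < delta, 0.0, rng.lognormal(mu, sigma, size=n))

pos = np.log(x[x > 0])
n1 = pos.size
n0 = n - n1
ybar, s2 = pos.mean(), pos.var(ddof=1)

B = 10_000
U = rng.chisquare(n1 - 1, size=B)
Z = rng.standard_normal(B)
R_sigma2 = (n1 - 1) * s2 / U                     # pivot for sigma^2
R_mu = ybar - Z * np.sqrt(R_sigma2 / n1)         # pivot for mu
R_p = rng.beta(n1 + 0.5, n0 + 0.5, size=B)       # Jeffreys-style pivot for P(X > 0)
R_mean = R_p * np.exp(R_mu + R_sigma2 / 2)       # pivot for the overall mean

lo, hi = np.quantile(R_mean, [0.025, 0.975])
print(f"95% generalized CI for the mean: ({lo:.2f}, {hi:.2f})")
```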

9.
This paper shows that a minimax Bayes rule and shrinkage estimators can be effectively applied to portfolio selection under the Bayesian approach. Specifically, it is shown that the portfolio selection problem can, in some situations, be cast as a statistical decision problem. We then present a method for solving the resulting portfolio selection problem under the Bayesian approach.
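As a hedged illustration of how shrinkage enters portfolio selection (a generic James-Stein-style sketch, not the paper's minimax Bayes rule), the code below shrinks sample mean returns toward their grand mean before forming unconstrained mean-variance weights; the return-generating values and the shrinkage intensity are assumptions.

```python
# Shrinkage of estimated mean returns plugged into mean-variance portfolio weights.
import numpy as np

rng = np.random.default_rng(3)
T, k = 120, 5                                   # months of returns, number of assets
true_mu = rng.normal(0.01, 0.005, size=k)
returns = rng.normal(true_mu, 0.05, size=(T, k))

mu_hat = returns.mean(axis=0)
Sigma = np.cov(returns, rowvar=False)
target = np.full(k, mu_hat.mean())              # shrinkage target: grand mean
w_shrink = 0.5                                  # assumed shrinkage intensity
mu_shrunk = w_shrink * target + (1 - w_shrink) * mu_hat

def mv_weights(mu, Sigma):
    """Unconstrained mean-variance weights, normalized to sum to one."""
    raw = np.linalg.solve(Sigma, mu)
    return raw / raw.sum()

print("plug-in weights:  ", np.round(mv_weights(mu_hat, Sigma), 3))
print("shrinkage weights:", np.round(mv_weights(mu_shrunk, Sigma), 3))
```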

10.
11.
12.
13.
Summary.  The paper develops a data augmentation method to estimate the distribution function of a variable, which is partially observed, under a non-ignorable missing data mechanism, and where surrogate data are available. An application to the estimation of hourly pay distributions using UK Labour Force Survey data provides the main motivation. In addition to considering a standard parametric data augmentation method, we consider the use of hot deck imputation methods as part of the data augmentation procedure to improve the robustness of the method. The method proposed is compared with standard methods that are based on an ignorable missing data mechanism, both in a simulation study and in the Labour Force Survey application. The focus is on reducing bias in point estimation, but variance estimation using multiple imputation is also considered briefly.
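A minimal sketch of the hot deck building block used inside the data augmentation procedure: each missing value is replaced by a randomly drawn observed value from the same donor class. The class variable, pay distribution, and missingness rate are illustrative assumptions, and the sketch ignores the non-ignorable mechanism that the paper models explicitly.

```python
# Hot deck imputation within donor classes (random draw of an observed donor value).
import numpy as np

rng = np.random.default_rng(5)
n = 200
region = rng.choice(["north", "south"], size=n)   # donor classes
pay = rng.lognormal(2.5, 0.4, size=n)
pay[rng.random(n) < 0.25] = np.nan                # inject missingness

imputed = pay.copy()
for r in np.unique(region):
    in_class = region == r
    donors = pay[in_class & ~np.isnan(pay)]       # observed values in this class
    recipients = in_class & np.isnan(pay)
    imputed[recipients] = rng.choice(donors, size=recipients.sum())

print(int(np.isnan(imputed).sum()), "missing values remain after hot deck")
```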

14.
For the restricted parameter space (0,1), we propose Zhang's loss function, which satisfies all seven properties of a good loss function on (0,1). We then calculate the Bayes rule (estimator), the posterior expectation, the integrated risk, and the Bayes risk of the parameter in (0,1) under Zhang's loss function. We also calculate the usual Bayes estimator under the squared error loss function and prove that it underestimates the Bayes estimator under Zhang's loss function. Finally, numerical simulations and a real data example on monthly magazine exposure data illustrate our theoretical results on the two size relationships concerning the Bayes estimators and the posterior expected Zhang's losses (PEZLs).

15.
Precarious employment is a serious social problem, especially in countries, such as Italy, where social security benefits are limited. We investigate this phenomenon by analysing the initial part of the careers of employees who start out on unstable contracts, using a panel of Italian workers. Our aim is to estimate the probability of obtaining a stable job and to detect the factors influencing both this probability and the duration of precariousness. To answer these questions, we use an ad hoc mixture cure rate model in a Bayesian framework.
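For reference, a generic mixture cure rate formulation (a sketch of the model class, not necessarily the authors' exact Bayesian parameterization) splits the population into a fraction that never experiences the event, here read as never moving to a stable contract, and a fraction whose transition time follows a latency survival function:

$$
S(t \mid \mathbf{x}, \mathbf{z}) \;=\; \pi(\mathbf{z}) + \bigl(1 - \pi(\mathbf{z})\bigr)\, S_u(t \mid \mathbf{x}),
\qquad
\pi(\mathbf{z}) \;=\; \frac{\exp(\mathbf{z}^{\top}\boldsymbol{\gamma})}{1 + \exp(\mathbf{z}^{\top}\boldsymbol{\gamma})},
$$

where $\pi(\mathbf{z})$ is the incidence ("cure") probability with covariates $\mathbf{z}$ entering through a logistic link, and $S_u(t \mid \mathbf{x})$ is the survival function of the transition time for those who do eventually obtain a stable job; priors on both components then allow the incidence and duration effects to be estimated jointly in a Bayesian framework.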

16.
Summary.  In a precision farming context, differentiated management decisions regarding fertilization, application of lime and other cultivation activities may require the subdivision of the field into homogeneous regions with respect to the soil variables of main agronomic significance. The paper develops an approach that is aimed at delineating homogeneous regions on the basis of measurements of a categorical and quantitative nature, namely soil type and resistivity measurements at different soil layers. We propose a Bayesian multivariate spatial model and embed it in a Markov chain Monte Carlo inference scheme. Implementation is discussed using real data from a 15-ha field. Although applied to soil data, this model could be relevant in areas of spatial modelling as diverse as epidemiology, ecology or meteorology.
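A hedged sketch of one ingredient of such spatial zoning: single-site Gibbs updates of zone labels on a grid, combining a Potts smoothing prior with Gaussian likelihoods for two soil-layer measurements. The grid size, zone means, covariance, and smoothing parameter beta are illustrative assumptions held fixed here; the paper's multivariate spatial model is richer and also updates these quantities within MCMC.

```python
# Single-site Gibbs sampling of zone labels: Potts smoothing prior x Gaussian likelihood.
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(11)
H, W, K, beta = 20, 20, 2, 0.8
mu = np.array([[10.0, 30.0], [25.0, 60.0]])          # assumed zone means (two soil layers)
cov = np.eye(2) * 25.0

# Synthetic field: left half zone 0, right half zone 1, plus noisy measurements.
truth = np.zeros((H, W), dtype=int)
truth[:, W // 2:] = 1
y = mu[truth] + rng.multivariate_normal(np.zeros(2), cov, size=(H, W))

# Per-site Gaussian log-likelihood of each zone label, shape (H, W, K).
loglik = np.stack([multivariate_normal.logpdf(y, mu[k], cov) for k in range(K)], axis=-1)

z = rng.integers(K, size=(H, W))                     # random initial labels
for sweep in range(20):                              # Gibbs sweeps over all sites
    for i in range(H):
        for j in range(W):
            nbrs = [z[a, b] for a, b in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                    if 0 <= a < H and 0 <= b < W]
            counts = np.array([np.sum(np.array(nbrs) == k) for k in range(K)])
            logp = loglik[i, j] + beta * counts      # Potts prior times likelihood
            p = np.exp(logp - logp.max())
            z[i, j] = rng.choice(K, p=p / p.sum())

agree = (z == truth).mean()
print(f"label agreement with the true zoning (up to label switching): {max(agree, 1 - agree):.2%}")
```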
