Kullback-Leibler divergence to evaluate posterior sensitivity to different priors for autoregressive time series models

Authors: Ayman A. Amin

Institution: Department of Statistics, Mathematics, and Insurance, Faculty of Commerce, Menoufia University, Menoufia, Egypt

Abstract: In this paper we use the Kullback-Leibler divergence to measure the distance between the posteriors of the autoregressive (AR) model coefficients, aiming to evaluate mathematically the sensitivity of the coefficients' posterior to different types of priors, namely Jeffreys', g, and natural conjugate priors. In addition, we evaluate the impact of the posterior distance on the Bayesian estimates of the mean and variance of the model coefficients by generating a large number of Monte Carlo simulations from the posteriors. Simulation study results show that the coefficients' posterior is sensitive to the prior distribution, and that the posterior distance has more influence on the Bayesian estimates of the variance than on those of the mean of the model coefficients. The same results are obtained from applications to real-world time series datasets.

Keywords: Distance of posteriors; g-prior; Jeffreys' prior; Kullback-Leibler calibration; Kullback-Leibler divergence; Multivariate t distribution; Natural conjugate prior
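The approach summarized in the abstract, measuring prior sensitivity as a Kullback-Leibler divergence between posteriors and checking it with Monte Carlo draws from the posteriors, can be illustrated with a minimal sketch. Assuming, purely for illustration, that two priors lead to approximately normal posteriors for a single AR(1) coefficient (the means and variances below are hypothetical, not taken from the paper), the KL divergence has a closed form and can also be estimated by averaging log-density ratios over Monte Carlo samples:

```python
import numpy as np

def kl_normal(mu0, var0, mu1, var1):
    """Closed-form KL( N(mu0, var0) || N(mu1, var1) )."""
    return 0.5 * (np.log(var1 / var0) + (var0 + (mu0 - mu1) ** 2) / var1 - 1.0)

def kl_monte_carlo(mu0, var0, mu1, var1, n=200_000, seed=0):
    """Monte Carlo estimate: E_p[log p(x) - log q(x)] with x drawn from p."""
    rng = np.random.default_rng(seed)
    x = rng.normal(mu0, np.sqrt(var0), size=n)
    log_p = -0.5 * (np.log(2 * np.pi * var0) + (x - mu0) ** 2 / var0)
    log_q = -0.5 * (np.log(2 * np.pi * var1) + (x - mu1) ** 2 / var1)
    return float(np.mean(log_p - log_q))

# Hypothetical posteriors of an AR(1) coefficient under two different priors
# (illustrative values only; the paper's posteriors are multivariate t).
mu_a, var_a = 0.62, 0.004   # e.g. under Jeffreys' prior (assumed)
mu_b, var_b = 0.58, 0.009   # e.g. under a natural conjugate prior (assumed)

print("closed form:", kl_normal(mu_a, var_a, mu_b, var_b))
print("Monte Carlo:", kl_monte_carlo(mu_a, var_a, mu_b, var_b))
```

A larger divergence indicates that the choice of prior shifts the posterior more; the Monte Carlo estimator is the route that generalizes to the multivariate t posteriors studied in the paper, where no simple closed form applies.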