Bayesian revision of a prior given prior-data conflict, expert opinion, or a similar insight: a large-deviation approach

Authors: David R. Bickel

Institution: Department of Biochemistry, Microbiology, and Immunology, Department of Mathematics and Statistics, Ottawa Institute of Systems Biology, University of Ottawa, Ottawa, ON, Canada

Abstract: Learning from model diagnostics that a prior distribution must be replaced by one that conflicts less with the data raises the question of which prior should instead be used for inference and decision. The same problem arises when a decision maker learns that one or more reliable experts express unexpected beliefs. In both cases, coherence of the solution would be guaranteed by applying Bayes's theorem to a distribution of prior distributions that effectively assigns the initial prior distribution a probability arbitrarily close to 1. The new distribution for inference would then be the distribution of priors conditional on the insight that the prior distribution lies in a closed convex set that does not contain the initial prior. A readily available distribution of priors needed for such conditioning is the law of the empirical distribution of a sufficiently large number of independent parameter values drawn from the initial prior. According to the Gibbs conditioning principle from the theory of large deviations, the resulting new prior distribution minimizes the entropy relative to the initial prior. While minimizing relative entropy accommodates the necessity of going beyond the initial prior without departing from it any more than the insight demands, the large-deviation derivation also ensures the advantages of Bayesian coherence. This approach is generalized to uncertain insights by allowing the closed convex set of priors to be random.
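The entropy-minimization step described in the abstract can be sketched numerically. The following is a hypothetical illustration, not code from the paper: it assumes a discrete parameter, takes the insight to be the closed convex set of priors with a specified mean, and computes the minimum-relative-entropy (I-projection) revision of the initial prior. For a mean constraint, the minimizer is known to be an exponentially tilted version of the initial prior, so the sketch solves for the tilt parameter by bisection.

```python
import math

def min_relative_entropy_prior(support, p0, target_mean, tol=1e-10):
    # Hypothetical sketch: project the initial prior p0 on `support`
    # onto the convex set {priors with mean == target_mean}, minimizing
    # relative entropy (Kullback-Leibler divergence) to p0.
    # The minimizer has the exponential-family form
    #     p_i proportional to p0_i * exp(lam * theta_i),
    # with lam chosen so the mean constraint holds.

    def tilted(lam):
        # Exponentially tilt p0 and renormalize.
        w = [p * math.exp(lam * t) for p, t in zip(p0, support)]
        z = sum(w)
        return [x / z for x in w]

    def mean(lam):
        return sum(t * q for t, q in zip(support, tilted(lam)))

    # The tilted mean is increasing in lam, so bisect for the root.
    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    return tilted(0.5 * (lo + hi))

# Example: a uniform initial prior on {0, 1, 2, 3} has mean 1.5.
# If the insight is that the prior mean is 2.0, the revised prior
# tilts probability toward larger parameter values while staying
# as close to uniform as the constraint allows.
revised = min_relative_entropy_prior([0, 1, 2, 3], [0.25] * 4, 2.0)
```

The choice of a mean constraint is only for concreteness; any closed convex set of priors would play the same role, though the projection would then generally require a numerical convex-optimization routine rather than a one-dimensional root search.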
Keywords: Bayesian model averaging; model assessment; model checking; model criticism; posterior predictive check; prior predictive check