Bayesian Representation of Stochastic Processes under Learning: de Finetti Revisited
Authors: Matthew O. Jackson, Ehud Kalai, Rann Smorodinsky
Abstract: A probability distribution governing the evolution of a stochastic process has infinitely many Bayesian representations of the form μ = ∫ μ_θ dλ(θ). Among these, a natural representation is one whose components (the μ_θ's) are ‘learnable’ (one can approximate μ_θ by conditioning μ on observation of the process) and ‘sufficient for prediction’ (μ_θ's predictions are not aided by conditioning on observation of the process). We show the existence and uniqueness of such a representation under a suitable asymptotic mixing condition on the process. This representation can be obtained by conditioning on the tail field of the process, and any learnable representation that is sufficient for prediction is asymptotically like the tail-field representation. This result is related to the celebrated de Finetti theorem, but with exchangeability weakened to an asymptotic mixing condition, and with his conclusion of a decomposition into i.i.d. component distributions weakened to components that are learnable and sufficient for prediction.
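For readers who want the representation and the two properties spelled out, the following LaTeX sketch gives one plausible formalization; the history notation h_t, the next-period events, and the almost-sure qualifier are our assumptions for illustration, not quotations from the paper's definitions.

% Bayesian representation: the process law mu is a mixture of components mu_theta,
% with mixing (prior) measure lambda over a parameter set Theta.
\mu = \int_{\Theta} \mu_{\theta} \, d\lambda(\theta)

% Learnable: after observing the history h_t = (x_1, \dots, x_t), predictions made
% under mu merge with those of the realized component mu_theta.
\lim_{t \to \infty} \sup_{A} \bigl| \mu(x_{t+1} \in A \mid h_t) - \mu_{\theta}(x_{t+1} \in A \mid h_t) \bigr| = 0 \qquad \mu_{\theta}\text{-a.s.}

% Sufficient for prediction: conditioning a component on the history does not
% improve its predictions of future events.
\lim_{t \to \infty} \sup_{A} \bigl| \mu_{\theta}(x_{t+1} \in A \mid h_t) - \mu_{\theta}(x_{t+1} \in A) \bigr| = 0 \qquad \mu_{\theta}\text{-a.s.}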
Keywords: Bayesian; learning; stochastic processes
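A minimal numerical illustration of the classical special case may also help. In the Python sketch below, μ is the exchangeable coin-flip process with a uniform (Beta(1,1)) prior, so the components μ_θ are i.i.d. Bernoulli(θ); the function name, the prior choice, and the Laplace-rule predictor are illustrative assumptions, and the paper's general tail-field construction is not implemented here.

# A minimal sketch of the Beta-Bernoulli special case of de Finetti's theorem
# (assumptions: uniform prior over theta; the paper's general asymptotic
# mixing condition is not modeled).
import random

def simulate(theta=0.7, T=2000, seed=0):
    rng = random.Random(seed)
    # Draw a sample path from the component mu_theta: i.i.d. Bernoulli(theta).
    draws = [1 if rng.random() < theta else 0 for _ in range(T)]
    heads = 0
    for t, x in enumerate(draws, start=1):
        heads += x
        # mu(. | h_t): one-step prediction of the mixture after t observations.
        # With a Beta(1,1) prior this is the Laplace rule (heads+1)/(t+2).
        mu_pred = (heads + 1) / (t + 2)
        # mu_theta(. | h_t): the component's prediction is constant in the
        # history -- conditioning does not aid it (sufficiency for prediction).
        comp_pred = theta
        if t in (10, 100, 1000, 2000):
            print(f"t={t:5d}  mu prediction={mu_pred:.3f}  component prediction={comp_pred:.3f}")

simulate()

As t grows, the mixture's prediction converges to θ (the component is learnable from observation), while the component's own prediction never changes (it is sufficient for prediction).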