The Catch-Up Phenomenon in Bayesian Inference
Course URL: http://videolectures.net/uai08_grunwald_cup/
Lecturer: Peter Grünwald
Institution: Centre for Mathematics and Computer Science (CWI)
Date: 2008-07-30
Language: English
Course description: Standard Bayesian model selection/averaging sometimes learns too slowly: there exist other learning methods that lead to better predictions based on less data. We give a novel analysis of this "catch-up" phenomenon. Based on this analysis, we propose the switching method, a modification of Bayesian model averaging that never learns slower, but sometimes learns much faster than Bayes. The method is related to expert-tracking algorithms developed in the COLT literature, and has time complexity comparable to Bayes. The switching method resolves a long-standing debate in statistics, known as the AIC-BIC dilemma: model selection/averaging methods like BIC, Bayes, and MDL are consistent (they eventually infer the correct model), but, when used for prediction, the rate at which their predictions improve can be suboptimal. Methods like AIC and leave-one-out cross-validation are inconsistent but typically converge at the optimal rate. Our method is the first that provably achieves both. Experiments with nonparametric density estimation confirm that these large-sample theoretical results also hold in practice for small samples.
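The abstract contrasts standard Bayesian model averaging with a switching method related to expert-tracking algorithms from the COLT literature. The sketch below illustrates that contrast only; it is not the authors' switch distribution. It computes the sequential log loss of a posterior-weighted mixture of models and, when alpha > 0, applies a fixed-share style update from the expert-tracking literature. The function name mixture_log_loss, the log_liks array, and the parameter alpha are illustrative assumptions, not part of the lecture.

import numpy as np

def mixture_log_loss(log_liks, alpha=0.0):
    """Sequential log loss of a mixture over K models ("experts").

    log_liks: array of shape (T, K); log_liks[t, k] is the log of model k's
    predictive probability (or density) for observation t given the past.
    alpha = 0.0 gives standard Bayesian model averaging; alpha > 0.0 adds a
    fixed-share step (expert tracking) that keeps some weight on every model,
    so the mixture can "catch up" by switching to a better model quickly.
    (Illustrative sketch only, not the authors' switch distribution.)
    """
    log_liks = np.asarray(log_liks, dtype=float)
    T, K = log_liks.shape
    w = np.full(K, 1.0 / K)                 # uniform prior over the K models
    total = 0.0
    for t in range(T):
        p = np.exp(log_liks[t])             # per-model predictions for x_t
        mix = float(np.dot(w, p))           # mixture prediction for x_t
        total -= np.log(mix)                # accumulate log loss
        w = w * p                           # Bayesian posterior update
        w /= w.sum()
        w = (1.0 - alpha) * w + alpha / K   # fixed-share: allow switching
    return total

With alpha = 0.0 this returns the cumulative log loss of the standard Bayes mixture that the abstract says can learn too slowly; a small positive alpha (for example 0.05) gives the fixed-share behaviour to which the switching method is related, at the same per-round cost, consistent with the claim that the time complexity is comparable to Bayes.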
Keywords: Bayesian models; catch-up phenomenon; nonparametric density estimation
Source: VideoLectures.NET
Last reviewed: 2020-07-29 by yumf
Views: 68