


A PAC-Bayesian Analysis of Dropouts
Course URL: http://videolectures.net/nipsworkshops2013_mcallester_dropouts/
Lecturer: David McAllester
Institution: Toyota Technological Institute at Chicago
Date: 2014-10-06
Language: English
Course description: Intuitively, a neural network that is robust to dropout perturbations should have better generalization properties - it should perform better on novel inputs. Stochastic model perturbation is the fundamental concept underlying PAC-Bayesian generalization theory. This talk will briefly summarize PAC-Bayesian generalization theory and give a regularization bound for a simple form of dropout training as a straightforward application. For a regularization bound involving an L2 penalty for model weights, dropouts reduce the regularization penalty by a factor of 1-alpha, where alpha is the dropout rate. The bound then expresses a trade-off between the dropout rate and the training loss. While this regularization bound is intriguing, it may not be the right analysis. An alternative analysis involves variance reduction - the standard motivation for bagging. There are good reasons to believe that a certain general PAC-Bayes variance bound is significantly tighter than the general PAC-Bayes regularization bound. Unfortunately, the variance bound is opaque - it does not involve explicit regularization and is difficult to compare with regularization bounds. Also, unlike regularization bounds, there is no obvious method for designing algorithms that minimize the variance bound. A compelling variance-based PAC-Bayesian analysis of dropouts remains an open problem.
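
To make the stated trade-off concrete, the following is a schematic version of the kind of regularization bound the description refers to, written as LaTeX. It is a sketch under assumptions, not the exact statement from the talk: it assumes a standard McAllester-style square-root PAC-Bayes bound, a Gaussian prior and posterior of variance sigma^2 over the weights w, a sample of size n, and a confidence parameter delta.

% Schematic PAC-Bayes regularization bound under dropout at rate \alpha
% (illustrative form only; the exact constants in the lecture may differ).
\[
  L(Q_{w,\alpha}) \;\le\; \hat{L}(Q_{w,\alpha})
    + \sqrt{\frac{(1-\alpha)\,\frac{\|w\|_2^2}{2\sigma^2} + \ln\frac{n}{\delta}}{2(n-1)}}
\]
% Increasing the dropout rate \alpha shrinks the L2 penalty term
% (1-\alpha)\|w\|_2^2 / (2\sigma^2) but typically raises the empirical
% loss \hat{L}(Q_{w,\alpha}); this is the dropout-rate/training-loss
% trade-off described above.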
Keywords: stochastic model perturbation; PAC-Bayesian generalization theory; variance minimization
Source: VideoLectures.NET
Data collected: 2022-11-08 (chenjy)
Last reviewed: 2022-11-08 (chenjy)
Views: 43