Bayesian Interpretations of RKHS Embedding Methods
Course URL: | http://videolectures.net/nipsworkshops2012_duvenaud_bayesian/ |
Lecturer: | David Kristjanson Duvenaud |
Institution: | University of Toronto |
Date: | 2013-01-16 |
Language: | English |
Abstract: | We give a simple interpretation of mean embeddings as expectations under a Gaussian process prior. Methods such as kernel two-sample tests, the Hilbert-Schmidt Independence Criterion, and kernel herding are all based on distances between mean embeddings, also known as the Maximum Mean Discrepancy (MMD). This Bayesian interpretation allows a derivation of optimal herding weights, principled methods of kernel learning, and sheds light on the assumptions necessary for MMD-based methods to work in practice. In the other direction, the MMD interpretation gives tight, closed-form bounds on the error of Bayesian estimators. |
Keywords: | Gaussian process; mean embedding; Bayesian interpretation |
Source: | VideoLectures.NET |
Last reviewed: | 2019-09-08:lxf |
Views: | 75 |