Variational Gaussian Process |
|
Course URL: | http://videolectures.net/iclr2016_tran_variational_gaussian/ |
Lecturer: | Dustin Tran |
Institution: | Columbia University |
Date: | 2016-05-27 |
Language: | English |
Abstract: | Variational inference is a powerful tool for approximate inference, and it has recently been applied to representation learning with deep generative models. We develop the variational Gaussian process (VGP), a Bayesian nonparametric variational family, which adapts its shape to match complex posterior distributions. The VGP generates approximate posterior samples by generating latent inputs and warping them through random non-linear mappings; the distribution over random mappings is learned during inference, enabling the transformed outputs to adapt to varying complexity. We prove a universal approximation theorem for the VGP, demonstrating its representative power for learning any model. For inference, we present a variational objective inspired by auto-encoders and perform black-box inference over a wide class of models. The VGP achieves new state-of-the-art results for unsupervised learning, inferring models such as the deep latent Gaussian model and the recently proposed DRAW. |
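The abstract describes the VGP's sampling mechanism: draw latent inputs, then warp them through a random non-linear mapping whose distribution is itself a Gaussian process. The following is a minimal illustrative sketch of that generative step only, using NumPy with an assumed RBF kernel and fixed hyperparameters; it is not the authors' implementation and omits the learned variational data and inference objective entirely.

```python
import numpy as np

def rbf_kernel(X, Y, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel between the rows of X and Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def vgp_sample(n_samples, latent_dim, output_dim, rng):
    """Sketch of the VGP generative step: sample latent inputs,
    then warp them through a random mapping drawn from a GP prior.
    Hyperparameters and kernel choice here are illustrative assumptions."""
    xi = rng.standard_normal((n_samples, latent_dim))   # latent inputs
    K = rbf_kernel(xi, xi) + 1e-6 * np.eye(n_samples)   # GP covariance + jitter
    L = np.linalg.cholesky(K)
    # One independent GP draw per output dimension: f = L @ eps.
    f = L @ rng.standard_normal((n_samples, output_dim))
    return f  # warped outputs play the role of approximate posterior samples

rng = np.random.default_rng(0)
z = vgp_sample(n_samples=5, latent_dim=2, output_dim=3, rng=rng)
print(z.shape)  # (5, 3)
```

In the actual method, the mapping's distribution is conditioned on learned variational data and optimized under an auto-encoder-style objective, which is what lets the warped outputs adapt to the target posterior's complexity.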
Keywords: | variational inference; deep models; posterior distribution |
Source: | VideoLectures.NET |
Data collected: | 2023-06-07: chenxin01 |
Last reviewed: | 2023-06-07: chenxin01 |
Views: | 71 |