Concave Gaussian Variational Approximations for Inference in Large-Scale Bayesian Linear Models
Course URL: http://videolectures.net/aistats2011_challis_gaussian/
Lecturer: Edward Challis
Institution: University College London
Date: 2011-05-06
Language: English
Abstract: Two popular approaches to forming principled bounds in approximate Bayesian inference are local variational methods and minimal Kullback-Leibler divergence methods. For a large class of models, we explicitly relate the two approaches, showing that the local variational method is equivalent to a weakened form of Kullback-Leibler Gaussian approximation. This gives a strong motivation to develop efficient methods for KL minimisation. An important and previously unproven property of the KL variational Gaussian bound is that it is a concave function in the parameters of the Gaussian for log-concave sites. This observation, along with compact concave parameterisations of the covariance, enables us to develop fast, scalable optimisation procedures to obtain lower bounds on the marginal likelihood in large-scale Bayesian linear models.
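
To make the bound concrete: for a Gaussian prior N(w; 0, σ²I) and log-concave likelihood sites φ_n(wᵀx_n), the KL variational Gaussian bound with q(w) = N(w; m, CCᵀ), parameterised by the mean m and a Cholesky factor C, can be written as below. The notation is a reconstruction from the abstract, not a quotation from the talk; each expectation is one-dimensional because every site depends on w only through a scalar projection of the Gaussian q.

```latex
\log Z \;\ge\; \mathcal{B}(m,C) \;=\;
\sum_{n} \mathbb{E}_{z \sim \mathcal{N}(0,1)}
\Big[ \log \phi_n\big( m^{\top}x_n + z\,\lVert C^{\top}x_n \rVert \big) \Big]
\;-\; \mathrm{KL}\!\big( \mathcal{N}(m, CC^{\top}) \,\big\Vert\, \mathcal{N}(0, \sigma^2 I) \big)
```

Below is a minimal sketch of maximising this bound for Bayesian logistic regression (sigmoid sites are log-concave), assuming a dense Cholesky parameterisation, Gauss-Hermite quadrature for the one-dimensional expectations, and a generic quasi-Newton optimiser with numerical gradients. The lecture's scalability comes from compact covariance parameterisations and dedicated optimisation routines, which this toy example does not reproduce; the function name neg_bound and the synthetic data are illustrative only.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import log_expit  # numerically stable log-sigmoid

# Gauss-Hermite quadrature for E_{z ~ N(0,1)}[f(z)] ~= sum_k gh_w[k] * f(gh_x[k])
_nodes, _weights = np.polynomial.hermite.hermgauss(30)
gh_x = np.sqrt(2.0) * _nodes
gh_w = _weights / np.sqrt(np.pi)

def neg_bound(params, X, y, sigma2):
    """Negative KL variational Gaussian bound for Bayesian logistic regression
    with prior N(0, sigma2 * I), q(w) = N(m, C C^T), and labels y in {-1, +1}."""
    n, d = X.shape
    m = params[:d]
    C = np.zeros((d, d))
    C[np.tril_indices(d)] = params[d:]       # dense lower-triangular factor
    di = np.diag_indices(d)
    C[di] = np.exp(C[di])                    # keep the diagonal positive

    mu = X @ m                               # site means      m^T x_n
    s = np.linalg.norm(X @ C, axis=1)        # site std devs ||C^T x_n||

    # Expected log-likelihood: one-dimensional quadrature per site.
    arg = y[:, None] * (mu[:, None] + gh_x[None, :] * s[:, None])
    exp_loglik = np.sum(log_expit(arg) * gh_w)

    # KL(N(m, CC^T) || N(0, sigma2 I)) in closed form via the Cholesky factor.
    kl = 0.5 * ((m @ m + np.sum(C ** 2)) / sigma2
                - d + d * np.log(sigma2) - 2.0 * np.sum(np.log(np.diag(C))))
    return kl - exp_loglik                   # minimise the negative bound

# Synthetic data (illustrative only).
rng = np.random.default_rng(0)
n, d = 200, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = np.where(rng.random(n) < 1.0 / (1.0 + np.exp(-X @ w_true)), 1.0, -1.0)

x0 = np.zeros(d + d * (d + 1) // 2)          # m = 0, C = identity
res = minimize(neg_bound, x0, args=(X, y, 1.0), method="L-BFGS-B")
print("Lower bound on log marginal likelihood:", -res.fun)
```

Because the bound is concave in (m, C) for log-concave sites, the local optimum found here is also global. Scaling to large models, per the abstract, relies on replacing the dense factor C with a compact concave parameterisation of the covariance rather than on a different objective.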
Keywords: Bayesian linear models; concave Gaussian variational approximations; approximate inference
Source: VideoLectures.NET
Last reviewed: 2021-02-04 (nkq)
Views: 39