

Variational Inference and Experimental Design for Sparse Linear Models
Course URL: http://videolectures.net/sip08_nickisch_viaed/
Lecturers: Hannes Nickisch, Matthias W. Seeger
Institution: École Polytechnique Fédérale de Lausanne (EPFL)
Date: 2008-12-18
Language: English
Course description: Sparsity is a fundamental concept in modern statistics, and often the only general principle available at the moment to address novel learning applications with many more variables than observations. Despite recent advances in the theoretical understanding and the algorithmics of sparse point estimation, higher-order problems such as covariance estimation or optimal data acquisition are seldom addressed for sparsity-favouring models, and there are virtually no scalable algorithms. We provide an approximate Bayesian inference algorithm for sparse linear models that can be used with hundreds of thousands of variables. Our method employs a convex relaxation to variational inference and settles an open question in continuous Bayesian inference: the Gaussian lower bound relaxation is convex for a class of super-Gaussian potentials including the Laplace and Bernoulli potentials. Our algorithm reduces to the same computational primitives used for sparse estimation methods, but requires Gaussian marginal variance estimation as well. We show how the Lanczos algorithm from numerical mathematics can be employed to compute the latter. We are interested in Bayesian experimental design, a powerful framework for optimizing measurement architectures. We have applied our framework to problems of magnetic resonance imaging design and reconstruction.
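A sketch of the relaxation mentioned above, written in the standard super-Gaussian bounding form used in this line of work rather than quoted from the lecture (s denotes a latent coefficient, τ the Laplace scale, γ a variational parameter; all notation is my own): each super-Gaussian potential admits a Gaussian-form lower bound, e.g. for the Laplace potential

\[
t(s) = \frac{\tau}{2}\, e^{-\tau|s|}, \qquad
e^{-\tau|s|} = \max_{\gamma > 0} \exp\!\left( -\frac{s^2}{2\gamma} - \frac{\tau^2 \gamma}{2} \right),
\]

so for any fixed variational parameters the non-Gaussian posterior is bounded below by a Gaussian, and inference tightens the bound by optimizing over all γ_i; the convexity of that outer optimization is the open question the abstract says is settled.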
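The abstract also notes that, beyond the primitives of sparse point estimation, the method needs Gaussian marginal variances, i.e. the diagonal of the inverse of a large posterior precision matrix, obtained via the Lanczos algorithm. Below is a minimal NumPy sketch of that general idea, not the authors' code; the function name, the matvec/n/k/seed interface and the toy matrix are assumptions made for illustration.

import numpy as np

def lanczos_variances(matvec, n, k, seed=0):
    # Sketch: approximate diag(A^{-1}) for a symmetric positive definite matrix A
    # that is only accessible through matvec(v) = A @ v, using k Lanczos steps.
    # The result is diag(Q T^{-1} Q^T), which in exact arithmetic never exceeds
    # the true marginal variances.
    rng = np.random.default_rng(seed)
    Q = np.zeros((n, k))            # Lanczos basis
    alpha = np.zeros(k)             # diagonal of the tridiagonal T = Q^T A Q
    beta = np.zeros(max(k - 1, 0))  # off-diagonal of T
    q, q_prev, b_prev = rng.standard_normal(n), np.zeros(n), 0.0
    q /= np.linalg.norm(q)
    m = k
    for j in range(k):
        Q[:, j] = q
        w = matvec(q) - b_prev * q_prev
        alpha[j] = q @ w
        w -= alpha[j] * q
        w -= Q[:, :j + 1] @ (Q[:, :j + 1].T @ w)   # full re-orthogonalisation
        b = np.linalg.norm(w)
        if j == k - 1 or b < 1e-12:                # done, or lucky breakdown
            m = j + 1
            break
        beta[j] = b
        q_prev, b_prev, q = q, b, w / b
    Q, alpha, beta = Q[:, :m], alpha[:m], beta[:m - 1]
    T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
    M = Q @ np.linalg.inv(T)                       # Q T^{-1}
    return np.einsum('ij,ij->i', M, Q)             # diag(Q T^{-1} Q^T)

# Toy check on a small SPD matrix where the exact diagonal is still computable.
rng = np.random.default_rng(1)
B = rng.standard_normal((300, 300))
A = B @ B.T + 300.0 * np.eye(300)
approx = lanczos_variances(lambda v: A @ v, n=300, k=60)
exact = np.diag(np.linalg.inv(A))
print(float(np.mean(approx / exact)))              # fraction of variance captured, below 1

With k much smaller than n the estimate is cheap but systematically under-estimates the variances; per the abstract, such variance estimates are what the framework feeds into Bayesian experimental design, e.g. for magnetic resonance imaging design and reconstruction.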
Keywords: Computer Science; Machine Learning; Statistical Learning
Source: VideoLectures.NET
Last reviewed: 2020-06-29 (yumf)
Views: 42