

Auxiliary Variational Information Maximization for Dimensionality Reduction
Course URL: http://videolectures.net/slsfs05_barber_avimd/
Lecturer: David Barber
Institution: University College London
Date: 2007-02-25
Language: English
Course description: Mutual Information (MI) is a long studied measure of information content, and many attempts to apply it to feature extraction and stochastic coding have been made. However, in general MI is computationally intractable to compute, and most previous studies redefine the criterion in forms of approximations. Recently we described properties of a simple lower bound on MI [2], and discussed its links to some of the popular dimensionality reduction techniques. Here we introduce a richer family of the auxiliary variational bounds on MI, which generalize our previous approximations. Our specific focus then is on applying the bound to extracting informative lower-dimensional projections in the presence of irreducible Gaussian noise. We show that our method produces significantly tighter bounds on MI compared with the as-if Gaussian approximation [7]. We also show that learning projections to multinomial auxiliary spaces may facilitate reconstructions of the sources from noisy lower-dimensional representations.
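For orientation, the "simple lower bound on MI [2]" that the abstract builds on can be restated as follows, assuming it refers to the Barber–Agakov variational bound (our reading; the page itself does not spell it out). For any variational decoder $q(x \mid y)$:

```latex
I(X;Y) \;=\; H(X) - H(X \mid Y)
       \;\geq\; H(X) + \big\langle \ln q(x \mid y) \big\rangle_{p(x,y)}
```

The gap is the expected KL divergence $\langle \mathrm{KL}\big(p(x \mid y)\,\|\,q(x \mid y)\big) \rangle_{p(y)} \geq 0$, so the bound is tight exactly when $q$ matches the true posterior. The auxiliary variational family described in the talk generalizes this by routing $q$ through an additional auxiliary space, which is what allows tighter bounds than the as-if Gaussian approximation.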
Keywords: mutual information; dimensionality reduction
Source: VideoLectures.NET
Last reviewed: 2020-06-25 (liush)
Views: 47