A Unified View of Matrix Factorization Models
Course URL: http://videolectures.net/ecmlpkdd08_singh_auvo/
Lecturers: Ajit Singh; Geoffrey J. Gordon
Institution: Carnegie Mellon University
Date: 2008-10-10
Language: English
Course description: We present a unified view of matrix factorization that frames the differences among popular methods, such as NMF, Weighted SVD, E-PCA, MMMF, pLSI, pLSI-pHITS, Bregman co-clustering, and many others, in terms of a small number of modeling choices. Many of these approaches can be viewed as minimizing a generalized Bregman divergence, and we show that (i) a straightforward alternating projection algorithm can be applied to almost any model in our unified view; (ii) the Hessian for each projection has special structure that makes a Newton projection feasible, even when there are equality constraints on the factors, which allows for matrix co-clustering; and (iii) alternating projections can be generalized to simultaneously factor a set of matrices that share dimensions. These observations immediately yield new optimization algorithms for the above factorization methods, and suggest novel generalizations of these methods such as incorporating row/column biases, and adding or relaxing clustering constraints.
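A minimal NumPy sketch of the alternating-projection idea, under the simplest modeling choice in this view: squared (Frobenius) loss, i.e. the Bregman divergence generated by F(x) = x^2/2, with unconstrained factors. The function name, parameters, and ridge term below are illustrative assumptions rather than the lecture's algorithm, which generalizes each projection to other Bregman divergences (e.g. generalized KL for NMF and pLSI), uses Newton projections, and supports equality constraints and collections of matrices with shared dimensions.

import numpy as np

def alternating_factorization(X, rank, n_iters=50, reg=1e-3):
    # Alternating least-squares projections for X ~ U @ V.T under squared loss;
    # a small ridge term keeps each linear solve well conditioned.
    m, n = X.shape
    rng = np.random.default_rng(0)
    U = rng.standard_normal((m, rank))
    V = rng.standard_normal((n, rank))
    eye = reg * np.eye(rank)
    for _ in range(n_iters):
        # Projection for U with V fixed: U = X V (V^T V + reg*I)^{-1}
        U = np.linalg.solve(V.T @ V + eye, V.T @ X.T).T
        # Projection for V with U fixed: V = X^T U (U^T U + reg*I)^{-1}
        V = np.linalg.solve(U.T @ U + eye, U.T @ X).T
    return U, V

# Usage: recover a rank-3 structure from a synthetic 100 x 80 matrix.
rng = np.random.default_rng(1)
X = rng.standard_normal((100, 3)) @ rng.standard_normal((3, 80))
U, V = alternating_factorization(X, rank=3)
print(np.linalg.norm(X - U @ V.T) / np.linalg.norm(X))  # small relative error

Each pass holds one factor fixed and projects the other onto its least-squares optimum; in the unified view, changing the divergence or adding constraints changes only the projection step, not the overall alternating scheme.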
Keywords: Computer Science; Machine Learning; Matrices
Source: VideoLectures.NET
Last reviewed: 2020-06-22 by chenxin
Views: 37