Relational Learning as Collective Matrix Factorization
Course URL: http://videolectures.net/cmulls08_singh_rlm/
Lecturer: Ajit Singh
Institution: Carnegie Mellon University
Date: 2008-02-14
Language: English
Course description: We present a unified view of matrix factorization models, including singular value decompositions, non-negative matrix factorization, probabilistic latent semantic indexing, and generalizations of these models to exponential families and non-regular Bregman divergences. One can model relational data as a set of matrices, where each matrix represents the value of a relation between two entity-types. Instead of a single matrix, relational data is represented as a set of matrices with shared dimensions and tied low-rank representations. Our example domain is augmented collaborative filtering, where both user ratings and side information about items are available. To predict the value of a relation, we extend Bregman matrix factorization to a set of related matrices. Using an alternating minimization scheme, we show the existence of a practical Newton step. The use of stochastic second-order methods for large matrices is also covered.
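To make the setup concrete, here is a minimal Python sketch (not the lecture's reference implementation) of collective matrix factorization under the simplest Bregman divergence, squared loss: a user-by-item ratings matrix X ≈ U Vᵀ and an item-by-feature side-information matrix Y ≈ V Zᵀ are factored jointly, tied through the shared item factor V. With squared loss, each alternating update has a closed-form ridge-regression solution, which corresponds to a single Newton step. The function and parameter names (collective_mf, rank, reg, alpha) are illustrative assumptions.

```python
import numpy as np

def collective_mf(X, Y, rank=10, reg=0.1, alpha=0.5, n_iters=50, seed=0):
    """Sketch: factor X ~ U @ V.T and Y ~ V @ Z.T with a shared item factor V.

    alpha weights the ratings matrix X against the side-information matrix Y.
    """
    rng = np.random.default_rng(seed)
    n_users, n_items = X.shape
    n_feats = Y.shape[1]
    U = rng.normal(scale=0.1, size=(n_users, rank))
    V = rng.normal(scale=0.1, size=(n_items, rank))
    Z = rng.normal(scale=0.1, size=(n_feats, rank))
    I = np.eye(rank)
    for _ in range(n_iters):
        # U-step: ridge regression of X's rows onto the current item factor V.
        U = X @ V @ np.linalg.inv(V.T @ V + reg * I)
        # Z-step: ridge regression of Y's columns onto the current item factor V.
        Z = Y.T @ V @ np.linalg.inv(V.T @ V + reg * I)
        # V-step: the shared factor is fit to both relations, weighted by alpha.
        A = alpha * (U.T @ U) + (1 - alpha) * (Z.T @ Z) + reg * I
        B = alpha * (X.T @ U) + (1 - alpha) * (Y @ Z)
        V = B @ np.linalg.inv(A)
    return U, V, Z

# Toy usage: 100 users, 50 items, 20 item features (random data for illustration).
X = np.random.rand(100, 50)   # user-item ratings
Y = np.random.rand(50, 20)    # item side information
U, V, Z = collective_mf(X, Y, rank=5)
print(np.linalg.norm(X - U @ V.T), np.linalg.norm(Y - V @ Z.T))
```

Replacing squared loss with another Bregman divergence (for example, the logistic or Poisson losses that arise from exponential-family links) removes the closed-form solution, which is where the alternating Newton and stochastic second-order methods discussed in the lecture come in.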
Keywords: machine learning; matrix factorization models; mathematical modeling
Course source: VideoLectures.NET
Last reviewed: 2020-06-22 (chenxin)
Views: 79