Neural Embeddings of Graphs in Hyperbolic Space

Course URL: http://videolectures.net/kdd2017_chamberlain_neural_embeddings/
Lecturer: Ben Chamberlain
Institution: Imperial College London
Date: 2017-12-01
Language: English
Course description: Neural embeddings have been used with great success in Natural Language Processing (NLP). They provide compact representations that encapsulate word similarity and attain state-of-the-art performance in a range of linguistic tasks. The success of neural embeddings has prompted significant amounts of research into applications in domains other than language. One such domain is graph-structured data, where embeddings of vertices can be learned that encapsulate vertex similarity and improve performance on tasks including edge prediction and vertex labelling. For both NLP and graph-based tasks, embeddings have been learned in high-dimensional Euclidean spaces. However, recent work has shown that the appropriate isometric space for embedding complex networks is not flat Euclidean space but negatively curved hyperbolic space. We present a new concept that exploits these recent insights and propose learning neural embeddings of graphs in hyperbolic space. We provide experimental evidence that embedding graphs in their natural geometry significantly improves performance on downstream tasks for several real-world public datasets.
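
The geometric tool underlying such embeddings is the hyperbolic distance of the Poincaré ball model. The Python snippet below is a minimal illustrative sketch of that distance, not code from the lecture or the accompanying paper; the function name poincare_distance and the toy 2-D points are assumptions made here for illustration.

import numpy as np

def poincare_distance(u, v, eps=1e-9):
    # Hyperbolic distance between two points inside the unit ball (||x|| < 1):
    # d(u, v) = arccosh(1 + 2*||u - v||^2 / ((1 - ||u||^2) * (1 - ||v||^2))).
    sq_dist = np.dot(u - v, u - v)
    denom = (1.0 - np.dot(u, u)) * (1.0 - np.dot(v, v)) + eps  # eps guards the boundary
    return np.arccosh(1.0 + 2.0 * sq_dist / denom)

# Toy usage with hypothetical points: distances grow rapidly near the boundary
# of the ball, which is what lets tree-like graphs embed with low distortion
# compared to flat Euclidean space.
origin = np.array([0.0, 0.0])
leaf_a = np.array([0.9, 0.0])
leaf_b = np.array([0.0, 0.9])
print(poincare_distance(origin, leaf_a))  # ~2.94: origin to a near-boundary point
print(poincare_distance(leaf_a, leaf_b))  # ~5.20: two near-boundary points, far apart hyperbolically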
Keywords: hyperbolic space; graph embedding; neural embedding
Course source: 视频讲座网
Data collected: 2023-06-19:chenxin01
Last reviewed: 2023-06-19:chenxin01
Views: 23