

Leveraging Pre-trained Language Models for Time Interval Prediction in Text-Enhanced Temporal Knowledge Graphs
Course URL: https://videolectures.net/eswc2024_sezen_islakoglu_knowledge_grap...
Lecturer: Duygu Sezen Islakoglu
Venue: ESWC 2024 (Extended Semantic Web Conference)
Date: 2024-06-14
Language: English
Course description: Most knowledge graph completion (KGC) methods rely solely on structural information, even though a large number of publicly available KGs contain additional temporal data (validity time intervals) and textual data (entity descriptions). While recent temporal KGC methods utilize time information to enhance link prediction, they do not leverage textual descriptions or support inductive inference (prediction for entities that have not been seen during training). In this work, we propose a novel framework called TEMT that exploits the power of pre-trained language models (PLMs) for temporal KGC. TEMT predicts the time intervals of facts by fusing their textual and temporal information. It also supports inductive inference by utilizing PLMs. To showcase the power of TEMT, we carry out several experiments, including time interval prediction in both transductive and inductive settings, as well as triple classification. The experimental results demonstrate that TEMT is competitive with the state-of-the-art.
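To make the description above concrete, the following is a minimal, purely illustrative sketch of the general idea of fusing a PLM encoding of a fact's textual description with timestamp embeddings to score candidate interval boundaries. The class name TimeIntervalScorer, the choice of bert-base-uncased, the discretized timeline, and the two boundary-scoring heads are assumptions made for this sketch only and do not reflect TEMT's actual architecture.

```python
# Illustrative sketch only (not the authors' implementation): encode a verbalized
# fact with a PLM and fuse it with candidate timestamp embeddings to score
# possible (start, end) boundaries of the fact's validity interval.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer


class TimeIntervalScorer(nn.Module):
    def __init__(self, plm_name="bert-base-uncased", num_timestamps=100):
        super().__init__()
        self.tokenizer = AutoTokenizer.from_pretrained(plm_name)
        self.plm = AutoModel.from_pretrained(plm_name)            # pre-trained text encoder
        hidden = self.plm.config.hidden_size
        self.time_emb = nn.Embedding(num_timestamps, hidden)      # one vector per discretized timestamp
        # Two heads: score each timestamp as the interval start and as the interval end.
        self.start_head = nn.Linear(2 * hidden, 1)
        self.end_head = nn.Linear(2 * hidden, 1)

    def forward(self, fact_text: str):
        # Encode the textual description of the fact (subject, relation, object).
        inputs = self.tokenizer(fact_text, return_tensors="pt", truncation=True)
        text_vec = self.plm(**inputs).last_hidden_state[:, 0]     # [CLS] embedding, shape (1, H)

        # Fuse the text embedding with every candidate timestamp embedding.
        t = self.time_emb.weight                                   # (T, H)
        fused = torch.cat([text_vec.expand_as(t), t], dim=-1)      # (T, 2H)
        start_scores = self.start_head(fused).squeeze(-1)          # (T,)
        end_scores = self.end_head(fused).squeeze(-1)              # (T,)
        return start_scores, end_scores


scorer = TimeIntervalScorer()
start, end = scorer("Barack Obama holds the position of President of the United States.")
predicted_interval = (start.argmax().item(), end.argmax().item())  # indices into the timeline
```

Because the PLM can encode arbitrary textual descriptions, such a scorer can also be applied to entities not seen during training, which is the sense in which PLMs enable inductive inference here; a full system would additionally constrain the predicted start to precede the predicted end.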
Keywords: pre-trained language models; text enhancement; time interval prediction
Source: VideoLectures.NET
Data collected: 2024-08-05: liyq
Last reviewed: 2024-09-24: liyy
Views: 18